<rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/" version="2.0">
<channel>
<title>Daniel's Journal</title>
<description><![CDATA[  ]]></description>
<link>https://danielraffel.me</link>
<image>
    <url>https://danielraffel.me/favicon.png</url>
    <title>Daniel&#x27;s Journal</title>
    <link>https://danielraffel.me</link>
</image>
<lastBuildDate>Mon, 06 Apr 2026 19:03:13 -0700</lastBuildDate>
<atom:link href="https://danielraffel.me" rel="self" type="application/rss+xml"/>
<ttl>60</ttl>

    <item>
        <title><![CDATA[ How to Install RepoPrompt Globally in Claude Code ]]></title>
        <description><![CDATA[ By default, Repo Prompt installs its MCP server for Claude Code per-project via .mcp.json files. Here&#39;s how I made it available globally in every Claude Code session. ]]></description>
        <link>https://danielraffel.me/til/2026/03/24/how-to-install-repoprompt-globally-in-claude-code/</link>
        <guid isPermaLink="false">69c2e70c7f6675036c092126</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 24 Mar 2026 16:16:51 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/03/D613E844-8C74-437A-B47E-A50215674642.png" medium="image"/>
<content:encoded><![CDATA[ <p><a href="https://repoprompt.com/?atp=jRImXV&ref=danielraffel.me" rel="noreferrer">RepoPrompt</a> is a context-focused IDE with agent tooling designed for reliability and token efficiency. I use it with agents like Claude/Codex when working with monorepos, large or familiar codebases, and projects that have become complex. In practice, it consistently improves my codegen success rate. It's been described to me as:</p><blockquote>• highly context-efficient agent tooling with <a href="https://en.wikipedia.org/wiki/Abstract_syntax_tree?ref=danielraffel.me" rel="noreferrer">AST-aware</a> codemaps that let you target specific functionality precisely,<br>• a context builder that assembles deep, structured prompts from those tools,<br>• a chat/oracle layer that uses that context for extended analysis.<br><br>The core issue is that agents rarely have enough context to see the full picture, so they mix research, planning, and review instead of treating them as separate steps.<br><br>RepoPrompt functions as a context pump, applying context engineering to generate highly efficient analysis prompts.</blockquote><p>By default, its MCP server for Claude Code is configured per-project via&nbsp;<code>.mcp.json</code> files. But if you want it available in every Claude Code session, regardless of which directory you're in, you can install it globally.</p><h2 id="agent-mode-is-amazing">Agent Mode Is Amazing</h2><p>I run Repo Prompt in <a href="https://repoprompt.com/docs?ref=danielraffel.me#s=agent-mode&ss=sessions" rel="noreferrer">Agent Mode</a> with Claude connected via MCP, so the agent can call Repo Prompt’s context engine natively inside the session. The result is that I rarely think about prompting. I just describe what I want, and the agent automatically pulls the right context or invokes Repo Prompt when needed. 
It feels like a clean separation of intent and execution, where I stay high-level and the system has a new superpower that assists with discovery, context, and reasoning.</p><h2 id="prerequisites">Prerequisites</h2><ul><li><a href="https://repoprompt.com/?ref=danielraffel.me">RepoPrompt</a>&nbsp;installed (the macOS app)</li><li><a href="https://docs.anthropic.com/en/docs/claude-code?ref=danielraffel.me">Claude Code</a>&nbsp;installed</li></ul><h2 id="background-context">Background Context</h2><p>Claude Code has two config files that look like they'd be the right place for global MCP servers:</p><ul><li><code>~/.claude/settings.json</code>&nbsp;— global settings (permissions, plugins, env vars)</li><li><code>~/.claude.json</code>&nbsp;— user-level config (MCP servers, preferences)</li></ul><p>Despite the naming,&nbsp;<strong>MCP servers go in&nbsp;<code>~/.claude.json</code></strong>, not&nbsp;<code>~/.claude/settings.json</code>.</p><h2 id="step-1-locate-the-repoprompt-mcp-binary">Step 1: Locate the RepoPrompt MCP Binary</h2><p>RepoPrompt ships an MCP server binary inside its app bundle. It also creates a convenient symlink during setup:</p><pre><code class="language-bash"># The symlink (created by RepoPrompt's setup)
~/RepoPrompt/repoprompt_cli
# Which points to the actual binary inside the app bundle
/Applications/Repo\ Prompt.app/Contents/MacOS/repoprompt-mcp</code></pre><p>You can verify this on your machine:</p><pre><code class="language-bash"># Check the symlink exists and where it points
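readlink ~/RepoPrompt/repoprompt_cli
# Or, with full details: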
ls -la ~/RepoPrompt/repoprompt_cli</code></pre><p>If the symlink doesn't exist, you can use the full app bundle path directly.</p><h2 id="step-2-add-repoprompt-to-claudejson">Step 2: Add RepoPrompt to ~/.claude.json</h2><p>Open&nbsp;<code>~/.claude.json</code>&nbsp;and find the&nbsp;<code>"mcpServers"</code>&nbsp;object. Add the RepoPrompt entry alongside your other servers:</p><pre><code class="language-jsonc">{
  // ... other config (numStartups, autoUpdates, etc.) ...
  "mcpServers": {
    // ... your other MCP servers ...
    // Add RepoPrompt globally — uses the symlink created by RepoPrompt's setup
    "RepoPrompt": {
      "command": "/Users/YOUR_USERNAME/RepoPrompt/repoprompt_cli",
      "args": []
    }
  }
  // ... rest of config ...
}</code></pre><p>Replace&nbsp;<code>YOUR_USERNAME</code>&nbsp;with your macOS username. Alternatively, use the full app bundle path if you prefer not to rely on the symlink:</p><pre><code class="language-jsonc">{
  "mcpServers": {
    "RepoPrompt": {
      // Direct path to the binary inside the app bundle
      "command": "/Applications/Repo Prompt.app/Contents/MacOS/repoprompt-mcp",
      "args": []
    }
  }
}</code></pre><h2 id="step-3-restart-claude-code">Step 3: Restart Claude Code</h2><p>Restart Claude Code, then run&nbsp;<code>/mcp</code>&nbsp;to verify. You should see:</p><pre><code class="language-bash">RepoPrompt · ✔ connected</code></pre><p>That's it. RepoPrompt's MCP tools (<code>context_builder</code>, <code>file_search</code>, <code>apply_edits</code>, <code>get_file_tree</code>, and more) are now available globally in every Claude Code session.</p><hr><h2 id="agent-prompt">Agent Prompt</h2><p>Alternatively, skip the steps above and give Claude this prompt to do the setup for you:</p><pre><code class="language-bash">Install RepoPrompt's MCP server globally for Claude Code. Here's what you need to know:
1. Find the RepoPrompt MCP binary. It's either at ~/RepoPrompt/repoprompt_cli (a symlink created by the app) or directly at /Applications/Repo Prompt.app/Contents/MacOS/repoprompt-mcp. Verify which path exists on my system.
2. Add it to ~/.claude.json (NOT ~/.claude/settings.json — MCP servers live in ~/.claude.json despite the naming). Find the "mcpServers" object and add a "RepoPrompt" entry with the binary path and empty args array.
3. Tell me to restart Claude Code and run /mcp to verify it shows as connected. </code></pre> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ My Oscar 2026 Picks ]]></title>
        <description><![CDATA[ Every year I make my Oscar predictions. This year I put together a small website with my ballot that updates every five minutes during the awards show using data scraped from the Oscars&#39; realtime feed via Chrome DevTools, published with Claude using the new scheduled tasks /loop feature. ]]></description>
        <link>https://danielraffel.me/2026/03/16/my-oscar-2026-picks/</link>
        <guid isPermaLink="false">69b757a620d322036d676c54</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sun, 15 Mar 2026 18:16:42 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/03/36D8C19F-2708-48DC-8479-E25B82963E4B.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Every year I make my Oscar predictions. This year I put together a <a href="https://www.generouscorp.com/oscars-2026/?ref=danielraffel.me#ballot-picks" rel="noreferrer">small website</a> with my ballot that updates every five minutes during the awards show using data scraped from the <a href="https://www.oscars.org/oscars/ceremonies/2026?ref=danielraffel.me" rel="noreferrer">Oscars</a>' realtime feed via <a href="https://github.com/ChromeDevTools/chrome-devtools-mcp?ref=danielraffel.me" rel="noreferrer">Chrome DevTools</a>, published with Claude using the new <a href="https://code.claude.com/docs/en/scheduled-tasks?ref=danielraffel.me" rel="noreferrer">scheduled tasks</a> <code>/loop</code> bundled skill.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://www.generouscorp.com/oscars-2026/?ref=danielraffel.me#ballot-picks"><div class="kg-bookmark-content"><div class="kg-bookmark-title">2026 Oscar Pool Picks &amp; Predictions</div><div class="kg-bookmark-description">Category-by-category 2026 Academy Awards predictions, Oscar pool picks, and a printable ballot cheat sheet to help you win Oscar night.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://static.ghost.org/v5.0.0/images/link-icon.svg" alt=""><span class="kg-bookmark-author">Academy Awards Prediction Guide</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/oscar.png" alt="" onerror="this.style.display = 'none'"></div></a></figure><p>To set this up, I  just asked Claude to set up a cron, which it interpreted as creating a schedule. 
And it runs this task every five minutes for the duration of the show, which I predicted to be about 3.5 hours long.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/03/PNG-image.png" class="kg-image" alt="" loading="lazy" width="2000" height="583" srcset="https://danielraffel.me/content/images/size/w600/2026/03/PNG-image.png 600w, https://danielraffel.me/content/images/size/w1000/2026/03/PNG-image.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/03/PNG-image.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/03/PNG-image.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Claude is so smart 🤯</span></figcaption></figure><p>Then it just visits this website in Chrome and gets the data. </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/03/image-5.png" class="kg-image" alt="" loading="lazy" width="2000" height="1518" srcset="https://danielraffel.me/content/images/size/w600/2026/03/image-5.png 600w, https://danielraffel.me/content/images/size/w1000/2026/03/image-5.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/03/image-5.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/03/image-5.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Chrome being controlled by Claude</span></figcaption></figure><p>Then Claude updates the HTML, commits the change, and pushes it. 
Because it’s a <a href="https://docs.github.com/en/pages?ref=danielraffel.me" rel="noreferrer">GitHub Pages</a> site, the update is deployed automatically via a <a href="https://github.com/features/actions?ref=danielraffel.me" rel="noreferrer">GitHub Action</a>.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/03/image-6.png" class="kg-image" alt="" loading="lazy" width="796" height="202" srcset="https://danielraffel.me/content/images/size/w600/2026/03/image-6.png 600w, https://danielraffel.me/content/images/2026/03/image-6.png 796w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Final Results for 2026</span></figcaption></figure><hr><p>BTW the craziest part was Claude actually detected the tie and handled that perfectly.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/03/2026-Oscar-Pool-Picks---Predictions--Academy-Awards-Prediction-Guide.png" class="kg-image" alt="" loading="lazy" width="1206" height="2622" srcset="https://danielraffel.me/content/images/size/w600/2026/03/2026-Oscar-Pool-Picks---Predictions--Academy-Awards-Prediction-Guide.png 600w, https://danielraffel.me/content/images/size/w1000/2026/03/2026-Oscar-Pool-Picks---Predictions--Academy-Awards-Prediction-Guide.png 1000w, https://danielraffel.me/content/images/2026/03/2026-Oscar-Pool-Picks---Predictions--Academy-Awards-Prediction-Guide.png 1206w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Claude got the tie.</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/03/image-7.png" class="kg-image" alt="" loading="lazy" width="1608" height="556" srcset="https://danielraffel.me/content/images/size/w600/2026/03/image-7.png 600w, https://danielraffel.me/content/images/size/w1000/2026/03/image-7.png 1000w, 
https://danielraffel.me/content/images/size/w1600/2026/03/image-7.png 1600w, https://danielraffel.me/content/images/2026/03/image-7.png 1608w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">And wrapped up with a nice ending</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Control Ableton Live with Natural Language (MCP Setup) ]]></title>
        <description><![CDATA[ Today, I stumbled across an Ableton Live MCP server that lets tools like Claude or Codex control Ableton using natural language. Here&#39;s how I set it up. ]]></description>
        <link>https://danielraffel.me/til/2026/03/14/control-ableton-live-with-natural-language-mcp-setup/</link>
        <guid isPermaLink="false">69b5c8af83207c036ae4b622</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sat, 14 Mar 2026 14:00:46 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/03/343AAABC-4504-4EE4-8344-4B31D566ECBA.png" medium="image"/>
<content:encoded><![CDATA[ <p>Today, I stumbled across an <a href="https://github.com/hidingwill/AbletonBridge?ref=danielraffel.me" rel="noreferrer">Ableton Live MCP server</a> that lets AI agents like Claude or Codex control <a href="https://www.ableton.com/?ref=danielraffel.me" rel="noreferrer">Ableton</a> using natural language.</p><p>For example, you can ask things like:</p><ul><li>Create a MIDI track with a drum rack</li><li>Add a reverb to the current track</li><li>Launch scene 3</li></ul><p>The project lives here:</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/hidingwill/AbletonBridge?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - hidingwill/AbletonBridge: 322 tools connecting LLM’s to Ableton Live</div><div class="kg-bookmark-description">322 tools connecting LLM’s to Ableton Live. Contribute to hidingwill/AbletonBridge development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/pinned-octocat-093da3e6fa40-22.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">hidingwill</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/AbletonBridge" alt="" onerror="this.style.display = 'none'"></div></a></figure><p>Below are the commands I used to install it on macOS.</p><hr><h2 id="1-install-uv-required">1. Install uv (required)</h2><p>The MCP server runs via <a href="https://github.com/astral-sh/uv?ref=danielraffel.me">uv</a>.</p><pre><code class="language-bash">curl -LsSf https://astral.sh/uv/install.sh | sh
</code></pre><hr><h2 id="2-install-the-remote-script">2. Install the Remote Script</h2><p>Clone the repo and copy the Remote Script into Ableton's User Library:</p><pre><code class="language-bash">git clone https://github.com/hidingwill/AbletonBridge.git ~/Code/AbletonBridge

# Ensure the Remote Scripts folder exists, then copy the script in
mkdir -p "$HOME/Music/Ableton/User Library/Remote Scripts"

cp -R ~/Code/AbletonBridge/AbletonBridge_Remote_Script \
  "$HOME/Music/Ableton/User Library/Remote Scripts/AbletonBridge"
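
# Confirm the Remote Script landed where Ableton looks for it
ls "$HOME/Music/Ableton/User Library/Remote Scripts/AbletonBridge"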
</code></pre><hr><h2 id="3-register-the-mcp-server">3. Register the MCP server</h2><p>I use Claude and Codex, so I set them both up:</p><pre><code class="language-bash">claude mcp add --scope user AbletonBridge -- uv run --directory "$HOME/Code/AbletonBridge" ableton-bridge
codex mcp add AbletonBridge -- uv run --directory "$HOME/Code/AbletonBridge" ableton-bridge
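
# Confirm registration (assumes both CLIs expose an `mcp list` subcommand)
claude mcp list
codex mcp list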
</code></pre><hr><h2 id="4-configure-ableton-live">4. Configure Ableton Live</h2><p>Launch&nbsp;<strong>Ableton Live</strong>, then go to:</p><p><strong>Settings → Link, Tempo &amp; MIDI</strong></p><p>Set:</p><pre><code class="language-bash">Control Surface: AbletonBridge
Input: None
Output: None</code></pre><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2026/03/image.png" class="kg-image" alt="" loading="lazy" width="1348" height="1458" srcset="https://danielraffel.me/content/images/size/w600/2026/03/image.png 600w, https://danielraffel.me/content/images/size/w1000/2026/03/image.png 1000w, https://danielraffel.me/content/images/2026/03/image.png 1348w" sizes="(min-width: 720px) 720px"></figure><hr><h2 id="4-ask-your-agent-to-control-ableton">5. Ask Your Agent to Control Ableton</h2><p>After installing the MCP server, quit and reopen your agent so it registers the server. For example, if you’re using Claude, close and relaunch it.</p><p>Then open Ableton Live and ask the agent to do something like:</p><pre><code class="language-bash">set up a new session with a 909 drum track at 148 BPM, add a vintage-style synth, a 70s-style bass sound, and generate a few simple melodies</code></pre><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/03/image-1.png" class="kg-image" alt="" loading="lazy" width="2000" height="1111" srcset="https://danielraffel.me/content/images/size/w600/2026/03/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2026/03/image-1.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/03/image-1.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/03/image-1.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Starting the prompt in Claude</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/03/image-4.png" class="kg-image" alt="" loading="lazy" width="2000" height="1111" srcset="https://danielraffel.me/content/images/size/w600/2026/03/image-4.png 600w, https://danielraffel.me/content/images/size/w1000/2026/03/image-4.png 1000w, 
https://danielraffel.me/content/images/size/w1600/2026/03/image-4.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/03/image-4.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">When the prompt finished it explained what it did in Ableton</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/03/image-2.png" class="kg-image" alt="" loading="lazy" width="2000" height="1307" srcset="https://danielraffel.me/content/images/size/w600/2026/03/image-2.png 600w, https://danielraffel.me/content/images/size/w1000/2026/03/image-2.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/03/image-2.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/03/image-2.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Opening the Ableton Session that Claude setup</span></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe width="200" height="150" src="https://www.youtube.com/embed/Q00VmUNjg4I?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen="" title="Live controlled by AbletonBridge MCP"></iframe><figcaption><p><span style="white-space: pre-wrap;">What it sounds like</span></p></figcaption></figure><p>I don’t see myself actually using this, but it’s pretty interesting that it’s even possible. 
It’s impressive that Ableton exposes a mature API that makes this kind of integration feasible.</p><p>While it wouldn’t really inspire me to make music this way, I could see it being useful for learning Ableton interactively or for automatically setting up more complex sessions.</p><hr><h2 id="windows-note"><strong>Windows note</strong></h2><p>If you’re on Windows and the install fails, <a href="https://github.com/ahujasid/ableton-mcp/issues/6?ref=danielraffel.me" rel="noreferrer">review this issue</a> from the original source repository (this post links to a more developed fork) before proceeding. The <strong>Remote Script path may differ</strong>, and installing into the&nbsp;<strong>User Library</strong>&nbsp;may not work on some systems.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ The Hardware Liberation Front ]]></title>
        <description><![CDATA[ I’ve been spending some of my free time writing software to keep perfectly good hardware from becoming e-waste. ]]></description>
        <link>https://danielraffel.me/2026/03/11/the-hardware-liberation-front/</link>
        <guid isPermaLink="false">69b1c625f8f36e88e7c8e7db</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 11 Mar 2026 13:15:13 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/03/hardware-liberation-front.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I’ve been spending some of my free time writing software to keep perfectly good hardware from becoming e-waste.</p><p>Some of that work includes:</p><ul><li>Writing modern&nbsp;<a href="https://developer.apple.com/documentation/driverkit?ref=danielraffel.me" rel="noreferrer"><strong>DriverKit</strong></a><strong> (USB &amp; Thunderbolt) audio drivers</strong>&nbsp;and companion desktop software for a legacy audio interface that was sunset by its original developers. It now works again on&nbsp;<strong>macOS 26</strong>, and hopefully future releases as well. <em>Currently in active testing before proposing to the rights owner next steps and how it might be publicly released.</em></li><li>Working with a company on&nbsp;<strong>new firmware for a hardware sequencer</strong>&nbsp;that is no longer being maintained. <em>Still in the exploratory phase, with source code access expected soon.</em></li><li>Releasing&nbsp;<a href="https://www.generouscorp.com/printer-bridge/?ref=danielraffel.me" rel="noreferrer"><strong>Printer Bridge</strong></a>, a macOS utility that enables&nbsp;<strong>AirPrint support for printers that already work with a Mac</strong>. <em>A small afternoon side project that came together quickly while I was doing other things.</em></li></ul><p>It took about six months of outreach to build enough trust with one of these companies for them to feel comfortable sharing the source code with me. The process taught me a lot—especially about diplomacy and how to earn support from companies who have very little direct upside and every reason to be hesitant.</p><p>When I mentioned the timeline to someone who doesn’t know me well, they laughed at me and said that I don’t value my time.</p><p>I disagree.</p><p>One of the best things we can do as stewards of the planet is keep useful things alive for as long as possible. 
Not only that, I love learning how things work by cracking them open and rebuilding them!</p><p>There is so much hardware that still works. It still sounds great, prints fine, feels good to use, or solves a real problem. What usually disappears isn’t the hardware — it’s the software around it. A driver stops getting updated. A desktop app falls behind. A missing feature makes something harder to use in a modern setup. Slowly, something valuable starts looking obsolete.</p><p>The good news is that&nbsp;<strong>software has never been easier to make</strong>. The tools are better, iteration is faster, and more people have the skills to build software that extends the life of existing devices.</p><p>Sometimes I reach out to companies that can no longer maintain older products and ask if they’d consider letting me help. The goal is simple: earn trust from the team, move slowly, and explore whether the software can be modernized without creating any work for them. If open sourcing becomes possible, even better. It's early days, but I've done this a few times in the past six months and hope to do it a lot more!</p><p>There are enthusiastic users, customers, and developers who would gladly help keep great hardware alive. That’s good for the hardware, good for the people who bought it, and good for the companies that made it.</p><p>Lately I’ve been calling this idea the&nbsp;<strong>Hardware Liberation Front</strong>.</p><p>A loose group of people who write software to keep great hardware alive for as long as possible—drivers, firmware updates, utilities, compatibility layers, bridges between old devices and modern systems.</p><p>This idea isn’t new. People like <a href="https://en.wikipedia.org/wiki/Yvon_Chouinard?ref=danielraffel.me" rel="noreferrer">Yvon Chouinard</a> have encouraged repairing and extending the life of things for decades. And the open-source community has long done the same in software, maintaining the code much of the modern world runs on. 
What’s changing now is that better tools—and AI in particular—make it possible for many more people (like me) to easily participate.</p><p>If the idea resonates with you, join in.</p><p>Keep your gear alive.<br>Help someone else keep theirs alive.<br>Write an update. Share the tool. Open source the fix.</p><p>A little software can go a long way.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ A Claude Code Plugin for Building JUCE Audio Plugins ]]></title>
        <description><![CDATA[ A while back I wrote about how to start developing audio apps and plugins on macOS using the JUCE-Plugin-Starter I created. I&#39;ve since turned that workflow into a Claude Code plugin called juce-dev. ]]></description>
        <link>https://danielraffel.me/2026/03/06/a-claude-code-plugin-for-building-juce-audio-plugins/</link>
        <guid isPermaLink="false">69aa407313b74a036ec51784</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 05 Mar 2026 19:20:12 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/03/40C91E63-F528-4C0D-982B-D9B85D77DFB4.png" medium="image"/>
        <content:encoded><![CDATA[ <blockquote><strong>TL;DR:</strong>&nbsp;If you want to build audio apps/plugins on macOS (with GPU UIs via Metal), try the&nbsp;<a href="https://www.generouscorp.com/generous-corp-marketplace/plugins/juce-dev/?ref=danielraffel.me" rel="noreferrer"><strong>juce-dev</strong></a>&nbsp;Claude Code plugin. It automates all the boilerplate so you can go from zero to a compiling plugin with one command.</blockquote><p>A while back I wrote about <a href="https://danielraffel.me/2025/05/30/how-to-start-developing-audio-plugins-on-macos/">how to start developing audio apps and plugins on macOS</a> using the <a href="https://github.com/danielraffel/JUCE-Plugin-Starter?ref=danielraffel.me">JUCE-Plugin-Starter</a> I created. That template automates a lot of the tedious setup — dependency installation, project scaffolding, code signing, Xcode project generation — through a set of shell scripts.</p><p>I've since turned that workflow into a Claude Code plugin called <a href="https://www.generouscorp.com/generous-corp-marketplace/plugins/juce-dev/?ref=danielraffel.me" rel="noreferrer"><strong>juce-dev</strong></a>. 
Instead of running scripts in your terminal and answering prompts, you type <code>/juce-dev:create "My Plugin"</code> in Claude Code and it walks you through the whole thing interactively.</p><h2 id="what-it-does">What it does</h2><p>The plugin's <code>create</code> command handles the full project setup:</p><ul><li>Checks that your Mac has the required tools (Xcode CLT, Homebrew, CMake) and offers to install anything missing</li><li>Finds your JUCE-Plugin-Starter template and checks if your JUCE version is current</li><li>Pulls your developer settings (Apple ID, Team ID, certificates) from the template's <code>.env</code> so you don't re-enter them for every project</li><li>Generates all the derived values — class names, bundle IDs, 4-letter JUCE plugin codes — from the plugin name you provide</li><li>Creates a new project folder, replaces template placeholders, initializes git, and optionally creates a public or private GitHub repo</li><li>Optionally sets up <a href="https://github.com/danielraffel/JUCE-Plugin-Starter/tree/main/Tools/DiagnosticKit?ref=danielraffel.me" rel="noreferrer">DiagnosticKit</a>, a Swift app that can collect user bug reports and publish diagnostics with attachments via GitHub Issues (and walks you through configuring the token you'll need for that repo)</li></ul><p>There are also commands for building (<code>/juce-dev:build</code>) and adding Visage or iOS targets to existing projects.</p><h2 id="visage-gpu-ui">Visage GPU UI</h2><p>The plugin has built-in support for <a href="https://github.com/VitalAudio/visage?ref=danielraffel.me">Visage</a>, a GPU-accelerated UI framework that renders via <a href="https://developer.apple.com/metal/?ref=danielraffel.me" rel="noreferrer">Metal</a>. 
Pass <code>--visage</code> when creating a project and the plugin will clone <a href="https://github.com/danielraffel/visage?ref=danielraffel.me">my Visage fork</a> (which includes <a href="https://github.com/danielraffel/visage/pull/11?ref=danielraffel.me" rel="noreferrer">iOS touch support</a> and several DAW compatibility patches), copy the JUCE↔Visage bridge files, wire up CMake, and give you a project that renders with Metal out of the box.</p><p>The plugin includes a companion skill called <a href="https://github.com/danielraffel/generous-corp-marketplace/tree/master/skills/claude/juce-visage?ref=danielraffel.me" rel="noreferrer"><strong>juce-visage</strong></a> that provides detailed guidance for working with Visage inside <a href="https://github.com/juce-framework/JUCE?ref=danielraffel.me" rel="noreferrer">JUCE</a> — things like embedding an MTKView in a plugin window, bridging keyboard and mouse events between JUCE and Visage, handling focus in DAW hosts, building popups and modals inside the GPU layer, and managing the tricky destruction ordering that Metal's display link requires.</p><h2 id="who-this-is-for">Who this is for</h2><p>Primarily people who are developing macOS or iOS audio apps and:</p><ul><li>Want to build an audio plugin but haven't set up their Mac for it yet</li><li>Have never worked with JUCE before and want to skip the boilerplate</li><li>Want to get from zero to a compiling, running plugin as quickly as possible</li></ul><p>You don't need an <a href="https://developer.apple.com/?ref=danielraffel.me" rel="noreferrer">Apple Developer</a> account to build and test locally. 
If you do have one, the plugin will configure code signing and notarization so your builds are ready to distribute.</p><p>If you already know JUCE well, the plugin still saves time on the repetitive project setup — generating Xcode projects, managing <code>.env</code> configurations, and wiring up optional dependencies like Visage.</p><h2 id="codex-users">Codex users</h2><p>The plugin is for Claude Code, but if you use OpenAI's Codex CLI, there's a standalone <strong>juce-visage</strong> skill you can install. It provides the same Visage integration guidance (Metal embedding, event bridging, iOS touch handling, etc.) but not the automated project creation commands.</p><p>Install it with a sparse checkout:</p><pre><code class="language-bash">mkdir -p ~/.codex/skills
cd ~/.codex/skills

git clone --no-checkout --depth 1 --filter=blob:none --sparse \
  https://github.com/danielraffel/generous-corp-marketplace.git tmp-juce-visage
cd tmp-juce-visage

git sparse-checkout set skills/codex/juce-visage
git checkout

mv skills/codex/juce-visage ~/.codex/skills/

cd ~/.codex/skills
rm -rf tmp-juce-visage
</code></pre> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Sparse Checkout the Claude Code Skill Creator ]]></title>
        <description><![CDATA[ Anthropic released skill-creator, a skill for creating and iteratively improving other skills. If you’re not familiar with building skills, it’s worth trying out. Here&#39;s how to pull just that folder from the GitHub repo without cloning the entire thing using Git’s sparse checkout. ]]></description>
        <link>https://danielraffel.me/til/2026/03/05/how-to-sparse-checkout-the-claude-code-skill-creator/</link>
        <guid isPermaLink="false">69aa06e9b66f8216f730004e</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 05 Mar 2026 14:54:39 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/03/FA6E88A8-CADB-45E9-B04A-1D79B7693F86.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Anthropic released <a href="https://github.com/anthropics/skills/tree/main/skills/skill-creator?ref=danielraffel.me" rel="noreferrer">skill-creator</a>, a <a href="https://platform.claude.com/docs/en/agents-and-tools/agent-skills/overview?ref=danielraffel.me" rel="noreferrer">skill</a> for creating and iteratively improving other skills. I’ve been using it and found it really useful when writing skills.</p><p>If you’re not familiar with building skills, it’s worth trying out. You can pull just that folder from the GitHub repo without cloning the entire thing using Git’s sparse checkout:</p><pre><code class="language-bash">mkdir -p ~/.claude
cd ~/.claude

git clone --no-checkout --depth 1 --filter=blob:none --sparse https://github.com/anthropics/skills.git tmp-skills
cd tmp-skills

git sparse-checkout set skills/skill-creator
git checkout

mkdir -p ~/.claude/skills
mv skills/skill-creator ~/.claude/skills/

cd ..
rm -rf tmp-skills</code></pre><p>This clones only the repository metadata and checks out just the <code>skills/skill-creator</code> folder.</p><p><code>--filter=blob:none</code> avoids downloading file contents until needed, and <code>--depth 1</code> limits the clone to the latest commit.</p><p>The folder is then moved into <code>~/.claude/skills/</code> so Claude can load it.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ CGEvent Taps and Code Signing: The Silent Disable Race ]]></title>
        <description><![CDATA[ If you have built a macOS app that intercepts media keys or system-wide keyboard events it can be a nightmare: the tap appears to install successfully, but after re-signing the binary and launching via the Dock or Finder, events never fire. Here’s what was happening to me, and how I fixed it. ]]></description>
        <link>https://danielraffel.me/til/2026/02/19/cgevent-taps-and-code-signing-the-silent-disable-race/</link>
        <guid isPermaLink="false">6996bf3e2bd106036e4221b1</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 18 Feb 2026 23:51:51 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/02/FB40C9EC-3593-4506-A729-BCAAA44336D1.png" medium="image"/>
        <content:encoded><![CDATA[ <p>If you’ve built a macOS tool that intercepts media keys or system-wide keyboard events using <code>CGEvent.tapCreate()</code>, you may have hit this: the tap appears to install successfully, but after re-signing the app and launching via the Dock/Finder (or <code>open</code>), events never fire. Pressing volume keys does nothing. No crash. No error. The tap just goes quiet.</p><p>Here’s what was happening for me and what fixed it.</p><h3 id="quick-repro">Quick repro</h3><ul><li>Re-sign the app → launch via Finder/Dock or <code>open MyApp.app</code> → tap installs but no events fire</li><li>Launch the Mach-O directly (<code>.../Contents/MacOS/MyApp</code>) → events fire normally</li></ul><h2 id="the-setup">The Setup</h2><p>Media keys (play/pause, volume, brightness) often show up as “system defined” events (<code>NX_SYSDEFINED</code>) at the CoreGraphics layer. I intercepted them with a session event tap:</p><pre><code class="language-swift">let eventMask: CGEventMask = (1 &lt;&lt; CGEventType.systemDefined.rawValue)

let tap = CGEvent.tapCreate(
    tap: .cgSessionEventTap,
    place: .headInsertEventTap,
    options: .defaultTap,
    eventsOfInterest: eventMask,
    callback: eventCallback,
    userInfo: nil
)</code></pre><p>In my case,&nbsp;<code>NSEvent.addGlobalMonitorForEvents(matching: .systemDefined)</code>&nbsp;was not reliable for the media keys I needed, so I used a CoreGraphics event tap.</p><h2 id="the-bug-what-i-observed"><strong>The Bug (What I Observed)</strong></h2><p>After re-signing a new build (e.g.&nbsp;<code>codesign --force --deep</code>) and launching via Finder/Dock or&nbsp;open MyApp.app:</p><ul><li><code>tapCreate()</code>&nbsp;returned a non-nil&nbsp;CFMachPort</li><li><code>CGEvent.tapIsEnabled()</code>&nbsp;initially returned&nbsp;true</li><li>then… no callbacks ever fired</li></ul><p>Launching the binary directly worked:</p><pre><code class="language-bash">/Applications/MyApp.app/Contents/MacOS/MyApp</code></pre><p>In my setup, the failure correlated with Launch Services launches after re-signing, and behaved like a permission/trust evaluation issue: the tap existed, but didn’t receive events, and the usual "disabled" callback path wasn’t dependable.</p><h2 id="the-fix"><strong>The Fix</strong></h2><ol><li><strong>Deployment: launch the binary directly (avoid open)</strong></li></ol><pre><code class="language-bash">nohup /Applications/MyApp.app/Contents/MacOS/MyApp \
&gt;&gt; ~/Library/Logs/MyApp.log 2&gt;&amp;1 &amp; disown</code></pre><p>The app still appears in the Dock and behaves like a normal app launch; the main difference is stdout/stderr goes to the log file.</p><ol start="2"><li><strong>Safety net: continuously verify tap health and recover</strong></li></ol><pre><code class="language-swift">Timer.scheduledTimer(withTimeInterval: 5.0, repeats: true) { _ in
    guard let tap = self.eventTap else { return }

    if !CGEvent.tapIsEnabled(tap: tap) {
        CGEvent.tapEnable(tap: tap, enable: true)

        // If tapEnable doesn't stick, reinstall the tap:
        // remove from RunLoop, create new tap, re-add.
        if !CGEvent.tapIsEnabled(tap: tap) {
            self.reinstallEventTap()
        }
    }
}</code></pre><p>And handle both disable events in the callback:</p><pre><code class="language-swift">if type == .tapDisabledByTimeout || type == .tapDisabledByUserInput {
    if let tap = info?.tap {
        CGEvent.tapEnable(tap: tap, enable: true)
    }
    return nil
}</code></pre><h2 id="what-to-check-first-permissions"><strong>What to Check First (Permissions)</strong></h2><p>Before installing the tap, verify Input Monitoring (“ListenEvent”) permission:</p><pre><code class="language-swift">if !CGPreflightListenEventAccess() {
    CGRequestListenEventAccess()
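    // Note (my assumption, based on observed behavior): CGPreflightListenEventAccess()
    // only reads the current TCC state and never prompts; CGRequestListenEventAccess()
    // is what triggers the system dialog, and a new grant may not take effect
    // until the app is relaunched.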
}</code></pre><p>Input Monitoring is required to&nbsp;<em>listen</em>&nbsp;for global events. Also note: if you use&nbsp;<code>.defaultTap</code>&nbsp;(not&nbsp;<code>.listenOnly</code>), you may additionally need Accessibility permission for full interception/interaction. Missing permissions can look identical to the "tap installed but no events" symptom.</p><h2 id="summary"><strong>Summary</strong></h2>
<!--kg-card-begin: html-->
<table><thead><tr><th><p class="p1"><b>Symptom</b></p></th><th><p class="p1"><b>Likely cause</b></p></th></tr></thead><tbody><tr><td><p class="p1">Tap non-nil but no events, direct Terminal launch works</p></td><td><p class="p1">Identity/permission/trust differences after re-signing + Launch Services launch path (observed)</p></td></tr><tr><td><p class="p1">Tap is <code>nil</code></p></td><td><p class="p1">Not permitted for that tap location/type, or creation failed</p></td></tr><tr><td><p class="p1">Tap dies mid-session</p></td><td><p class="p1">Timeout/user-input disable; recover in callback and via health checks</p></td></tr><tr><td><p class="p1"><code>NSEvent</code> monitor misses keys</p></td><td><p class="p1">Not reliable for your target keys; use CoreGraphics tap</p></td></tr></tbody></table>
<!--kg-card-end: html-->
<p><strong>Key insight:</strong>&nbsp;a non-nil tap is not a healthy tap. Always verify&nbsp;<code>tapIsEnabled</code>&nbsp;at runtime, not just at install time.</p><h2 id="working-assumptions-based-on-observed-behavior">Working assumptions (based on observed behavior)</h2><p>I can’t prove the exact internal mechanism, but given the repeatable pattern in my setup, I’m operating under these assumptions:</p><ul><li><strong>TCC decisions are tied to code identity</strong>, and re-signing can effectively create a "new" identity that requires re-evaluation (or re-granting) for Input Monitoring / Accessibility.</li><li><strong>Launching via Launch Services</strong> (<code>open</code>, Finder/Dock) is more likely to trigger that identity/permission re-evaluation than launching the Mach-O directly.</li><li>When this happens, <strong>a <code>CGEvent</code> tap can exist but be functionally inert</strong> (no callbacks), and the normal "tap disabled" callback path may not reliably fire—so <strong>health checks + reinstall</strong> are the pragmatic mitigation.</li></ul><p>If the issue reproduces, the fastest way to validate the hypothesis is to compare:</p><ul><li>direct exec vs <code>open</code></li><li><code>tapIsEnabled</code> over time</li><li>current TCC grants for Input Monitoring / Accessibility after each re-sign</li></ul><h2 id="optional-mitigation-delay-tap-installation">Optional mitigation: delay tap installation</h2><p>In my setup, adding a short delay before installing the tap reduced failures after re-sign + Launch Services launch. I treat this as a best-effort workaround (timing-dependent), not the core fix.</p><pre><code class="language-swift">DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    self.installEventTap()
}</code></pre><p>Or (often safer): install immediately, then recheck/reinstall after a short delay:</p><pre><code class="language-swift">installEventTap()
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    verifyOrReinstallTap()
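    // verifyOrReinstallTap() is a helper sketched here, not shown above: if the
    // tap is nil or !CGEvent.tapIsEnabled(tap:), remove the stale RunLoop source,
    // call CGEvent.tapCreate(...) again, and re-add the new source to the RunLoop.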
}</code></pre> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Wemo + Google Assistant: Back From the Dead (And Faster Than Before) ]]></title>
        <description><![CDATA[ I have a house full of Wemo switches, dimmers, and plugs, and I use Google Home speakers in pretty much every room. &quot;Hey Google, turn off the kitchen light&quot; was part of my daily routine — until Belkin pulled the plug. Here&#39;s how I restored service. ]]></description>
        <link>https://danielraffel.me/til/2026/02/19/wemo-google-assistant-back-from-the-dead-and-faster-than-before/</link>
        <guid isPermaLink="false">69965c9373674603719f752e</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 18 Feb 2026 16:51:48 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/02/1CBCCA4A-FE33-4050-8419-7BEDA9ED0B47.png" medium="image"/>
<content:encoded><![CDATA[ <p>I have a house full of Wemo switches, dimmers, and plugs, and I use Google Home speakers in pretty much every room. "Hey Google, turn off the kitchen light" was part of my daily routine — until Belkin pulled the plug.</p><p>On January 31, 2026, Belkin, the company behind the Wemo–Google Home integration, <a href="https://support.google.com/googlenest/answer/9159862?hl=en&ref=danielraffel.me" rel="noreferrer">turned down the integration</a>, and I guess Google decided not to step in with a solution.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2026/02/image-11.png" class="kg-image" alt="" loading="lazy" width="1430" height="368" srcset="https://danielraffel.me/content/images/size/w600/2026/02/image-11.png 600w, https://danielraffel.me/content/images/size/w1000/2026/02/image-11.png 1000w, https://danielraffel.me/content/images/2026/02/image-11.png 1430w" sizes="(min-width: 720px) 720px"></figure><p>Overnight, my devices went offline in the Google Home app and stopped responding to voice commands.</p><h3 id="finding-the-fix">Finding the Fix</h3><p>A <a href="https://www.reddit.com/r/WeMo/comments/1qwoygy/google_speaker_voice_commands_back_up_and_running/?ref=danielraffel.me" rel="noreferrer">Reddit post</a> pointed me in the right direction. People were restoring full voice control using Home Assistant and the Matterbridge add-on to bridge their devices to Google Home via the Matter protocol. I already had Home Assistant running with my Wemo devices connected through the built-in <a href="https://www.home-assistant.io/integrations/wemo?ref=danielraffel.me" rel="noreferrer">Belkin WeMo&nbsp;integration</a> — which uses UPnP and never touched Belkin's cloud — so my devices were already working in HA. 
I just needed to expose them to Google Home.</p><h3 id="the-setup">The Setup</h3><ol><li><strong>(Only if required) Update HA Supervisor</strong><br>When I first tried to add a third-party add-on repository, I hit this error: <code>'StoreManager.add_repository' blocked from execution, supervisor needs to be updated first</code><br>Fix: go to Settings → System → Updates and update the Supervisor before doing anything else.</li><li><strong>Add the Matterbridge repository</strong><br>In Home Assistant: Settings → Add-ons → Add-on Store, tap the overflow menu (⋮) in the top right, select Repositories, and add: <a href="https://github.com/Luligu/matterbridge-home-assistant-addon?ref=danielraffel.me"><code>https://github.com/Luligu/matterbridge-home-assistant-addon</code></a></li><li><strong>Install and start Matterbridge</strong><br>Find Matterbridge Home Assistant Application in the store, install it, then start it and open the Web UI.</li><li><strong>Install the matterbridge-hass plugin</strong><br>In the Matterbridge UI, install the matterbridge-hass plugin. Configure it with:<br>- Your Home Assistant URL (e.g. <a href="http://homeassistant.local:8123/?ref=danielraffel.me">http://homeassistant.local:8123</a>)<br>- A long-lived access token (HA → your profile → Security → Create token)<br><br>Restart the Matterbridge add-on to let it load your HA entities.</li><li><strong>Pair with Google Home</strong><br>In the Google Home app, tap + → Set up device → Matter device, then scan the QR code shown in the Matterbridge add-on. Walk through the pairing flow and assign devices to rooms.</li></ol><h3 id="end-result">End Result</h3><p>All of my Wemo devices are back in Google Home, online, and responding to voice commands. Pleasant surprise: they're noticeably faster than before. The old Belkin integration routed commands through the cloud. 
Now everything runs locally over Matter — "Hey Google, turn off the backyard light" and it's instant.</p><p>If you're a Wemo user who lost Google Assistant control in February 2026, this setup is worth the 15 minutes it takes. You get your voice commands back, you get local control, and you end up in a better place than you started.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ I Built a WebDriver for WKWebView Tauri Apps on macOS ]]></title>
<description><![CDATA[ I&#39;ve been exploring Tauri lately – the framework for building desktop (and mobile) apps using web technologies with a Rust backend. I built an open-source W3C WebDriver implementation to make it easy to test Tauri apps on macOS. ]]></description>
        <link>https://danielraffel.me/2026/02/14/i-built-a-webdriver-for-wkwebview-tauri-apps-on-macos/</link>
        <guid isPermaLink="false">698ff0f060eedb036bf4d2ae</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 13 Feb 2026 20:43:32 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/02/0A590B8E-BAD8-433E-9904-548483C7871A.png" medium="image"/>
<content:encoded><![CDATA[ <blockquote><strong>TL;DR:</strong>&nbsp;Building a Tauri app on macOS? <a href="https://github.com/danielraffel/tauri-webdriver?ref=danielraffel.me" rel="noreferrer">Tauri-WebDriver</a> is an open-source WebDriver + MCP integration for E2E testing—so tools (and agents) can click/type/screenshot like they would in a browser. Built with Claude Code, <a href="https://github.com/danielraffel/tauri-webdriver?tab=readme-ov-file&ref=danielraffel.me#quick-start" rel="noreferrer">follow the quick start</a> to try it out.</blockquote><p>I've been exploring <a href="https://v2.tauri.app/?ref=danielraffel.me">Tauri</a> lately – the framework for building desktop (and mobile) apps using web technologies with a <a href="https://rust-lang.org/?ref=danielraffel.me" rel="noreferrer">Rust</a> backend. If you haven't come across it: you write your UI in <a href="https://react.dev/?ref=danielraffel.me" rel="noreferrer">React</a>/<a href="https://vuejs.org/?ref=danielraffel.me" rel="noreferrer">Vue</a>/<a href="https://svelte.dev/?ref=danielraffel.me" rel="noreferrer">Svelte</a>/whatever, let Rust handle the backend, and ship a native binary that's a fraction of the size of an <a href="https://www.electronjs.org/?ref=danielraffel.me" rel="noreferrer">Electron</a> app.</p><h2 id="why-tauri">Why Tauri?</h2><p>Honestly, I was just tired of <a href="https://developer.apple.com/xcode/?ref=danielraffel.me" rel="noreferrer">Xcode</a> build times and wanted the speed of web development. If you've ever maintained separate native codebases for desktop and mobile, you know the pain. Tauri lets you use the web stack you already know while still getting native-feeling performance and access to system APIs. Rust gives you a memory-safe native backend (no bundled browser runtime) and the kind of mature tooling you want when dealing with lots of data or running AI workloads locally. 
You're not shipping a bundled Chromium – Tauri uses the platform's native webview (<a href="https://developer.apple.com/documentation/webkit/wkwebview?ref=danielraffel.me" rel="noreferrer">WKWebView</a> on macOS, <a href="https://webkitgtk.org/?ref=danielraffel.me" rel="noreferrer">WebKitGTK</a> on Linux, <a href="https://developer.microsoft.com/en-us/microsoft-edge/webview2/?form=MA13LH&ref=danielraffel.me" rel="noreferrer">WebView2</a> on Windows), so binaries stay small and resource usage stays low.</p><p>It seems to be a popular choice right now. Apps like <a href="https://www.codexmonitor.app/?ref=danielraffel.me">Codex Monitor</a> (a desktop command center for orchestrating AI coding agents) and <a href="https://www.conductor.build/?ref=danielraffel.me">Conductor</a> (parallel Claude Code and Codex agents on your Mac) are shipping with Tauri. The <a href="https://github.com/tauri-apps/awesome-tauri?ref=danielraffel.me">awesome-tauri</a> list has many more across a variety of categories.</p><p>For my first project, I'm using Tauri v2 with <a href="https://vite.dev/?ref=danielraffel.me">Vite</a> as the frontend build tool. Vite is a fast dev server and bundler – it uses native <a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules?ref=danielraffel.me" rel="noreferrer">ES modules</a> during development so you get near-instant hot module replacement instead of waiting for a full rebuild. It pairs well with Tauri since both prioritize speed.</p><h2 id="the-testing-problem">The Testing Problem</h2><p>I’ve gotten spoiled by automated testing. I’ve grown used to asking agents to spin up MCP servers so they can drive browser tooling (<a href="https://github.com/ChromeDevTools/chrome-devtools-mcp?ref=danielraffel.me" rel="noreferrer">Chrome DevTools MCP–style</a>) and native build/test stacks (<a href="https://www.xcodebuildmcp.com/?ref=danielraffel.me" rel="noreferrer">XcodeBuildMCP-style</a>) and validate flows for me. 
In web dev, you point <a href="https://www.selenium.dev/documentation/webdriver/browsers/safari/?ref=danielraffel.me" rel="noreferrer">Selenium</a> or <a href="https://webdriver.io/?ref=danielraffel.me" rel="noreferrer">WebDriverIO</a> at your app, click a few buttons, assert some text, take screenshots, and you’re done. I wanted that exact workflow for my Tauri app.</p><p>But on macOS there isn’t an Apple-provided WebDriver for embedded WKWebView apps—Apple’s WebDriver story is <a href="https://developer.apple.com/documentation/webkit/testing-with-webdriver-in-safari?ref=danielraffel.me" rel="noreferrer">safaridriver</a> for automating Safari itself, which doesn’t help when your UI is a WKWebView inside a desktop app. Linux has WebKitWebDriver, Windows has Edge WebDriver, and the official Tauri WebDriver stack supports those—but macOS requires a third-party driver or a custom bridge.</p><p>I wasn’t the only one frustrated by this. Around the same time I started building a solution, a couple of other projects popped up tackling the same problem (more on those below). But I’d already gone deep enough that it made sense to finish.</p><h2 id="what-i-built">What I Built</h2><p><a href="https://github.com/danielraffel/tauri-webdriver?ref=danielraffel.me">Tauri-WebDriver</a> is an open-source <a href="https://www.w3.org/TR/webdriver1/?ref=danielraffel.me" rel="noreferrer">W3C WebDriver v1</a> implementation for Tauri apps on macOS, built in collaboration with Claude Code. It’s two Rust crates:</p><ol><li><a href="https://crates.io/crates/tauri-plugin-webdriver-automation?ref=danielraffel.me" rel="noreferrer">A Tauri plugin</a> that runs inside your app in debug builds. 
It starts a little HTTP server, injects a JavaScript bridge into the webview, and exposes endpoints for finding elements, clicking things, reading text, managing windows, executing scripts, etc.</li><li><a href="https://crates.io/crates/tauri-webdriver-automation?ref=danielraffel.me" rel="noreferrer">A CLI binary</a> (<code>tauri-wd</code>) that speaks the standard W3C WebDriver protocol on port 4444. WebDriverIO, Selenium, or any W3C-compatible test runner connects to it like they would any browser driver. The CLI launches your app, discovers the plugin's port, and translates every WebDriver command into a plugin API call.</li></ol><pre><code class="language-bash">WebDriverIO/Selenium ──HTTP:4444──&gt; tauri-wd CLI ──HTTP──&gt; plugin inside your app
</code></pre><p>It’s not “100% of WebDriver,” but it covers a <a href="https://github.com/danielraffel/tauri-webdriver?tab=readme-ov-file&ref=danielraffel.me#supported-w3c-webdriver-operations" rel="noreferrer">surprisingly large subset</a>: element finding, input, screenshots, actions, windows, cookies, iframes/shadow DOM, alerts, print-to-PDF, and computed ARIA roles.</p><h2 id="ai-powered-testing-with-mcp">AI-Powered Testing with MCP</h2><p>One thing I'm particularly excited about is the MCP (Model Context Protocol) integration. I connected tauri-webdriver to <a href="https://github.com/danielraffel/mcp-tauri-automation?ref=danielraffel.me">mcp-tauri-automation</a>, an MCP server that lets AI agents like Claude Code directly launch, inspect, and interact with your Tauri app. You can say "click the submit button, fill in the form, and take a screenshot" and it just works.</p><p>I <a href="https://github.com/Radek44/mcp-tauri-automation?ref=danielraffel.me">forked the original project</a> to add some features I needed -- <code>execute_script</code>, <code>get_page_title</code>, <code>get_page_url</code>, multi-strategy element selectors, configurable screenshot timeouts, and <code>wait_for_navigation</code> – and submitted those upstream.</p><h2 id="alternatives">Alternatives</h2><p>I'd be remiss not to mention the other solutions that exist:</p><ul><li><a href="https://docs.crabnebula.dev/plugins/tauri-e2e-tests/?ref=danielraffel.me#macos-support">CrabNebula Webdriver for Tauri</a> – A commercial hosted testing service with macOS WebDriver support.</li><li><a href="https://github.com/Choochmeque/tauri-plugin-webdriver?ref=danielraffel.me">tauri-plugin-webdriver</a> – An open-source Tauri plugin that embeds a WebDriver server directly in the plugin (single-crate vs. my two-crate approach). 
It supports <em>(or plans to support) </em><strong>macOS, Linux, and Windows</strong>, so if you need cross-platform WebDriver support, this is probably the more pragmatic choice.</li></ul><p>After starting this, I learned others were building similar things, so this was admittedly not the greatest use of time. But I learned more about Rust, Tauri’s plugin system, the W3C WebDriver spec, WKWebView quirks, and what’s happening under the hood when you capture screenshots in E2E tests.</p><h2 id="try-it">Try It</h2><p>If you're building a Tauri app on macOS and want automated e2e tests, <a href="https://github.com/danielraffel/tauri-webdriver?tab=readme-ov-file&ref=danielraffel.me#quick-start" rel="noreferrer">follow the quick start in the GitHub repo</a>. The <a href="https://github.com/danielraffel/tauri-webdriver?ref=danielraffel.me">README</a> also has additional information and a complete endpoint reference.</p><blockquote><em>Disclosure:</em>&nbsp;The code for this project was written in collaboration with Claude Code.</blockquote> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ XcodeBuildMCP now CLI friendly ]]></title>
        <description><![CDATA[ XcodeBuildMCP v2.0 was recently released, and it adds two optional skills you can install into supported CLIs (Codex, Claude, and Cursor). ]]></description>
        <link>https://danielraffel.me/til/2026/02/09/xcodebuildmcp-now-cli-friendly/</link>
        <guid isPermaLink="false">69896f4bf836a0a2273e4045</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 09 Feb 2026 11:02:21 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/02/1CDB7D5A-1C5D-4599-9F9E-D38344845568.png" medium="image"/>
<content:encoded><![CDATA[ <p><a href="https://www.xcodebuildmcp.com/?ref=danielraffel.me" rel="noreferrer">XcodeBuildMCP</a> v2.0 was recently released, introducing <a href="https://github.com/cameroncooke/XcodeBuildMCP?tab=readme-ov-file&ref=danielraffel.me#skills" rel="noreferrer">two optional agent skills</a> you can add to supported CLIs (Codex, Claude, and Cursor). Because the skills may overlap, you can only install&nbsp;<strong>one</strong>&nbsp;per agent.</p><p>For clarity, the two skills are:</p><ul><li><strong>MCP Skill:</strong>&nbsp;primes the agent on how to use the MCP server’s tools (optional if you’re already using the MCP server).</li><li><strong>CLI Skill:</strong>&nbsp;primes the agent on how to navigate the CLI (recommended if you’re using the CLI).</li></ul><p>My summary of the <a href="https://x.com/camsoft2000/status/2020961925061017779?ref=danielraffel.me" rel="noreferrer">author’s description of the tradeoffs</a>:&nbsp;</p><blockquote><strong>MCP is the better overall experience</strong>, but if you care more about <strong>token efficiency</strong>&nbsp;than&nbsp;<strong>predictability</strong>, choose the CLI. 
MCP is more stateful and can persist app configuration across sessions, so the agent doesn’t have to rediscover settings every time it makes a tool call.</blockquote><p>To install the scripts, run the command below in your terminal and follow the prompts:</p><pre><code>curl -fsSL https://raw.githubusercontent.com/cameroncooke/XcodeBuildMCP/v2.0.0/scripts/install-skill.sh -o install-skill.sh &amp;&amp; bash install-skill.sh</code></pre><p>Additional <a href="https://github.com/cameroncooke/XcodeBuildMCP/blob/main/docs/SKILLS.md?ref=danielraffel.me" rel="noreferrer">details about the skills</a> are available in the project docs.</p><h2 id="why-this-matters"><strong>Why this matters</strong></h2><p>Xcode workflows are getting a lot tighter: you can run tests, automate builds, and even publish App Store submissions straight from your IDE or terminal. XcodeBuildMCP (plus the new install scripts) pairs well with <a href="https://appstoreconnect.apple.com/?ref=danielraffel.me" rel="noreferrer">App Store Connect</a> automation tools like:</p><ul><li><a href="https://github.com/rudrankriyam/App-Store-Connect-CLI?ref=danielraffel.me" rel="noreferrer">App-Store-Connect-CLI</a>&nbsp;— a fast, lightweight, scriptable CLI for App Store Connect that lets you automate key iOS release steps directly from the terminal.</li></ul> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Got CodexMonitor Running on My iPhone Over Tailscale ]]></title>
<description><![CDATA[ CodexMonitor recently developed an iOS app for connecting to your projects running on your desktop over Tailscale. Until there&#39;s a TestFlight / AppStore build, you will need to build this yourself if you want to use it. Here&#39;s what I did to get it working. ]]></description>
        <link>https://danielraffel.me/til/2026/02/08/how-i-got-codex-monitor-running-on-my-iphone-over-tailscale/</link>
        <guid isPermaLink="false">6988e31f7c19474d692ff51f</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sun, 08 Feb 2026 11:53:24 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/02/3161B737-CC20-45C6-9F43-BAA2777DEAA3.png" medium="image"/>
<content:encoded><![CDATA[ <p><a href="https://www.codexmonitor.app/?ref=danielraffel.me" rel="noreferrer">CodexMonitor</a> recently developed an iOS app for connecting to your projects running on your desktop over Tailscale. Until there's a TestFlight / AppStore build, you will need to build this yourself if you want to use it. Here's what I did to get it working.</p><h2 id="pre-requisites">Pre-requisites</h2><ul><li><strong>Desktop app (prebuilt):</strong> <a href="https://www.codexmonitor.app/?ref=danielraffel.me">https://www.codexmonitor.app</a><br>I installed/updated this instead of building the desktop app myself (the latest builds include Tailscale support + the iOS work).</li><li><strong>iOS app (from source):</strong> <a href="https://github.com/Dimillian/CodexMonitor?ref=danielraffel.me">https://github.com/Dimillian/CodexMonitor</a><br>The iOS app isn’t on TestFlight, so I built it locally. It has dependencies, but I won't list them since building should resolve them.</li><li><a href="https://apps.apple.com/us/app/xcode/id497799835?mt=12+Xcode&ref=danielraffel.me" rel="noreferrer">Xcode</a> + <a href="https://developer.apple.com/documentation/xcode/installing-the-command-line-tools/?ref=danielraffel.me" rel="noreferrer">Command Line Tools</a><br>I used my <a href="https://developer.apple.com/programs/enroll/?ref=danielraffel.me" rel="noreferrer">Apple Developer Account</a> (TeamID, AppleID) to build and deploy to my device; I'm not sure if that's required.</li><li><a href="https://apps.apple.com/us/app/tailscale/id1475387142?mt=12+Tailscale&ref=danielraffel.me" rel="noreferrer">Tailscale on macOS</a> and <a href="https://apps.apple.com/us/app/tailscale/id1470499037%20Tailscale?ref=danielraffel.me" rel="noreferrer">Tailscale on iOS</a> logged in and connected</li><li><a href="https://github.com/openai/codex?ref=danielraffel.me" rel="noreferrer">Codex</a> installed and 
authorized to your account</li></ul><hr><h3 id="1-get-the-desktop-app-running-note-your-tailscale-ip">1. Get the desktop app running + note your Tailscale IP</h3><p>On the Mac where CodexMonitor runs, open Tailscale from the menu bar and copy your <strong>Tailscale IPv4</strong> (something like <code>100.x.y.z</code>); you can also run <code>tailscale ip -4</code> in a terminal. That’s the IP the iPhone will target.</p><p>I used that IP as the <strong>remote backend address</strong> inside CodexMonitor.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/02/image-3.png" class="kg-image" alt="" loading="lazy" width="2000" height="1307" srcset="https://danielraffel.me/content/images/size/w600/2026/02/image-3.png 600w, https://danielraffel.me/content/images/size/w1000/2026/02/image-3.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/02/image-3.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/02/image-3.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Update the Remote Backend IP and Port if the default is already in use</span></figcaption></figure><hr><h3 id="2-configure-codexmonitor-desktop-to-run-as-a-remote-daemon-and-start-mobile-access">2. 
Configure CodexMonitor desktop to run as a Remote Daemon and Start Mobile Access</h3><p>In <strong>CodexMonitor (desktop)</strong> go to:</p><p><strong>Settings → Server</strong></p><p>Set:</p><ul><li><strong>Backend mode:</strong> Remote (daemon)</li><li><strong>Remote backend IP:port:</strong> <code>&lt;TAILSCALE_IP&gt;:&lt;PORT&gt;</code><ul><li>I had to change the port (example below uses <code>4735</code>)</li></ul></li><li><strong>Token:</strong> set a token</li><li><strong>Mobile access daemon:</strong> Click <strong>Start daemon</strong></li><li><strong>Tailscale Helper:</strong> Click <strong>Detect Tailscale</strong></li></ul><h3 id="if-tailscale-helper-doesn%E2%80%99t-work">If Tailscale Helper Doesn’t Work</h3><p>You can also start the daemon yourself from a shell. <strong>CodexMonitor shows you the exact command to use, so don’t copy the generic example below as-is</strong>; it won’t include your macOS username, data path, or IP:port. <em>Replace&nbsp;</em><code>100.x.y.z:4735</code><em>&nbsp;with your Tailscale IP and the port you set in Step 1.</em></p><pre><code class="language-sh">'/Applications/CodexMonitor.app/Contents/MacOS/codex_monitor_daemon' \
  --listen '100.x.y.z:4735' \
  --data-dir '/Users/username/Library/Application Support/com.dimillian.codexmonitor' \
  --token '12345'</code></pre><p>Expected output:</p><pre><code class="language-bash">codex-monitor-daemon listening on 100.x.y.z:4735 (data dir: /Users/username/Library/Application Support/com.dimillian.codexmonitor)</code></pre><hr><h3 id="3-build-and-install-the-ios-app-on-your-device-local-signing">3. Build and install the iOS app on your device (local signing)</h3><p>Clone the repo and install dependencies:</p><pre><code class="language-bash">git clone https://github.com/Dimillian/CodexMonitor.git
cd CodexMonitor
npm install</code></pre><p>List devices to confirm your iPhone is visible:</p><pre><code class="language-bash">./scripts/build_run_ios_device.sh --list-devices</code></pre><p>Then build + install to your iPhone (you need an Apple Team ID that’s available in Xcode on this Mac):</p><pre><code class="language-bash">./scripts/build_run_ios_device.sh --device "iPhone" --team "&lt;YOUR_TEAM_ID&gt;"</code></pre><p><strong>If signing isn’t set up yet:</strong></p><p>The script supports opening the generated Xcode project so you can do one-time signing setup:</p><pre><code class="language-bash">./scripts/build_run_ios_device.sh --open-xcode --team "&lt;YOUR_TEAM_ID&gt;"</code></pre><p>In Xcode, add your Apple account and choose the correct team for signing, then re-run the device script.</p><hr><h3 id="4-what-codex-actually-did-to-get-the-ios-build-working">4. What Codex actually did to get the iOS build working</h3><p>I asked Codex to run the repo’s iOS device script end-to-end and fix whatever broke. The flow looked like this:</p><ul><li>Ran <code>./scripts/build_run_ios_device.sh --help</code></li><li>Listed devices with <code>--list-devices</code></li><li>First build failed due to a missing signing team</li><li>Installed missing JS deps via <code>npm install</code></li><li>Hit <code>xcodebuild</code> exit code 65 (signing/provisioning issues on that Mac)</li><li>Checked available signing identities (<code>security find-identity -v -p codesigning</code>)</li><li>Re-ran with a local team, still blocked until Xcode account setup was correct</li><li>Eventually isolated a linker failure related to Swift compatibility libs being searched under a stale toolchain path</li><li>Patched the generated Xcode project’s <code>LIBRARY_SEARCH_PATHS</code> to include stable Xcode Swift library directories as a fallback</li><li>After that, the iOS build succeeded and the app installed to my iPhone</li></ul><p>One important gotcha: auto-launch via CLI fails if your iPhone is locked. 
If that happens, unlock the phone and re-run the launch command:</p><pre><code class="language-bash">xcrun devicectl device process launch --device "iPhone" --terminate-existing com.dimillian.codexmonitor</code></pre><hr><h3 id="5-configure-the-ios-app-to-talk-to-the-desktop-daemon">5. Configure the iOS app to talk to the desktop daemon</h3><p>On iOS, enter exactly the same values as on the desktop (the connect overlays can get in the way; ignore them and tap the fields):</p><ul><li>Host/IP: <code>&lt;TAILSCALE_IP&gt;</code></li><li>Port: (e.g. <code>4735</code>)</li><li>Token: the same token you set on desktop</li></ul><p>Then connect.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/02/image-4.png" class="kg-image" alt="" loading="lazy" width="1179" height="2556" srcset="https://danielraffel.me/content/images/size/w600/2026/02/image-4.png 600w, https://danielraffel.me/content/images/size/w1000/2026/02/image-4.png 1000w, https://danielraffel.me/content/images/2026/02/image-4.png 1179w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">CodexMonitor running natively on iOS connected to CodexMonitor desktop client over Tailscale</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ A Hidden Character Can Hide Your Codex Sessions ]]></title>
        <description><![CDATA[ Today I learned that a single weird Unicode line separator (U+2028) can make Codex.app skip loading session history. I quit and restarted the app and all my sessions vanished, which was alarming. Here&#39;s how I fixed it. ]]></description>
        <link>https://danielraffel.me/til/2026/02/04/a-hidden-character-can-hide-your-codex-sessions/</link>
        <guid isPermaLink="false">6983969e0487848cb2d07f44</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 04 Feb 2026 11:06:54 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/02/74BE3D65-7DA4-4DF8-BD85-93CC32FAA9CF.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Today I learned that a single weird Unicode line separator (U+2028) can make <a href="https://chatgpt.com/codex?ref=danielraffel.me" rel="noreferrer">Codex.app</a> skip loading session history. I quit and restarted the app and all my sessions vanished, which was alarming.</p><p>The fix was simple: remove the offending U+2028 characters from the sessions JSONL file. After that, I quit and restarted again and everything came back. Clearly that one character was the culprit.</p><h3 id="what-to-ask-codex">What to ask Codex</h3><ul><li>Search the Codex sessions folder<sup>1</sup> for <code>U+2028</code>.</li><li>Show the exact lines where it appears.</li><li>Remove those characters from the affected session file(s).</li><li>Verify the characters are gone.</li><li>Quit and restart Codex.app to confirm sessions load again.</li></ul><h3 id="fwiw-youll-probably-never-hit-this-issue">FWIW, you'll probably never hit this issue</h3><figure class="kg-card kg-embed-card"><blockquote class="twitter-tweet"><p lang="en" dir="ltr">this is in the next release, you just helped speed up the testing 🙏</p>— Andrew Ambrosino (@ajambrosino) <a href="https://twitter.com/ajambrosino/status/2019122845515985005?ref_src=twsrc%5Etfw&ref=danielraffel.me">February 4, 2026</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script></figure><hr><p>1. You’ll want to search the Codex sessions folder in your home directory. On macOS, the path is <code>~/.codex/sessions</code>. Ask Codex to scan that folder for hidden Unicode line separators and clean them up.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Got Claude and Codex to Consult Each Other via Skills (and When I Use Them) ]]></title>
        <description><![CDATA[ Up until recently, getting Claude to collaborate with other agents was…functional, but clunky. Then I started using a skill that lets me call OpenAI Codex from Claude with /codex. ]]></description>
        <link>https://danielraffel.me/til/2026/02/03/how-i-got-claude-to-consult-codex-and-when-i-use-it/</link>
        <guid isPermaLink="false">69813788c699cd035650e892</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 02 Feb 2026 16:11:03 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/02/4add4194144850e37c51052b4653106e7f5c70834d20367bb77b39bb98cde4c0.jpg" medium="image"/>
        <content:encoded><![CDATA[ <blockquote><strong>TL;DR</strong> If you want&nbsp;<strong>Claude to get a second opinion from Codex and Codex to get a second opinion from Claude</strong>, these skills make it easy.<br><br>- <a href="https://github.com/sundial-org/skills/tree/main/skills/codex?ref=danielraffel.me" rel="noreferrer">Install the /codex skill</a> inside Claude → enables <code>/codex</code> in Claude<br>- <a href="https://github.com/danielraffel/generous-corp-marketplace/tree/cd7d30498caa7114036e18687243fde4ff081083/skills/claude?ref=danielraffel.me" rel="noreferrer">Install the /claude skill</a> inside Codex → enables <code>/claude</code> in Codex<br><br>Unless you use a shared top-level skills folder, put each skill in the correct <code>skills/</code> directory for its tool, restart, and run <code>/skills</code> to confirm.</blockquote><hr><h1 id="what-this-solves">What This Solves</h1><p>The goal is simple: get a second opinion without ceremony.</p><p>No artifact juggling.<br>No copying files back and forth.<br>No breaking flow.</p><p>Just use a skill, get a response, continue.</p><p>One agent stays in charge. The other is consulted. The lead integrates the result.</p><p><em>Example real-world prompt in Claude:</em></p><pre><code class="language-markdown">Please review the original source using RepoPrompt to ground the analysis in the actual codebase and consult Apple documentation via Sosumi MCP. After forming a plan, review it with /codex, ask it to also use RepoPrompt to ground the analysis and consult Apple documentation via Sosumi MCP, refine it, and produce an updated v2 document with revised phases and align on next steps.</code></pre><hr><h1 id="how-it-works">How It Works</h1><p>Two skills. Two agents.</p><ul><li>Claude can call <code>/codex</code></li><li>Codex can call <code>/claude</code></li></ul><p>Whichever agent you’re actively using remains primary. The other is consulted for a scoped task and returns output. 
You decide how to use that response in your prompt. <em>I usually have them consult and align before moving forward.</em></p><hr><h2 id="setup">Setup</h2><p>The following approaches use sparse checkouts so you only download the exact skill folder you need rather than the entire repo.</p><h2 id="install-codex-in-claude">Install <code>/codex</code> in Claude</h2><h3 id="1-pull-only-the-codex-skill">1. Pull only the <code>codex</code> skill</h3><pre><code class="language-bash">git clone --depth 1 --filter=blob:none --sparse https://github.com/danielraffel/generous-corp-marketplace.git
cd generous-corp-marketplace
git sparse-checkout set skills/codex</code></pre><h3 id="2-create-claude%E2%80%99s-skills-directory-if-missing"><strong>2. Create Claude’s skills directory (if missing)</strong></h3><pre><code class="language-bash">mkdir -p ~/.claude/skills</code></pre><h3 id="3-move-the-skill-into-place"><strong>3. Move the skill into place</strong></h3><pre><code class="language-bash">mv skills/codex ~/.claude/skills/</code></pre><h3 id="4-restart-claude"><strong>4. Restart Claude</strong></h3><p>Verify installation:</p><pre><code class="language-bash">/skills</code></pre><p>You should now see&nbsp;/codex.</p><hr><h2 id="install-claude-in-codex">Install <code>/claude</code> in Codex</h2><h3 id="1-pull-only-the-claude-skill-at-the-pinned-commit"><strong>1. Pull only the&nbsp;claude skill at the pinned commit</strong></h3><pre><code class="language-bash">git clone --filter=blob:none --sparse https://github.com/danielraffel/generous-corp-marketplace.git
cd generous-corp-marketplace
git sparse-checkout set skills/claude
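# check out the commit pinned in the install link above (a detached HEAD here is expected)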
git checkout cd7d30498caa7114036e18687243fde4ff081083</code></pre><h3 id="2-create-codex%E2%80%99s-skills-directory-if-missing"><strong>2. Create Codex’s skills directory (if missing)</strong></h3><pre><code class="language-bash">mkdir -p ~/.codex/skills</code></pre><h3 id="3-move-the-skill-into-place-1"><strong>3. Move the skill into place</strong></h3><pre><code class="language-bash">mv skills/claude ~/.codex/skills/</code></pre><h3 id="4-restart-codex"><strong>4. Restart Codex</strong></h3><p>Verify installation:</p><pre><code class="language-bash">/skills</code></pre><p>You should now see&nbsp;/claude.</p><hr><h2 id="final-folder-structure"><strong>Final Folder Structure</strong></h2><p>Claude:</p><pre><code class="language-bash">~/.claude/skills/codex/
  agents/
  SKILL.md</code></pre><p>Codex:</p><pre><code class="language-bash">~/.codex/skills/claude/
  references/
  SKILL.md</code></pre><hr><h2 id="what-you-now-have"><strong>What You Now Have</strong></h2><ul><li>Claude can consult Codex with&nbsp;<code>/codex</code></li><li>Codex can consult Claude with&nbsp;<code>/claude</code></li></ul><p>Clean second opinions. No artifact passing. No context juggling.</p><hr><h2 id="ralph-loop-parallel-optimization">Ralph Loop / Parallel Optimization</h2><p>If you’re running a ralph-loop-style workflow or keeping a CLAUDE.md, you can explicitly allow delegation for parallelizable tasks.</p><p>I stumbled on a suggestion to use this in a <code>ralph-loop</code> or a <code>CLAUDE.md</code>, along with this snippet that leverages the <code>/codex</code> skill:</p><pre><code class="language-md">CODEX DELEGATION (OPTIONAL):
- Use /codex &lt;task&gt; or codex exec --full-auto &lt;task&gt; for parallel work.
- Good candidates for Codex delegation:
  - Writing tests for code you just wrote
  - Implementing a component while you work on another
  - Code review of completed work items
- Do NOT delegate to Codex when:
  - The task depends on something you're currently building
  - Multiple agents would edit the same file
  - The task requires your current conversation context
- When delegating, run Codex in background and continue your work.
- Check Codex output before marking work item complete.</code></pre><h3 id="working-ralph-loop-output">Working Ralph Loop Output</h3><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2026/02/image-2.png" class="kg-image" alt="" loading="lazy" width="2000" height="792" srcset="https://danielraffel.me/content/images/size/w600/2026/02/image-2.png 600w, https://danielraffel.me/content/images/size/w1000/2026/02/image-2.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/02/image-2.png 1600w, https://danielraffel.me/content/images/2026/02/image-2.png 2358w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ HomeBar: Control Your Smart Home from the Mac Menu Bar ]]></title>
        <description><![CDATA[ HomeBar sits in your menu bar and lets you toggle lights, switches, and other HomeKit accessories with a single click. No need to open the Home app or pull out your phone. ]]></description>
        <link>https://danielraffel.me/2026/01/15/homebar-control-your-smart-home-from-the-mac-menu-bar/</link>
        <guid isPermaLink="false">696866f8e67ee14b98fcf49f</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 14 Jan 2026 20:10:43 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/01/9DA4824A-79BD-4277-82D0-37C80CBF9969.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I used <a href="https://www.generouscorp.com/Chainer/?ref=danielraffel.me" rel="noreferrer">Chainer</a> to build a small utility for anyone who uses HomeKit and wants quicker access to their devices on Mac.</p><p><a href="https://www.generouscorp.com/homebar/?ref=danielraffel.me" rel="noreferrer">HomeBar</a> sits in your menu bar and lets you toggle lights, switches, and other HomeKit accessories with a single click. No need to open the Home app or pull out your phone.</p><p>A few things it does:</p><ul><li>Menu bar access to all your HomeKit devices</li><li>Apple Shortcuts integration — run your shortcuts right from the menu</li><li>Keyboard shortcuts — assign hotkeys to any device</li><li>Sensor readings — see temperature, humidity, and other sensor data at a glance</li><li>Dark mode support — matches your system appearance</li></ul><p>It's free and <a href="https://github.com/danielraffel/homebar?ref=danielraffel.me" rel="noreferrer">open source</a>—<a href="https://www.generouscorp.com/homebar/?ref=danielraffel.me" rel="noreferrer">available for download</a> if you wanna try it out.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/Screenshot-2026-01-18-at-7.56.08---PM.png" class="kg-image" alt="" loading="lazy" width="580" height="756"><figcaption><span style="white-space: pre-wrap;">HomeBar in the Menu Bar</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ A Tool for Generating HTML Embeds for asciicinema ]]></title>
        <description><![CDATA[ Today I wanted to share a screen capture of a terminal project that I’ve been working on, and I wanted something better than a blurry screen recording. I ended up building a tool to set the start/end times and generate the code injection snippets for Ghost. ]]></description>
        <link>https://danielraffel.me/2026/01/07/a-tool-for-generating-html-embeds-for-asciicinema/</link>
        <guid isPermaLink="false">695d97a5881b3f036108c579</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 06 Jan 2026 22:19:48 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/01/43E1C06F-7E69-452F-8E6F-5BBF7296B0B6.png" medium="image"/>
        <content:encoded><![CDATA[ <blockquote><strong>TL;DR:</strong>&nbsp;Embedding&nbsp;<em>trimmed</em>&nbsp;<a href="https://asciinema.org/?ref=danielraffel.me" rel="noreferrer"><strong>asciinema</strong></a> recordings in Ghost was a little too manual, so I built a <a href="https://www.generouscorp.com/asciicinema-ghost/?ref=danielraffel.me" rel="noreferrer">tiny tool</a> that lets you set start/end clip times, pick a&nbsp;<strong>cover image</strong>&nbsp;(shown on the player before playback), choose a&nbsp;<strong>fallback image</strong>&nbsp;(for email/RSS clients that can’t run JS), and then generates the exact&nbsp;<strong>Ghost code injection + HTML card snippet</strong>&nbsp;you can paste in.</blockquote><h3 id="why-asciinema-instead-of-video">Why asciinema (instead of video)?</h3><p>I wanted to share a screen capture of a terminal project I’ve been working on—and I wanted something better than a blurry screen recording.</p><p><a href="https://asciinema.org/?ref=danielraffel.me" rel="noreferrer"><strong>asciinema</strong></a>&nbsp;records terminal sessions as lightweight text-based “cast” files (<code>.cast</code>) that you can replay and embed on the web. 
Because it’s text (not pixels):</p><ul><li>files stay small</li><li>rendering stays crisp at any resolution</li><li>viewers can copy/paste commands directly from playback</li></ul><p>What I liked most: I can record once, then re-embed the same session at different sizes in docs, blog posts, and READMEs without re-encoding video.</p><hr><h2 id="the-basic-workflow">The basic workflow</h2><ol><li>Record a terminal session with asciinema</li><li>Host the&nbsp;<code>.cast</code>&nbsp;somewhere public (I used GitHub)</li><li>Embed it in Ghost using the asciinema player</li><li>Trim to “just the good bit” (optional, but usually what you want)</li><li>Add a&nbsp;<strong>cover image</strong>&nbsp;for the web player and a&nbsp;<strong>fallback image</strong>&nbsp;for email/RSS</li></ol><hr><h2 id="install-asciinema">Install asciinema</h2><p>Install it however you like—package manager is easiest:</p><ul><li>macOS:&nbsp;<code>brew install asciinema</code></li><li>Debian/Ubuntu:&nbsp;<code>sudo apt install asciinema</code></li><li>Arch:&nbsp;<code>sudo pacman -S asciinema</code></li></ul><hr><h2 id="record-a-terminal-session">Record a terminal session</h2><p>Start recording:</p><pre><code class="language-bash">asciinema rec demo.cast
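# optional flags (from the asciinema CLI; verify with `asciinema rec -h`):
#   -i 2 caps idle pauses at 2 seconds, -t sets a recording title
# asciinema rec -i 2 -t "demo" demo.cast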
</code></pre><p>Stop by exiting your shell (<code>Ctrl+D</code>) or typing&nbsp;<code>exit</code>.</p><p>Replay locally to sanity-check:</p><pre><code class="language-bash">asciinema play demo.cast
</code></pre><hr><h2 id="host-your-cast-files">Host your&nbsp;<code>.cast</code>&nbsp;files</h2><p>You have two options:</p><ul><li>Upload to asciinema’s hosted service (<code>asciinema upload …</code>)</li><li>Self-host&nbsp;<code>.cast</code>&nbsp;anywhere public (S3, static hosting, GitHub, etc.)</li></ul><p>I went with: commit&nbsp;<code>.cast</code>&nbsp;files to a GitHub repo, then use GitHub&nbsp;<strong>raw URLs</strong>&nbsp;for embeds. That gives Ghost (and the player) a URL it can fetch directly.</p><hr><h2 id="embedding-in-ghost">Embedding in Ghost</h2><p>To embed a&nbsp;<code>.cast</code>&nbsp;on your site, you include&nbsp;<strong>asciinema player</strong>&nbsp;(CSS + JS) and point it at your&nbsp;<code>.cast</code>&nbsp;URL.</p><p>In Ghost that usually means:</p><ul><li><strong>Header injection:</strong>&nbsp;include player CSS</li><li><strong>Footer injection:</strong>&nbsp;include player JS (and init code)</li><li>In the post body: an&nbsp;<strong>HTML card</strong>&nbsp;containing a container element for the player</li></ul><p>One practical detail: if you send Ghost posts as newsletters (or rely on RSS), JavaScript often won’t run in those clients. I learned this the hard way. You’ll want two “previews”:</p><ul><li><strong>Cover image (web):</strong>&nbsp;shown on top of the player before someone hits play</li><li><strong>Fallback image (email/RSS):</strong>&nbsp;a plain&nbsp;<code>&lt;img&gt;</code>&nbsp;that still looks good even when scripts are stripped</li></ul><hr><h2 id="trimming-to-%E2%80%9Cjust-the-good-bit%E2%80%9D">Trimming to “just the good bit”</h2><p>Usually you don’t want the full unedited recording—you want the segment that demonstrates the feature. 
Two approaches:</p><h3 id="option-a-start-mid-way-easy">Option A: Start mid-way (easy)</h3><p>The player supports&nbsp;<code>startAt</code>, so you can begin playback at a specific timestamp.</p><h3 id="option-b-true-trimming-precise-but-fiddly">Option B: True trimming (precise, but fiddly)</h3><p>A&nbsp;<code>.cast</code>&nbsp;is newline-delimited JSON: one header line, then timestamped events. If you cut time ranges out, you generally need to adjust timestamps to keep playback consistent.</p><hr><h2 id="what-i-was-doing-manually-and-got-tired-of">What I was doing manually (and got tired of)</h2><p>My first pass looked like:</p><ul><li>upload&nbsp;<code>.cast</code>&nbsp;files to GitHub</li><li>grab raw URLs</li><li>hunt for start/end times</li><li>hand-assemble:<ul><li>Ghost header/footer injections</li><li>the per-clip embed HTML</li><li>a nice preview image</li><li>an email-safe fallback</li></ul></li></ul><p>It worked, but it was repetitive enough that I didn’t want to ever do that again.</p><hr><h2 id="the-tiny-tool-i-built-asciinema-ghost">The tiny tool I built:&nbsp;<code>asciinema-ghost</code></h2><p>So I built a simple single-file tool that does the “scrub → clip → embed” loop for me.</p><p>It lets you:</p><ul><li>Load one or more&nbsp;<code>.cast</code>&nbsp;files (drag/drop or URL)</li><li>Scrub the timeline and stamp the current playhead into&nbsp;<strong>Start / End</strong></li><li>Preview the selection, loop it, and use markers</li><li>Pick a frame as a&nbsp;<strong>cover image</strong>&nbsp;(web player poster)</li><li>Set a&nbsp;<strong>fallback image</strong>&nbsp;for email/RSS</li><li>Generate copy/paste-ready&nbsp;<strong>Ghost snippets</strong>&nbsp;(header injection, footer injection, and HTML-card embeds)</li><li>Save state locally + export/import a project JSON</li></ul><p>Local files are great for previewing, but final embeds want publicly hosted&nbsp;<code>.cast</code>&nbsp;URLs (GitHub raw URLs work well). 
Also: when running locally, browsers can be weird about&nbsp;<code>file://</code>&nbsp;+ fetching assets, so using a tiny local server makes things more consistent.</p><p><strong>Web app:</strong>&nbsp;<a href="https://www.generouscorp.com/asciicinema-ghost/?ref=danielraffel.me">https://www.generouscorp.com/asciicinema-ghost/</a><br><strong>Source repo:</strong>&nbsp;<a href="https://github.com/danielraffel/asciicinema-ghost?ref=danielraffel.me">https://github.com/danielraffel/asciicinema-ghost</a></p><figure class="kg-card kg-image-card kg-card-hascaption"><a href="https://github.com/danielraffel/asciicinema-ghost?ref=danielraffel.me"><img src="https://danielraffel.me/content/images/2026/01/image-9.png" class="kg-image" alt="" loading="lazy" width="2000" height="1307" srcset="https://danielraffel.me/content/images/size/w600/2026/01/image-9.png 600w, https://danielraffel.me/content/images/size/w1000/2026/01/image-9.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/01/image-9.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/01/image-9.png 2400w" sizes="(min-width: 720px) 720px"></a><figcaption><a href="https://github.com/danielraffel/asciicinema-ghost?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">asciicinema-ghost code snippet generator</span></a></figcaption></figure><hr><h2 id="examples-in-the-wild">Examples in the wild</h2><p><a href="https://danielraffel.me/2026/01/06/woplugins-for-claude-code/" rel="noreferrer">Here’s a post</a> where I’m using this approach for embedded terminal demos.</p><hr><h2 id="takeaways">Takeaways</h2><p>If you want terminal demos that are crisp, lightweight, copy/paste-able, and easy to embed, asciinema is a great default.</p><p>And if you’re on Ghost (or anything that supports code injection), “self-host casts + embed with asciinema player” is straightforward—especially once you have a repeatable way to clip segments, pick a cover image, and include an email-safe 
fallback.</p><hr><h3 id="structure-what%E2%80%99s-happening">Structure (what’s happening)</h3><p>For each embed you want:</p><ul><li><strong>HTML card</strong>&nbsp;includes a container&nbsp;<code>&lt;div&gt;</code>&nbsp;with:<ul><li>an&nbsp;<code>&lt;img&gt;</code>&nbsp;=&nbsp;<strong>fallback</strong>&nbsp;(works in email/RSS/no-JS)</li><li>a child&nbsp;<code>&lt;div&gt;</code>&nbsp;where the&nbsp;<strong>web player mounts</strong></li></ul></li><li><strong>Header injection</strong>&nbsp;loads the player CSS</li><li><strong>Footer injection</strong>:<ul><li>mounts&nbsp;<code>AsciinemaPlayer</code>&nbsp;into the mount div</li><li>hides the fallback&nbsp;<code>&lt;img&gt;</code>&nbsp;once the player is ready</li><li>overlays a&nbsp;<strong>cover image</strong>&nbsp;(poster) on the player until playback starts</li></ul></li></ul><hr><h2 id="post-content-html-card">Post Content (HTML card)</h2><pre><code class="language-html">&lt;div id="cast-f60227d3" class="cast-embed"&gt;
  &lt;!-- Fallback image (email/RSS/no-JS) --&gt;
  &lt;img
    class="cast-fallback"
    src="https://raw.githubusercontent.com/danielraffel/asciicinema-output/refs/heads/main/worktree-manager4.png"
    alt="worktree-manager4.cast"
    style="max-width: 100%; height: auto; border: 1px solid #ddd; border-radius: 4px;"
  /&gt;

  &lt;!-- Web player mounts here --&gt;
  &lt;div class="cast-player"&gt;&lt;/div&gt;
&lt;/div&gt;

&lt;div style="height:16px"&gt;&lt;/div&gt;

&lt;div id="cast-899ead17" class="cast-embed"&gt;
  &lt;!-- Fallback image (email/RSS/no-JS) --&gt;
  &lt;img
    class="cast-fallback"
    src="https://raw.githubusercontent.com/danielraffel/asciicinema-output/refs/heads/main/worktree-manager6.png"
    alt="worktree-manager6.cast"
    style="max-width: 100%; height: auto; border: 1px solid #ddd; border-radius: 4px;"
  /&gt;

  &lt;!-- Web player mounts here --&gt;
  &lt;div class="cast-player"&gt;&lt;/div&gt;
&lt;/div&gt;
</code></pre><hr><h2 id="post-header-injection">Post Header Injection</h2><pre><code class="language-html">&lt;link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/asciinema-player@3.14.0/dist/bundle/asciinema-player.css"&gt;
</code></pre><hr><h2 id="post-footer-injection">Post Footer Injection</h2><pre><code class="language-html">&lt;script src="https://cdn.jsdelivr.net/npm/asciinema-player@3.14.0/dist/bundle/asciinema-player.min.js"&gt;&lt;/script&gt;
&lt;script&gt;
(function () {
  if (!window.AsciinemaPlayer) return;

  function ensurePoster(rootEl, posterUrl) {
    if (!posterUrl) return null;

    var playerEl = rootEl.querySelector(".ap-player");
    if (!playerEl) return null;

    var posterEl = playerEl.querySelector(".ap-preview-poster");
    if (!posterEl) {
      posterEl = document.createElement("img");
      posterEl.className = "ap-preview-poster";
      posterEl.alt = "Cover image";
      posterEl.style.position = "absolute";
      posterEl.style.inset = "0";
      posterEl.style.width = "100%";
      posterEl.style.height = "100%";
      posterEl.style.objectFit = "cover";
      posterEl.style.zIndex = "2";
      posterEl.style.pointerEvents = "none";
      playerEl.appendChild(posterEl);
    }
    posterEl.src = posterUrl;
    return posterEl;
  }

  function clearPoster(rootEl) {
    var posterEl = rootEl.querySelector(".ap-preview-poster");
    if (posterEl) posterEl.remove();
  }

  function initCast(rootId, castUrl, startAt, endAt, posterUrl) {
    var rootEl = document.getElementById(rootId);
    if (!rootEl) return;

    var fallbackImg = rootEl.querySelector(".cast-fallback");
    var mountEl = rootEl.querySelector(".cast-player");
    if (!mountEl) return;

    var opts = {
      startAt: startAt,
      controls: "auto",
      markers: [[startAt, "Start"], [endAt, "End"]]
    };

    var player = AsciinemaPlayer.create(castUrl, mountEl, opts);

    // When the player is ready on the web: hide fallback + show cover overlay
    var onReady = function () {
      if (fallbackImg) fallbackImg.style.display = "none";
      ensurePoster(rootEl, posterUrl || (fallbackImg &amp;&amp; fallbackImg.src));
    };
    player.addEventListener("ready", onReady);

    // Remove cover once playback starts
    var onPlay = function () { clearPoster(rootEl); };
    player.addEventListener("play", onPlay);
    player.addEventListener("playing", onPlay);
  }

  initCast(
    "cast-f60227d3",
    "https://raw.githubusercontent.com/danielraffel/asciicinema-output/refs/heads/main/worktree-manager4.cast",
    9.56,
    36.84,
    "https://raw.githubusercontent.com/danielraffel/asciicinema-output/refs/heads/main/worktree-manager4.png"
  );

  initCast(
    "cast-899ead17",
    "https://raw.githubusercontent.com/danielraffel/asciicinema-output/refs/heads/main/worktree-manager6.cast",
    7.40,
    576.31,
    "https://raw.githubusercontent.com/danielraffel/asciicinema-output/refs/heads/main/worktree-manager6.png"
  );
})();
&lt;/script&gt;
</code></pre> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Learning Claude Code Plugins by Building Two ]]></title>
        <description><![CDATA[ I wanted to learn how Claude Code plugins work, so I built a couple with plugin-dev: Worktree-Manager for easier git worktrees, and Chainer, which interprets what you want to build and picks + runs the right plugin(s). ]]></description>
        <link>https://danielraffel.me/2026/01/06/learning-claude-code-plugins-by-building-two/</link>
        <guid isPermaLink="false">695d43f963252a035bc8b737</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 06 Jan 2026 12:06:26 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/01/62418973-2869-4469-BDCD-5341FF290421.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I've been wanting to learn how Claude Code plugins work, so I decided to build a couple using <a href="https://github.com/anthropics/claude-code/tree/main/plugins/plugin-dev?ref=danielraffel.me" rel="noreferrer">plugin-dev</a>.</p><h2 id="how-to-install-in-claude-code">How To Install (in Claude Code)</h2><pre><code class="language-bash"># 1. Add the marketplace
/plugin marketplace add danielraffel/worktree-manager

# 2. Install Worktree-Manager
/plugin install worktree-manager@generous-corp-marketplace

# 3. Install Chainer
/plugin install chainer@generous-corp-marketplace

# 4. Restart Claude Code
# Quit and reopen Claude Code to load the plugin(s)</code></pre><h2 id="worktree-manager">Worktree-Manager</h2><p>The first plugin I built is called <a href="https://www.generouscorp.com/worktree-manager/?ref=danielraffel.me" rel="noreferrer">Worktree-Manager</a>. It’s designed to make creating and managing <a href="https://git-scm.com/docs/git-worktree?ref=danielraffel.me" rel="noreferrer">git worktrees</a> easier—handy when you’re building multiple features in parallel with agents and want to avoid stepping on each other’s changes. <a href="https://www.generouscorp.com/worktree-manager/?ref=danielraffel.me#commands" rel="noreferrer">Example Commands</a>.</p><p>Here’s a quick demo of it in action.</p>
<!--kg-card-begin: html-->
<div id="cast-f60227d3">

  <img src="https://raw.githubusercontent.com/danielraffel/asciicinema-output/refs/heads/main/worktree-manager4.png" alt="worktree-manager4.cast" style="max-width: 100%; height: auto; border: 1px solid #ddd; border-radius: 4px;">

</div>
<!--kg-card-end: html-->
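<p>Under the hood, a worktree-per-feature setup boils down to a handful of standard git commands; here's a rough manual sketch of the kind of thing the plugin automates (the throwaway repo path and the <code>feature-x</code> branch name are hypothetical, and this is not the plugin's actual code):</p>

```shell
# Manual worktree-per-feature setup (illustrative sketch, not the plugin's code).
set -e
repo="$(mktemp -d)"                      # throwaway repo for the example
git init -q -b main "$repo"
git -C "$repo" -c user.email=demo@example.com -c user.name=demo \
  commit -q --allow-empty -m "init"

# Check out a new branch in a sibling directory so an agent can work there
# without touching the main checkout.
git -C "$repo" worktree add -q "$repo-feature-x" -b feature-x

git -C "$repo" worktree list             # lists the main checkout and feature-x
```

<p>Cleanup is the part that's easy to forget by hand (<code>git worktree remove</code>, plus deleting the branch), which is exactly the bookkeeping a plugin like this can wrap behind slash commands.</p>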
<h2 id="chainer">Chainer</h2><p>The second plugin I built is called <a href="https://www.generouscorp.com/Chainer/?ref=danielraffel.me" rel="noreferrer">Chainer</a>. You tell Claude Code what you want to build, and Chainer figures out which plugin(s) to use and runs them in the right order—end to end. Instead of manually picking tools and stitching steps together, you just describe the goal in natural language and Chainer orchestrates the workflow for you. <a href="https://www.generouscorp.com/Chainer/?ref=danielraffel.me#chains" rel="noreferrer">Example Commands</a>.</p><p>In the ~10-minute demo below, I gave it a prompt, and Chainer chose&nbsp;the <a href="https://github.com/anthropics/claude-plugins-official/tree/main/plugins/frontend-design?ref=danielraffel.me" rel="noreferrer">frontend-design</a> plugin&nbsp;and completed the task end-to-end.</p><pre><code class="language-bash">Build a Galaxian-inspired arcade shooter as a single, self-contained index.html (everything inline: HTML/CSS/JS and any shaders), with no server, no build tools, no external libraries, and no external assets—it must run from file:// and on static hosting. Design, implement and build modern arcade features (beyond the original) that add real depth and replayability. Output the complete index.html when you’re done.</code></pre>
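<p>Conceptually, that "figure out which plugin(s) and run them" step is classify-then-dispatch. Here's a deliberately toy shell sketch of the idea (the keyword rules and plugin names are hypothetical; Chainer's actual selection is done by the model interpreting the request, not by pattern matching):</p>

```shell
# Toy goal-to-plugin router (illustration only, not Chainer's implementation).
route_goal() {
  case "$1" in
    *arcade*|*game*|*frontend*) echo "frontend-design" ;;
    *worktree*|*branch*)        echo "worktree-manager" ;;
    *)                          echo "plugin-dev" ;;
  esac
}

route_goal "Build a Galaxian-inspired arcade shooter"   # prints: frontend-design
```

<p>What Chainer adds beyond the dispatch itself is sequencing: when a goal needs several plugins, it runs them in the right order end to end.</p>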
<!--kg-card-begin: html-->
<div id="cast-899ead17">

  <img src="https://raw.githubusercontent.com/danielraffel/asciicinema-output/refs/heads/main/worktree-manager6.png" alt="worktree-manager6.cast" style="max-width: 100%; height: auto; border: 1px solid #ddd; border-radius: 4px;">

</div>
<!--kg-card-end: html-->
<h2 id="play-the-game-claude-built">Play the game Claude built </h2><p><a href="https://www.generouscorp.com/voidhunters/?ref=danielraffel.me" rel="noreferrer">Void Hunters</a></p><p>There are probably better plugins for each job, but building these was a great way to learn how Claude Code plugins—and Claude Code itself—work under the hood. I especially enjoyed digging into&nbsp;<code>AskUserQuestion</code>, which turns a wall of text into a guided, multi-step flow.</p><p>What an inspiring time to make digital artifacts.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ I built a VS Code / Cursor extension to open Agents in the Editor Area ]]></title>
        <description><![CDATA[ I wanted agent terminals to live next to my files, not in a pane. So I built a small extension using Codex that adds status bar buttons (one per agent) to open terminals directly in the editor area. ]]></description>
        <link>https://danielraffel.me/2026/01/05/i-built-a-vs-code-cursor-extension-to-open-agents-in-the-editor-area/</link>
        <guid isPermaLink="false">695bf5df5c5e7f035ef5d26e</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 05 Jan 2026 12:17:19 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2026/01/41B67DD0-EB90-4BCF-AD16-1F90EE184D87.png" medium="image"/>
        <content:encoded><![CDATA[ <blockquote><strong>TL;DR:</strong> I wanted agent terminals to live in my IDE in tabs in the editor area next to my files, not  in a pane. So I built a small VS Code extension called <strong>“Commands: Open Terminal in Editor”</strong> that adds status bar buttons (one per agent) to open terminals directly in the editor area. Available on the <a href="https://marketplace.visualstudio.com/items?itemName=GenerousCorp.commands-open-terminal-in-editor&ref=danielraffel.me" rel="noreferrer">VS Code Marketplace</a> and <a href="https://open-vsx.org/extension/GenerousCorp/commands-open-terminal-in-editor?ref=danielraffel.me" rel="noreferrer">Open VSX</a>.</blockquote><p>Over the weekend I was gifted an OpenAI <a href="https://chatgpt.com/?ref=danielraffel.me#pricing" rel="noreferrer">ChatGPT Pro</a> account and decided to use it the only way I know how: by immediately building something using <a href="https://openai.com/codex/?ref=danielraffel.me" rel="noreferrer">Codex</a> that I’ve been annoyed by for way too long.</p><h2 id="the-problem">The problem</h2><p>When I’m working in <a href="https://cursor.com/?ref=danielraffel.me" rel="noreferrer">Cursor</a> or <a href="https://code.visualstudio.com/?ref=danielraffel.me" rel="noreferrer">VS Code</a> with an agent (<a href="https://developers.openai.com/codex/cli/?ref=danielraffel.me" rel="noreferrer">Codex</a>, <a href="https://code.claude.com/docs/en/setup?ref=danielraffel.me" rel="noreferrer">Claude</a>, <a href="https://github.com/google-gemini/gemini-cli?ref=danielraffel.me" rel="noreferrer">Gemini</a>, etc.), I frequently want a terminal open&nbsp;<em>as a first-class citizen in the editor area</em>—right next to the tabs for the files I’m reading/writing.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/image-4.png" class="kg-image" alt="" loading="lazy" width="2000" height="1307" 
srcset="https://danielraffel.me/content/images/size/w600/2026/01/image-4.png 600w, https://danielraffel.me/content/images/size/w1000/2026/01/image-4.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/01/image-4.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/01/image-4.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Agents tend to instantiate inside a pane and while useful that's not where I want mine to live</span></figcaption></figure><p>Opening an agent in the terminal and moving it manually isn’t a lot of work, but it adds friction that doesn’t need to exist.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/image-6.png" class="kg-image" alt="" loading="lazy" width="2000" height="587" srcset="https://danielraffel.me/content/images/size/w600/2026/01/image-6.png 600w, https://danielraffel.me/content/images/size/w1000/2026/01/image-6.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/01/image-6.png 1600w, https://danielraffel.me/content/images/2026/01/image-6.png 2200w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">The extension aims to replace the manual process of moving a terminal where I want it</span></figcaption></figure><p>Most terminal-focused extensions default to sidebars, which means I’m constantly repositioning the terminals they add. That side-by-side panel layout makes sense when I’m actively co-editing code, but it breaks down when an agent is doing the writing and reviewing for me and I am less "hands on". In those cases, I want the agent’s terminal and the file it’s working on to live as <strong>side-by-side tabs</strong> in the editor area, each with as much screen space as possible.</p><p>To address this I built a tiny extension:&nbsp;<strong>Commands: Open Terminal in Editor</strong>. 
You can find it for free in these marketplaces:</p><ul><li><a href="https://marketplace.visualstudio.com/items?itemName=GenerousCorp.commands-open-terminal-in-editor&ref=danielraffel.me" rel="noreferrer">VS Code Marketplace</a></li><li><a href="https://open-vsx.org/extension/GenerousCorp/commands-open-terminal-in-editor?ref=danielraffel.me" rel="noreferrer">Open VSX</a></li></ul><p><em>Note: The namespace for extensions is quite full and I needed a name quickly. </em>😂</p><h2 id="what-my-extension-does">What my extension does</h2><p>It adds a status bar button that opens a terminal&nbsp;<strong>in the editor area</strong>&nbsp;(instead of a panel / sidebar). You can customize the command, nickname, icon, order, whether it appears in the status bar and if it's enabled.</p><h2 id="the-hard-boundary-and-why-this-isn%E2%80%99t-%E2%80%9Cup-in-the-chrome%E2%80%9D">The hard boundary (and why this isn’t “up in the chrome”)</h2><p>When I started, my intention was to put the button in the same “privileged” place where Claude/Codex-style UI appears—top-right editor chrome / custom toolbar-y areas.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2026/01/chrome.png" class="kg-image" alt="" loading="lazy" width="388" height="414"></figure><p>Unless I misunderstood, that's not possible for normal extensions.</p><p><strong>Most 3rd party extensions cannot inject UI into</strong>:</p><ul><li>the window title bar</li><li>the top-right editor chrome</li><li>overflow menus you don’t own</li><li>macOS traffic-light / native toolbar rows</li></ul><p>This seems to be a deliberate VS Code sandboxing decision, and I found no real workaround besides striking a partnership agreement.</p><h2 id="implementation">Implementation</h2><p>Instead of fighting VS Code’s UI boundaries, I leaned into the surfaces extensions <em>are</em>&nbsp;allowed to use: an <a href="https://code.visualstudio.com/api/ux-guidelines/activity-bar?ref=danielraffel.me" 
rel="noreferrer">Activity Bar</a> item, <a href="https://code.visualstudio.com/api/ux-guidelines/status-bar?ref=danielraffel.me" rel="noreferrer">Status Bar</a> buttons, and a Command in the <a href="https://code.visualstudio.com/api/ux-guidelines/command-palette?ref=danielraffel.me" rel="noreferrer">Command Palette</a> (⌘⇧P / Ctrl+Shift+P).</p><p>The result is a small, configurable system that lets me:</p><ul><li>Define&nbsp;<strong>custom agent commands</strong>&nbsp;(Claude, Codex, Gemini, or anything else)</li><li>Assign each one an&nbsp;<strong>icon</strong></li><li>Choose&nbsp;<strong>which agents appear in the status bar</strong></li><li>Open each agent in a&nbsp;<strong>terminal that lives in the editor area</strong>, side-by-side in a tab with files</li></ul><p>In practice, this gives me a&nbsp;<strong>status bar button per agent</strong>. Clicking one opens a dedicated terminal directly in the editor—no panes, no dragging, no context switching. Below are screenshots.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/image-3.png" class="kg-image" alt="" loading="lazy" width="576" height="282"><figcaption><span style="white-space: pre-wrap;">Launching agents from the activity bar</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/Preset-editor.png" class="kg-image" alt="" loading="lazy" width="2000" height="1310" srcset="https://danielraffel.me/content/images/size/w600/2026/01/Preset-editor.png 600w, https://danielraffel.me/content/images/size/w1000/2026/01/Preset-editor.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/01/Preset-editor.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/01/Preset-editor.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Editing the command preset options</span></figcaption></figure><figure 
class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/image-2.png" class="kg-image" alt="" loading="lazy" width="748" height="44" srcset="https://danielraffel.me/content/images/size/w600/2026/01/image-2.png 600w, https://danielraffel.me/content/images/2026/01/image-2.png 748w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Launching agents from the bottom status bar (most practical launch position)</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/Codex-in-editor-area.png" class="kg-image" alt="" loading="lazy" width="2000" height="1310" srcset="https://danielraffel.me/content/images/size/w600/2026/01/Codex-in-editor-area.png 600w, https://danielraffel.me/content/images/size/w1000/2026/01/Codex-in-editor-area.png 1000w, https://danielraffel.me/content/images/size/w1600/2026/01/Codex-in-editor-area.png 1600w, https://danielraffel.me/content/images/size/w2400/2026/01/Codex-in-editor-area.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Opening terminals as first-class editor tabs not in a pane alongside files 🤗</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/image-5.png" class="kg-image" alt="" loading="lazy" width="492" height="246"><figcaption><span style="white-space: pre-wrap;">Optional: launch agents from the Command Palette via ⌘⇧P / Ctrl+Shift+P</span></figcaption></figure><h2 id="how-i-built-it">How I built it</h2><p>I used Codex as a “pair” for the whole loop:</p><ol><li>Scaffold an extension</li><li>Implement the command to open/move a terminal into the editor area</li><li>Add a status bar item wired to that command</li><li>Iterate until it felt simple + reliable</li><li>Package + publish</li></ol><p>If you’ve never built a VS Code extension: the 
workflow is basically&nbsp;<strong>TypeScript + the </strong><a href="https://code.visualstudio.com/api?ref=danielraffel.me" rel="noreferrer"><strong>VS Code Extension API</strong></a>, compiled into the&nbsp;<code>out/</code>&nbsp;directory, with commands declared in&nbsp;<code>package.json</code>.</p><h2 id="local-build-run-workflow">Local Build &amp; Run Workflow</h2><h3 id="prerequisites">Prerequisites</h3><ul><li>Node.js</li><li>npm</li><li>Marketplace specific CLI tools for publishing (<a href="https://github.com/microsoft/vscode-vsce?ref=danielraffel.me" rel="noreferrer">vsce</a>/<a href="https://github.com/eclipse/openvsx?ref=danielraffel.me" rel="noreferrer">ovsx</a>)</li></ul><h3 id="install-dependencies">Install dependencies</h3><p>From the project root:</p><pre><code class="language-bash">npm install</code></pre><p>Run this again only if dependencies change.</p><p><strong>Compile the project</strong></p><p>To build the extension locally:</p><pre><code class="language-bash">npm run compile</code></pre><p>The extension must compile successfully before it can be run or debugged.</p><h3 id="run-debug-the-extension"><strong>Run &amp; debug the extension</strong></h3><ol><li>Open the project in your editor (e.g. 
VS Code).</li><li>Open the&nbsp;<strong>Run and Debug</strong>&nbsp;activity panel item.</li><li>Select the extension run configuration.</li><li>Click the&nbsp;<strong>Play ▶️</strong>&nbsp;button.</li></ol><p>This launches a new&nbsp;<strong>Extension Development Host</strong>&nbsp;window containing a fresh project with the compiled extension loaded.</p><p>Use this window to:</p><ul><li>Interact with the extension</li><li>Test features and commands</li><li>Observe logs and errors in the Debug Console</li></ul><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/Screenshot-2026-01-05-at-11.11.17---AM-1.png" class="kg-image" alt="" loading="lazy" width="580" height="558"><figcaption><span style="white-space: pre-wrap;">Select the Run and Debug panel via Activity Panel</span></figcaption></figure><p></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2026/01/Screenshot-2026-01-05-at-11.11.49---AM.png" class="kg-image" alt="" loading="lazy" width="582" height="138"><figcaption><span style="white-space: pre-wrap;">Tap the play icon in the Run and Debug Panel to launch a new&nbsp;host with the extension</span></figcaption></figure><h3 id="typical-development-loop"><strong>Typical development loop</strong></h3><ol><li>Make code changes</li><li>npm run compile</li><li>Run / Debug the extension (▶️)</li><li>Test in the Extension Development Host</li><li>Repeat</li></ol><p>You do not need to restart your main editor—only the Extension Development Host. At first, it can be confusing when a new project window opens, but that’s the environment where the extension runs.</p><h2 id="packaging-publishing-the-part-that-was-more-annoying-than-the-code">Packaging + publishing (the part that was more annoying than the code)</h2><p>Writing the code was straightforward. 
The real effort went into publishing—supporting both VS Code and Cursor required setting up multiple accounts and navigating a maze of hoops.</p><h3 id="why-publish-to-two-marketplaces">Why publish to two marketplaces?</h3><ul><li><a href="https://marketplace.visualstudio.com/vscode?ref=danielraffel.me" rel="noreferrer"><strong>VS Code Marketplace</strong></a>&nbsp;is the main channel for Microsoft VS Code.</li><li><a href="https://open-vsx.org/?ref=danielraffel.me" rel="noreferrer"><strong>Open VSX</strong></a>&nbsp;matters because IDEs like&nbsp;<strong>Cursor</strong>&nbsp;rely on it (they can’t just use Microsoft’s marketplace directly), so publishing there makes the extension installable in more places.</li></ul><h3 id="publish-to-the-vs-code-marketplace-vsce">Publish to the VS Code Marketplace (vsce)</h3><p>I had to install Microsoft’s extension packaging tool:</p><pre><code class="language-bash">npm install -g @vscode/vsce
</code></pre><p>Then package/publish the extension:</p><pre><code class="language-bash">vsce package
vsce publish
</code></pre><p>To publish to the VS Code Marketplace, you need an Azure DevOps org + a PAT (personal access token).</p><p>The steps I followed:</p><ol><li><a href="https://dev.azure.com/?ref=danielraffel.me" rel="noreferrer">Create a free Azure DevOps account</a></li><li><a href="https://aex.dev.azure.com/me?mkt=en-US&ref=danielraffel.me" rel="noreferrer">Create an organization</a> (the UI is… quirky):</li><li><a href="https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/personal-access-token-agent-registration?view=azure-devops&ref=danielraffel.me" rel="noreferrer">Create a PAT (Personal Access Token) to publish</a></li><li>Compile → publish with&nbsp;<code>vsce</code>&nbsp;using the token when requested</li></ol><h3 id="publish-to-open-vsx-marketplace-ovsx">Publish to Open VSX Marketplace (ovsx)</h3><p>For Open VSX I installed their packaging tool and used it as follows:</p><pre><code class="language-bash">npm install -g ovsx
ovsx publish -p PAT_TOKEN</code></pre><p>Before I could publish, there was a little extra identity/ownership friction:</p><ol><li>I had to <a href="https://open-vsx.org/user-settings/extensions?ref=danielraffel.me" rel="noreferrer">create a free OpenVSX account</a></li><li><a href="https://accounts.eclipse.org/?ref=danielraffel.me" rel="noreferrer">Create a free Eclipse account</a> and agree to lots of agreements</li><li><a href="https://open-vsx.org/user-settings/profile?ref=danielraffel.me" rel="noreferrer">Sign into Open VSX using that Eclipse identity</a> and agree to more agreements</li><li><a href="https://open-vsx.org/user-settings/tokens?ref=danielraffel.me" rel="noreferrer">Generate an Access Token (to get a PAT)</a> so I could publish the extension</li><li><a href="https://open-vsx.org/user-settings/namespaces?ref=danielraffel.me" rel="noreferrer">Create a namespace</a> to publish the extension under (<a href="https://github.com/eclipse/openvsx/wiki/Namespace-Access?ref=danielraffel.me" rel="noreferrer">namespace info</a>)</li><li>To make the namespace show as “verified”, I <a href="https://github.com/EclipseFdn/open-vsx.org/issues/7167?ref=danielraffel.me" rel="noreferrer">filed an issue requesting ownership/verification</a></li><li>Compile → publish with&nbsp;<code>ovsx</code>&nbsp;using the token when requested</li></ol><h3 id="cursor-verification-extra-steps-beyond-open-vsx">Cursor verification (extra steps beyond Open VSX)</h3><p>Cursor <a href="https://cursor.com/docs/configuration/extensions?ref=danielraffel.me#publisher-verification" rel="noreferrer">adds its own verification layer</a> for Open VSX publishers so folks can search for an extension in their IDE.</p><p>What I had to do:</p><ul><li><a href="https://www.generouscorp.com/Commands/?ref=danielraffel.me" rel="noreferrer">Host a simple webpage on my own domain</a></li><li>Link to the published extensions</li><li>Ensure consistent publisher/extension IDs across marketplaces</li><li><a 
href="https://forum.cursor.com/t/extension-verification-commands-open-terminal-in-editor/148014?ref=danielraffel.me" rel="noreferrer">Request verification in their forum</a></li></ul><p>Creating all these accounts and “proving you own the publisher identity” chain was the most time-consuming part of this project.</p><h2 id="takeaways">Takeaways</h2><ul><li>Building the extension took around 10 minutes. </li><li>While there was no direct monetary cost to publishing, figuring out how to publish across the various marketplaces took longer than writing the code.</li><li>The real constraints are&nbsp;<strong>UX sandboxing</strong>&nbsp;(where you can put UI) and&nbsp;<strong>distribution identity</strong>&nbsp;(publishing + verification across ecosystems).</li><li>If you want something usable in VS Code&nbsp;<em>and</em>&nbsp;Cursor, plan for&nbsp;<strong>multiple marketplaces + verification steps</strong>.</li></ul><p>If you want to try it:</p><ul><li><a href="https://marketplace.visualstudio.com/items?itemName=GenerousCorp.commands-open-terminal-in-editor&ref=danielraffel.me" rel="noreferrer">VS Code Marketplace</a></li><li><a href="https://open-vsx.org/extension/GenerousCorp/commands-open-terminal-in-editor?ref=danielraffel.me" rel="noreferrer">Open VSX</a></li></ul><p>The source for this extension is on Github:</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/danielraffel/Commands/?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - danielraffel/Commands</div><div class="kg-bookmark-description">Contribute to danielraffel/Commands development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/pinned-octocat-093da3e6fa40-19.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">danielraffel</span></div></div><div 
class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/Commands" alt="" onerror="this.style.display = 'none'"></div></a></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I fixed a &quot;content is not valid Base64&quot; Error in n8n v2 GitHub Node ]]></title>
        <description><![CDATA[ After upgrading from n8n v1 to v2, my workflow that uploads binary files to GitHub suddenly broke. The fix was adding an Extract from File node between the binary source and the GitHub node to manually convert the binary data to Base64. ]]></description>
        <link>https://danielraffel.me/til/2025/12/25/how-i-fixed-a-content-is-not-valid-base64-error-in-n8n-v2-github-node/</link>
        <guid isPermaLink="false">694d777c8cb24c035b646d16</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 25 Dec 2025 09:53:16 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/12/C26EC1D6-3796-46B3-B818-96CD341013D4.png" medium="image"/>
        <content:encoded><![CDATA[ <p>After upgrading from n8n v1 to v2, my workflow that uploads binary files to GitHub suddenly broke with this error:</p><pre><code>Your request is invalid or could not be processed by the service
content is not valid Base64
</code></pre><h2 id="the-root-cause">The Root Cause</h2><p><a href="https://docs.n8n.io/2-0-breaking-changes/?ref=danielraffel.me#remove-in-memory-binary-data-mode" rel="noreferrer">n8n v2 has a breaking change</a> that switches the default binary data storage mode from&nbsp;<code>memory</code>&nbsp;to&nbsp;<code>filesystem</code> (<code>N8N_DEFAULT_BINARY_DATA_MODE=filesystem</code>). This improves performance for large files, but the GitHub node doesn't properly handle the conversion from filesystem-stored binary data to Base64.</p><p>In v1, binary data was stored in memory and the GitHub node could directly access and encode it. In v2, binary data is stored as file references, and the node fails to read and encode it correctly.</p><h2 id="the-fix">The Fix</h2><p>Add an&nbsp;<strong>Extract from File</strong>&nbsp;node between the binary source and the GitHub node to manually convert the binary data to Base64.</p><h3 id="before-n8n-v1worked">Before (n8n v1 - worked)</h3><pre><code>HTTP Request → GitHub Node (Binary File: ON)
</code></pre><h3 id="after-n8n-v2working-solution">After (n8n v2 - working solution)</h3><pre><code>HTTP Request → Extract from File → GitHub Node (Binary File: OFF)
</code></pre><h2 id="step-by-step-solution">Step-by-Step Solution</h2><h3 id="1-add-an-extract-from-file-node">1. Add an Extract from File node</h3><p>Insert it after whatever node produces the binary data (HTTP Request, Read Binary File, etc.) and before the GitHub node.</p><p>Configure it with:</p><ul><li><strong>Operation:</strong>&nbsp;<code>Move File to Base64 String</code> (this appears in the exported JSON below as the operation value <code>binaryToPropery</code>; the spelling is n8n's own, not a typo to fix)</li><li><strong>Input Binary Field:</strong>&nbsp;<code>data</code>&nbsp;(or whatever the binary field is named)</li></ul><h3 id="2-update-the-github-node">2. Update the GitHub node</h3><ul><li><strong>Turn OFF</strong>&nbsp;the "Binary File" toggle</li><li>Set&nbsp;<strong>File Content</strong>&nbsp;to:&nbsp;<code>={{ $json.data }}</code></li></ul><h2 id="working-n8n-v2-workflow">Working n8n v2 Workflow</h2><pre><code class="language-json">{
  "nodes": [
    {
      "parameters": {
        "method": "POST",
        "url": "YOUR_SCREENSHOT_SERVICE_URL",
        "options": {}
      },
      "name": "HTTP Request",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [400, 300]
    },
    {
      "parameters": {
        "operation": "binaryToPropery",
        "options": {}
      },
      "name": "Extract from File",
      "type": "n8n-nodes-base.extractFromFile",
      "typeVersion": 1.1,
      "position": [600, 300]
    },
    {
      "parameters": {
        "resource": "file",
        "operation": "edit",
        "owner": "YOUR_USERNAME",
        "repository": "YOUR_REPO",
        "filePath": "path/to/file.png",
        "fileContent": "={{ $json.data }}",
        "commitMessage": "Update file"
      },
      "name": "GitHub",
      "type": "n8n-nodes-base.github",
      "typeVersion": 1,
      "position": [800, 300]
    }
  ],
  "connections": {
    "HTTP Request": {
      "main": [[{"node": "Extract from File", "type": "main", "index": 0}]]
    },
    "Extract from File": {
      "main": [[{"node": "GitHub", "type": "main", "index": 0}]]
    }
  }
}
</code></pre><h2 id="key-takeaway">Key Takeaway</h2><p>The critical change is that in v2, you can no longer rely on the GitHub node to automatically handle binary-to-Base64 conversion when using filesystem storage mode. The Extract from File node bridges this gap by explicitly converting the binary data to a Base64 string before passing it to GitHub.</p><hr><p><em>Tested on n8n v2.1.4 (Self Hosted) with&nbsp;<code>binaryDataMode: filesystem</code></em></p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Why a Purple Dot Appeared in My Mac’s Menu Bar During VLC Playback and How I Fixed It ]]></title>
        <description><![CDATA[ While watching a full-screen video in VLC on my Mac, I noticed a small purple dot appear in the menu bar—something I’d never seen before. It was distracting, so I set out to disable it. ]]></description>
        <link>https://danielraffel.me/til/2025/11/29/why-a-purple-dot-appeared-in-my-macs-menu-bar-during-vlc-playback-and-how-i-fixed-it/</link>
        <guid isPermaLink="false">692ac14d4a580b0359cda39a</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sat, 29 Nov 2025 02:01:11 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/11/image-2.png" medium="image"/>
        <content:encoded><![CDATA[ <p>While watching a full-screen video in <a href="https://www.videolan.org/vlc/download-macosx.html?ref=danielraffel.me" rel="noreferrer">VLC</a> on macOS, I noticed a small purple dot appear in the menu bar, something I’d never seen before. It was distracting, so I set out to disable it. According to <a href="https://support.apple.com/en-bn/guide/mac-help/mchlp1446/mac?ref=danielraffel.me" rel="noreferrer">Apple Support</a>, a purple dot means that “the system audio is being recorded.” Naturally, I assumed VLC was responsible, especially since I had previously granted it certain permissions. I tried disabling network device access and microphone access in Privacy settings, but the dot persisted.</p><p>After some trial and error, I discovered the real cause: I had <a href="https://rogueamoeba.com/soundsource/?ref=danielraffel.me" rel="noreferrer">SoundSource</a> by Rogue Amoeba running, and it was continuously accessing audio. Quitting SoundSource instantly made the purple dot disappear.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ What’s Inside a Buddha? More Than I Ever Imagined ]]></title>
        <description><![CDATA[ Today I learned that Buddha statues—yes, even the giant ones—often have scriptures and sacred objects placed inside them. ]]></description>
        <link>https://danielraffel.me/2025/11/25/whats-inside-a-buddha-more-than-i-ever-imagined/</link>
        <guid isPermaLink="false">692594fba05b99035c341163</guid>
        <category><![CDATA[ 🧘‍♂️ Meditating ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 25 Nov 2025 04:11:44 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/11/IMG_1717.JPG" medium="image"/>
        <content:encoded><![CDATA[ <p>Today I visited the <a href="https://www.wcpalace.org/?ref=danielraffel.me" rel="noreferrer">Wangduechhoeling Palace</a> in <a href="https://en.wikipedia.org/wiki/Bumthang_Valley?ref=danielraffel.me" rel="noreferrer">Bumthang, Bhutan</a> and learned that Buddha statues—yes, even the giant ones—often have scriptures and sacred objects placed inside them. </p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/11/IMG_1717-3.JPG" class="kg-image" alt="" loading="lazy" width="2000" height="2667" srcset="https://danielraffel.me/content/images/size/w600/2025/11/IMG_1717-3.JPG 600w, https://danielraffel.me/content/images/size/w1000/2025/11/IMG_1717-3.JPG 1000w, https://danielraffel.me/content/images/size/w1600/2025/11/IMG_1717-3.JPG 1600w, https://danielraffel.me/content/images/size/w2400/2025/11/IMG_1717-3.JPG 2400w" sizes="(min-width: 720px) 720px"></figure><p>It turns out these inner contents are called a zung, or inner relic. When a statue, <a href="https://en.wikipedia.org/wiki/Stupa?ref=danielraffel.me" rel="noreferrer">stupa</a>, or religious structure is created, it’s not considered spiritually “complete” until the zung is inserted in a ceremony of consecration. This isn’t just symbolic decoration—it’s believed to imbue the object with actual spiritual power, enabling it to bless anyone who comes into contact with it.</p><p>At the heart of this process is something called the sogshing, a wooden core placed inside the hollow statue. Imagine a slender wooden pillar with a tapered top and a <a href="https://en.wikipedia.org/wiki/Vajra?ref=danielraffel.me" rel="noreferrer">vajra</a> at the base, painted a deep red and inscribed on all four sides with mantras. 
This sogshing forms the inner spine of the statue.</p><p>Wrapped around it in silk are layers of sacred items, such as:</p><ul><li>miniature statues (often of Yeshey Sempa, considered the main relic),</li><li>sacred relics,</li><li>precious metals,</li><li>gemstones.</li></ul><p>Then the surrounding space is filled meticulously with rolled mantras, sandalwood, incense powders, and other offerings. Only when everything is in place is the base sealed and the entire statue formally consecrated.</p><p>I had always thought of large Buddha statues as either solid or hollow, and quietly empty inside. Learning that many of them contain carefully arranged scripture, relics, and symbolism made me realize they’re not just objects of devotion—they’re vessels.</p><p>From now on, whenever I see one, especially the towering giants, I’ll wonder what prayers and relics are quietly sitting inside, radiating meaning outward.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Install an ESP32 NAT Router Config on an M5NanoC6 ]]></title>
        <description><![CDATA[ A friend told me about a firmware called esp32_nat_router_extended, and it turns out you can turn an ESP32 into a cheap, low-power 2.4 GHz travel router pretty easily. Here’s how I got it running on my M5NanoC6 (ESP32-C6) from macOS. ]]></description>
        <link>https://danielraffel.me/til/2025/11/19/how-to-install-an-esp32-nat-router-config-on-an-m5nanoc6/</link>
        <guid isPermaLink="false">691d239a05dfa69c49a2ef38</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 18 Nov 2025 19:45:57 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/11/FullSizeRender.jpeg" medium="image"/>
        <content:encoded><![CDATA[ <p>A friend told me about a firmware called&nbsp;<a href="https://github.com/dchristl/esp32_nat_router_extended?ref=danielraffel.me" rel="noreferrer"><strong>esp32_nat_router_extended</strong></a>, and it turns out you can turn an ESP32 into a cheap, low-power 2.4 GHz travel router pretty easily. Here’s how I got it running on my&nbsp;<a href="https://shop.m5stack.com/products/m5stack-nanoc6-dev-kit?srsltid=AfmBOoqIrKDySW25M9VNRRkPRkVISZWdAc-UANotk-32RwQl_E-LuS0m&ref=danielraffel.me" rel="noreferrer"><strong>M5NanoC6</strong></a><strong> (ESP32-C6)</strong>&nbsp;from macOS.</p><hr><h3 id="1-create-a-python-virtual-environment"><strong>1. Create a Python virtual environment</strong></h3><pre><code class="language-bash">python3 -m venv esp32-nat
cd esp32-nat
source bin/activate</code></pre><p>Using a venv keeps macOS’s system Python clean and avoids PEP 668 pip restrictions.</p><hr><h3 id="2-install-esptool-inside-the-venv"><strong>2. Install esptool inside the venv</strong></h3><pre><code class="language-bash">pip install pyserial esptool</code></pre><p>This gives you the flashing tool without touching system packages.</p><hr><h3 id="3-download-the-esp32-c6-firmware"><strong>3. Download the ESP32-C6 firmware</strong></h3><p>The original&nbsp;esp32_nat_router&nbsp;repo does&nbsp;<strong>not</strong>&nbsp;provide ESP32-C6 binaries, but the extended project does:</p><p>👉&nbsp;<strong>Releases:</strong></p><pre><code class="language-bash">https://github.com/dchristl/esp32_nat_router_extended/releases</code></pre><p>For the&nbsp;<strong>M5NanoC6</strong>, download the latest&nbsp;<strong>full</strong>&nbsp;build:</p><pre><code class="language-bash">esp32c6nat_extended_full_v7.2.0.zip</code></pre><p>Unzip it and place the&nbsp;.bin&nbsp;file into:</p><pre><code class="language-bash">~/esp32-nat/bin/esp32c6nat_extended_full_v7.2.0.bin</code></pre><h3 id="what-the-filenames-mean"><strong>What the filenames mean</strong></h3><ul><li><strong>…full…</strong>&nbsp;— contains&nbsp;<em>bootloader + partitions + firmware</em>. Use this for the&nbsp;<strong>first flash</strong>.</li><li><strong>…update…</strong>&nbsp;— incremental firmware-only packages for devices already running the extended version (OTA/web UI). Not suitable for a clean flash.</li></ul><hr><h3 id="4-put-the-m5nanoc6-into-usb-flash-mode"><strong>4. 
Put the M5NanoC6 into USB flash mode</strong></h3><ol><li>Hold the&nbsp;<strong>BOOT</strong>&nbsp;button (GPIO9)</li><li>Plug the device into your Mac over USB-C</li><li>Release the button once connected</li></ol><p>Check the serial port:</p><pre><code class="language-bash">ls /dev/cu.*</code></pre><p>You should see something like:</p><pre><code class="language-bash">/dev/cu.usbmodem2101</code></pre><p>That’s your device.</p><hr><h3 id="5-flash-the-firmware-esp32-c6-uses-a-single-combined-image"><strong>5. Flash the firmware (ESP32-C6 uses a single combined image)</strong></h3><p>From inside your working directory:</p><pre><code class="language-bash">esptool --chip esp32c6 \
  -p /dev/cu.usbmodem2101 \
  --before default-reset --after hard-reset write-flash \
  -z --flash-size detect \
0x0 bin/esp32c6nat_extended_full_v7.2.0.bin</code></pre><p>If you see “Hash of data verified,” the flash succeeded.</p><hr><h3 id="6-connect-configure-the-router"><strong>6. Connect &amp; configure the router</strong></h3><p>After rebooting, the M5NanoC6 broadcasts a Wi-Fi network:</p><p><strong>ESP32_NAT_Router</strong></p><p>Connect to it and open:</p><pre><code class="language-bash">http://192.168.4.1</code></pre><p>Configure:</p><ul><li><strong>STA</strong>&nbsp;(upstream) Wi-Fi SSID + password</li><li><strong>SoftAP</strong>&nbsp;(your private network) SSID + password</li></ul><p>Save &amp; reboot.</p><p>Even though the ESP32-C6 supports Wi-Fi 6, the NAT router firmware&nbsp;<strong>only operates a 2.4 GHz AP</strong>.</p><hr><h3 id="summary"><strong>Summary</strong></h3><p>This takes about five minutes and turns a $6 ESP32 board into something genuinely useful: a pocket-sized NAT router with a private SSID, surprisingly usable 2.4 GHz range, and low power draw. I wouldn’t rely on it as a full replacement for a more capable, multi-band travel router, but it definitely has its place, especially when all you need is to extend Wi-Fi to a low-bandwidth device.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ What I Learned from Asking ChatGPT About “Based” — Very Based of Me ]]></title>
        <description><![CDATA[ I thought it might be fun to start a new series where I share random stuff I ask ChatGPT to explain — tagged accordingly. To kick things off, here’s an unedited “Today I Learned” entry: ChatGPT explaining what “based” means. ]]></description>
        <link>https://danielraffel.me/til/2025/11/06/what-i-learned-from-asking-chatgpt-about-based-very-based-of-me/</link>
        <guid isPermaLink="false">690d22de2df5e9035b7ee41a</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 06 Nov 2025 15:57:10 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/11/20858FB3-63C6-4E1E-86F7-F40E1E4C6250.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I thought it might be fun to start a new series where I share random stuff I ask ChatGPT to explain — tagged accordingly. Sure, it might end up being a savvy SEO move, but I think it’ll also be entertaining and occasionally educational. Since this journal is really just a backup brain for me, the series will hopefully be less SEO tactic and more like a fun way for me to revisit things I've learned.</p><p>To kick things off, here’s an unedited “Today I Learned” entry: ChatGPT explaining what “based” means.</p><hr><p>The slang term&nbsp;<strong>“based”</strong>&nbsp;has evolved a lot over time, but today it generally means:</p><blockquote><strong>Being unapologetically yourself, holding your beliefs confidently, even if they’re unpopular.</strong></blockquote><p>It’s often used as the opposite of “cringe” or “trying too hard to fit in.”</p><p>Originally, “based” came from the phrase&nbsp;<em>“</em><a href="https://www.urbandictionary.com/define.php?term=basehead&ref=danielraffel.me" rel="noreferrer"><em>basehead</em></a><em>”</em>&nbsp;(a derogatory term for someone addicted to freebasing cocaine), but rapper&nbsp;<a href="https://en.wikipedia.org/wiki/Lil_B?ref=danielraffel.me" rel="noreferrer"><strong>Lil B (“The BasedGod”)</strong></a> reclaimed it to mean being true to yourself — not caring what others think. Internet culture then adopted and broadened the meaning.</p><hr><h3 id="%F0%9F%92%AC-examples"><strong>💬 Examples</strong></h3><p><strong>1. Confident authenticity</strong></p><blockquote>“She told her boss that she wouldn’t work overtime without extra pay.”<br>→ “That’s so based.”</blockquote><p><strong>Meaning:</strong>&nbsp;She stood up for herself and didn’t care about being seen as difficult — she’s grounded in her principles.</p><hr><p><strong>2. 
Contrarian opinion (but respected)</strong></p><blockquote>“He said he prefers physical books over Kindle because he likes the smell of paper.”<br>→ “Based.”</blockquote><p><strong>Meaning:</strong>&nbsp;Even though it’s old-fashioned, it’s honest and respectable.</p><hr><p><strong>3. Internet irony or humor</strong></p><p>Sometimes people use&nbsp;<em>based</em>&nbsp;sarcastically or ironically:</p><blockquote>“He eats cold pizza for breakfast every day.”<br>→ “Based behavior.”</blockquote><p><strong>Meaning:</strong>&nbsp;It’s funny and weird, but in a self-assured way — he doesn’t care what others think.</p><hr><p><strong>4. Political “based” (used across the spectrum)</strong></p><p>In politics,&nbsp;<em>based</em>&nbsp;can mean: not caring about political correctness; saying what you think even if it’s controversial.</p><p>People on the&nbsp;<strong>right</strong>&nbsp;often use it to praise blunt, anti-establishment statements.</p><p>People on the&nbsp;<strong>left or center</strong>&nbsp;sometimes reclaim it ironically to mean “boldly saying the truth.”</p><blockquote>“He said he doesn’t trust either political party because they’re both corrupt.”<br>→ “That’s based.”&nbsp;<em>(Independent-minded, not afraid to criticize the system.)</em></blockquote><blockquote>“She called out billionaires for not paying taxes.”<br>→ “Based queen.”&nbsp;<em>(Used in left-wing or ironic internet slang.)</em></blockquote><hr><p><strong>5. Meme culture “based”</strong></p><p>Online communities (like Reddit, 4chan, Twitter/X, etc.) 
popularized&nbsp;<em>based</em>&nbsp;as a meme word — part compliment, part joke.</p><p>It can mean&nbsp;<strong>cool, confident, or savage</strong>, but also&nbsp;<strong>used ironically</strong>&nbsp;to praise something absurd or over-the-top.</p><blockquote>“He unironically wears Crocs to business meetings.”<br>→ “Extremely based.”&nbsp;<em>(Ironically admiring his confidence.)</em></blockquote><blockquote>“That cat just knocked over a $500 vase.”<br>→ “Based cat.”&nbsp;<em>(Playful internet humor — rebellion framed as confidence.)</em></blockquote><hr><p><strong>6. “Based and redpilled”</strong></p><p>This phrase became popular in some online circles to mean: “Awake to the truth,” especially regarding politics, media, or society.</p><p>It comes from&nbsp;<em>The Matrix</em>’s “red pill” metaphor.</p><p>However,&nbsp;<strong>“redpilled”</strong>&nbsp;can sometimes carry extremist or conspiracy connotations depending on context.</p><p>So,&nbsp;<em>“based and redpilled”</em>&nbsp;might mean anything from&nbsp;<em>“independent thinker”</em>&nbsp;to&nbsp;<em>“aligned with fringe views.”</em></p><blockquote><strong>“He stopped watching mainstream news and started reading independent blogs.”</strong><br>→ “Based and redpilled.”&nbsp;<em>(Thinks for himself — believes he’s seeing the truth.)</em></blockquote><blockquote><strong>“She deleted all her social media accounts and started journaling instead.”</strong><br>→ “Based and redpilled.”&nbsp;<em>(Rejecting digital noise and reclaiming her attention.)</em></blockquote><blockquote><strong>“He built a backyard bunker and says he’s ready for the collapse.”</strong><br>→ “Based and redpilled.”&nbsp;<em>(Prepper mindset — possibly sliding into conspiracy territory.)</em></blockquote><blockquote><strong>“She explained how algorithms shape what we believe online.”</strong><br>→ “Based and redpilled.”&nbsp;<em>(Critical thinker calling out systemic influence.)</em></blockquote><blockquote><strong>“He insists every pop 
song hides subliminal messages.”</strong><br>→ “Based and redpilled.”&nbsp;<em>(Used ironically — someone who’s taken the idea too far.)</em></blockquote><hr>
<!--kg-card-begin: html-->
<table style="table-layout:auto; border-collapse:collapse; color:#fff; caret-color:#fff;">
  <colgroup>
    <col style="width:22%;"> 
    <col>
    <col>
  </colgroup>
  <thead>
    <tr>
      <th>Context</th>
      <th>Meaning of “based”</th>
      <th>Example</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>Slang</td>
      <td>Unapologetically authentic</td>
      <td>“She wore what she wanted — based.”</td>
    </tr>
    <tr>
      <td>Political</td>
      <td>Confidently expressing controversial beliefs</td>
      <td>“He criticized his own party — based.”</td>
    </tr>
    <tr>
      <td>Meme</td>
      <td>Humorously bold or absurd</td>
      <td>“He eats pizza with chopsticks — based move.”</td>
    </tr>
  </tbody>
</table>
<!--kg-card-end: html-->
 ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ I Got Ghosted by Mailgun (So I Built a Fix) ]]></title>
        <description><![CDATA[ Ghost self-hosters can now send newsletters with Amazon SES! I added multi-provider bulk email support to Ghost, starting with Amazon SES, making it simpler and cheaper for writers to deliver newsletters using Ghost&#39;s existing newsletter features. ]]></description>
        <link>https://danielraffel.me/2025/11/06/i-got-ghosted-by-mailgun-so-i-built-a-fix/</link>
        <guid isPermaLink="false">690a8a1469e5b4035bda00d5</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 06 Nov 2025 08:15:12 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/11/9404758C-E878-4E43-8493-431A50827C66.png" medium="image"/>
<content:encoded><![CDATA[ <blockquote><strong>TL;DR</strong>: <a href="https://www.mailgun.com/?ref=danielraffel.me" rel="noreferrer">Mailgun</a> falsely flagged me for spamming and shut down my account without an appeal process, leaving me unable to send newsletters from <a href="https://ghost.org/?ref=danielraffel.me" rel="noreferrer">Ghost</a>. That motivated me to <a href="https://github.com/TryGhost/Ghost/pull/25250?ref=danielraffel.me" rel="noreferrer">build multi-provider bulk email support</a>, starting with <a href="https://aws.amazon.com/ses/?ref=danielraffel.me" rel="noreferrer">Amazon SES</a>.</blockquote><p>Back in 2023, I started self-hosting this blog using <a href="https://ghost.org/?ref=danielraffel.me" rel="noreferrer">Ghost</a>. It was a great excuse to write more and it inspired me to learn how to use <a href="https://cloud.google.com/?ref=danielraffel.me" rel="noreferrer">Google Cloud</a>. I wanted to email posts to subscribers, and at the time, to do that (easily) with Ghost, I had <a href="https://docs.ghost.org/faq/mailgun-newsletters?ref=danielraffel.me" rel="noreferrer">one option</a>: use a <a href="https://www.twilio.com/en-us/resource-center/bulk-email-guide?ref=danielraffel.me" rel="noreferrer">bulk mail provider</a> called Mailgun. While Mailgun had a free tier, it was mentioned nowhere obvious on their site.</p><p>I remember thinking, it’s almost like they didn’t want this plan to exist. (Pin that thought.)</p><p>I <a href="https://forum.ghost.org/t/solved-mailgun-has-changed-pay-as-you-go-offer-how-to-send-cheap-emails-with-mailgun/32850/5?ref=danielraffel.me" rel="noreferrer">found a tip</a> instructing me to reach out to Mailgun support asking to downgrade my account to their pay-as-you-go plan:</p><blockquote><strong>Mailgun Support — Aug 8, 2023</strong><br>I understand that you would like to switch to the pay as you go (Flex) plan. We will check and help you on this. Thank you for your request. 
We are happy to confirm, we have successfully downgraded the account to the requested plan. </blockquote><p>I wired everything up and it worked. I sent around 70 emails to a small list. Then one day it stopped working. I don’t even remember how I noticed but I eventually found this in my Ghost logs: <strong><code>Mailgun Error 403: Domain not allowed</code></strong>. </p><p>So I opened a ticket with Mailgun and a few days later:</p><blockquote><strong>Mailgun Support — Jun 27, 2024</strong><br>Your account was automatically placed on an evaluation period enforcing limits (100 msgs/hr, max 9 recipients). To help protect our customers against spam, we have automated systems in place to flag accounts that appear suspicious. Sometimes we get it wrong… Please read our <a href="https://documentation.mailgun.com/docs/mailgun/email-best-practices/best_practices/?ref=danielraffel.me" rel="noreferrer">Email Best Practices</a>/<a href="https://www.mailgun.com/legal/aup/?ref=danielraffel.me" rel="noreferrer">AUP</a> and answer:<br>– What types of emails will you be sending?<br>– Transactional, marketing, or both?<br>– How do you source email lists/contacts and what are the URLs of these sources?<br>– Could you please provide the URLs to your Terms of Service and Privacy Policy for our review?<br>– Expected monthly volume of messages?</blockquote><p>Okay. I answered everything, assuming it was a false positive and they’d flip the switch back on. Instead, it escalated:</p><blockquote><strong>Mailgun Support — Jun 28, 2024</strong><br>After reviewing the ticket, it has been determined that we will need to engage another group of colleagues. We are transferring the ticket to them.</blockquote><p>And then:</p><blockquote><strong>Mailgun Support — Jul 1, 2024</strong><br>After reviewing your account details and activity, we’ve decided to permanently disable your Mailgun account. 
Unfortunately, we are unable to fully disclose specific reasons behind this action in order to protect our customers, internal processes, and compliance systems. Our <a href="http://www.mailgun.com/terms?ref=danielraffel.me">Terms</a> or <a href="https://www.mailgun.com/aup/?ref=danielraffel.me">AUP</a> are available on our website with more information.</blockquote><p>No appeal process. No logical explanation. No support number. I tried finding and emailing the CEO, <a href="https://www.linkedin.com/in/wconway/?ref=danielraffel.me" rel="noreferrer">William Conway</a>, who was, ironically, unreachable. After a few more pleas for mercy, I accepted my fate.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/11/image-1.png" class="kg-image" alt="" loading="lazy" width="1746" height="1466" srcset="https://danielraffel.me/content/images/size/w600/2025/11/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2025/11/image-1.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/11/image-1.png 1600w, https://danielraffel.me/content/images/2025/11/image-1.png 1746w" sizes="(min-width: 720px) 720px"></figure><h3 id="fast-forward-to-october-2025">Fast forward to October 2025</h3><p>I stumbled upon a community attempt to <a href="https://github.com/TryGhost/Ghost/pull/22771?ref=danielraffel.me" rel="noreferrer">add Postmark support</a> to Ghost. Quite a bit of time had elapsed and it didn’t seem to be progressing, but a note from Ghost’s CTO offered clear guidance that stuck with me: <em>“Make the smallest change you can that moves bulk mail toward an adapter, not built-in.”</em>&nbsp;That was the nudge I needed.</p><p>One evening I built the foundation for <a href="https://github.com/TryGhost/Ghost/pull/25250?ref=danielraffel.me" rel="noreferrer">multiple bulk email providers</a> in Ghost’s newsletter pipeline. 
A few days later, I extended it with an <a href="https://github.com/TryGhost/Ghost/pull/25367?ref=danielraffel.me" rel="noreferrer">Amazon SES adapter</a>. SES provides Ghost with a <a href="https://aws.amazon.com/ses/pricing/?ref=danielraffel.me" rel="noreferrer"><strong>highly affordable</strong></a> alternative email-sending solution.</p><p>Getting locked out of sending a newsletter on my own domain as a legitimate, small sender was demoralizing. I missed sharing my thoughts with others and am glad to have a way to reach out again.</p><p>Ghost likely has higher-priority work than adding alternate email providers for self-hosters, so I did my best to follow their guidelines.</p><p>Hopefully it’s useful to the Ghost community, and maintainers will review it and reach out to discuss getting it upstreamed. My sense is folks have <a href="https://docs.ghost.org/faq/mailgun-newsletters?ref=danielraffel.me" rel="noreferrer">wanted more bulk-mail options</a> for a while. I believe it makes Ghost more resilient and more valuable. It’s good timing too since Ghost recently added the ability to <a href="https://docs.ghost.org/admin-api/posts/email-only-posts?ref=danielraffel.me" rel="noreferrer">email a post without publishing it</a>.</p><p>I’m excited to return to a more consistent writing rhythm and share some of the projects I’ve been working on. I’ve actually been publishing pretty regularly—it just wasn’t making it to inboxes. Starting today, that changes.</p><p>And if Mailgun ever pulls the plug on your Ghost newsletter too, you’ve now got an alternative. 
In unsurprising news, <a href="https://help.mailgun.com/hc/en-us/articles/360048661093-Can-you-explain-pay-as-you-go-PAYG-billing?ref=danielraffel.me" rel="noreferrer">Mailgun has discontinued its pay-as-you-go plan</a>.</p><hr><p><strong>Update 11/25/2025</strong></p><p>I recently came across a simple but effective solution for using AWS SES instead of Mailgun for sending Ghost newsletters — <strong>without</strong> modifying Ghost’s source code:</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/exlab-code/ghost-cms-amazon-ses-adapter?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - exlab-code/ghost-cms-amazon-ses-adapter: This project provides a simple but effective solution for using AWS SES instead of Mailgun for sending Ghost newsletters, without modifying Ghost’s source code.</div><div class="kg-bookmark-description">This project provides a simple but effective solution for using AWS SES instead of Mailgun for sending Ghost newsletters, without modifying Ghost&amp;#39;s source code. - exlab-code/ghost-cms-amazon-se…</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/pinned-octocat-093da3e6fa40-13.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">exlab-code</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/ghost-cms-amazon-ses-adapter" alt="" onerror="this.style.display = 'none'"></div></a></figure><p>The approach I built takes a <strong>much</strong> more complex path. <strong>It’s designed to help Ghost migrate from Mailgun to an adapter model,</strong> making SES the first reference email adapter with built-in analytics support — something the above approach doesn’t include. Unfortunately, using my approach currently requires patching Ghost after every update. 
I do have a script to automate it, but it’s fragile and definitely a bit of a hassle. If Ghost eventually accepts the adapter approach, this should make long-term maintenance much simpler.</p><p>My approach <a href="https://github.com/TryGhost/Ghost/pull/25365?ref=danielraffel.me" rel="noreferrer">also adds open, click, and bounce tracking</a> for SES emails via a lightweight AWS pipeline:<strong> SES → SNS → SQS → Ghost</strong>&nbsp;(with polling every 5 minutes). I shared <a href="https://github.com/TryGhost/Ghost/pull/25365/files?ref=danielraffel.me#diff-7c2f909d523c373746c335823a42b3abdd7aa3bd8ae31b1e79566567e84f5943" rel="noreferrer">SES Setup</a> instructions for the faint of heart. </p><p>Great to see there are options out there to suit different needs and preferences!</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Got Ctrl+K to Work in VS Code on Windows in the Claude Code REPL ]]></title>
        <description><![CDATA[ If you&#39;re using Claude Code&#39;s REPL in VS Code on Windows and Ctrl+K isn&#39;t working to clear text from the cursor to the end of line (like it does in bash), here&#39;s the quick fix. ]]></description>
        <link>https://danielraffel.me/til/2025/10/22/how-i-got-ctrl-k-to-work-in-vs-code-on-windows-in-the-claude-code-repl/</link>
        <guid isPermaLink="false">68f85da16970720360b93306</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 21 Oct 2025 21:37:57 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/10/ChatGPT-Image-Oct-21--2025-at-09_37_22-PM.png" medium="image"/>
        <content:encoded><![CDATA[ <p>If you're using Claude Code's REPL in VS Code on Windows and <code>Ctrl+K</code> isn't working to clear text from the cursor to the end of line (like it does in bash), here's the quick fix.</p><h2 id="the-problem">The Problem</h2><p>VS Code intercepts <code>Ctrl+K</code> for its own "chord" keybindings, so it never reaches the terminal/REPL. You get a message about waiting for another key instead of clearing the line.</p><h2 id="the-solution">The Solution</h2><p>You need to override the keybinding in your User keybindings file (not the read-only Default one).</p><h3 id="step-1-open-the-correct-file">Step 1: Open the correct file</h3><ul><li>Press Ctrl+Shift+P → type <code>Preferences: Open Keyboard Shortcuts (JSON)</code></li><li>Or directly open: <code>code "%APPDATA%\Code\User\keybindings.json"</code></li></ul><h3 id="step-2-add-this-configuration">Step 2: Add this configuration:</h3><pre><code class="language-bash">  [
    {
      "key": "ctrl+k",
      "command": "-workbench.action.terminal.clear",
      "when": "terminalFocus"
    },
    {
      "key": "ctrl+k",
      "command": "workbench.action.terminal.sendSequence",
      "args": { "text": "\u000b" },
      "when": "terminalFocus"
    }
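    ,
    {
      // Optional extra binding, same sendSequence technique as above.
      // Untested assumption: only needed if another extension intercepts Ctrl+A.
      // \u0001 is readline's beginning-of-line control character (Ctrl+A in bash).
      // VS Code parses keybindings.json as JSONC, so these comments are allowed.
      "key": "ctrl+a",
      "command": "workbench.action.terminal.sendSequence",
      "args": { "text": "\u0001" },
      "when": "terminalFocus"
    }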
  ]</code></pre><h3 id="step-3-save-no-restart-needed">Step 3: Save (no restart needed)</h3><h2 id="why-this-works">Why This Works</h2><p>The key insight: you need to send the actual control character <code>\u000b</code>, not a command like <code>cls</code> or <code>clear</code>. The REPL uses readline-style keybindings where <code>Ctrl+K</code> is a character sequence, not a shell command.</p><p>Now <code>Ctrl+K</code> clears from cursor to end of line, and <code>Ctrl+U</code> clears from cursor to beginning - just like in bash!</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ GitHub Copilot’s “Free Plan Limit” Bug: A Year-Long Oversight? ]]></title>
        <description><![CDATA[ If you use GitHub Copilot on the free plan, you’ve probably seen this message: “You’ve reached your free plan limit. Your limits will reset on [date].” GitHub, if you really intend to offer a free Copilot tier, please take a look at this bug. ]]></description>
        <link>https://danielraffel.me/til/2025/10/22/github-copilots-free-plan-limit-bug-a-year-long-oversight/</link>
        <guid isPermaLink="false">68f818fa3d34db035c54c19d</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 21 Oct 2025 17:06:52 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/10/Screenshot-2025-10-21-at-5.05.42---PM.png" medium="image"/>
        <content:encoded><![CDATA[ <p>If you use GitHub Copilot on the free plan, you’ve probably seen this message:</p><blockquote>“You’ve reached your free plan limit. Your limits will reset on [date].”</blockquote><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/10/image-9.png" class="kg-image" alt="" loading="lazy" width="2000" height="1475" srcset="https://danielraffel.me/content/images/size/w600/2025/10/image-9.png 600w, https://danielraffel.me/content/images/size/w1000/2025/10/image-9.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/10/image-9.png 1600w, https://danielraffel.me/content/images/size/w2400/2025/10/image-9.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Screenshot taken on October 20, 2025</span></figcaption></figure><p>But what happens when that date comes? For many of us — nothing. </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/10/image-10.png" class="kg-image" alt="" loading="lazy" width="2000" height="1475" srcset="https://danielraffel.me/content/images/size/w600/2025/10/image-10.png 600w, https://danielraffel.me/content/images/size/w1000/2025/10/image-10.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/10/image-10.png 1600w, https://danielraffel.me/content/images/size/w2400/2025/10/image-10.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Screenshot taken on October 21, 2025</span></figcaption></figure><p><br>The reset never actually happens. Instead, the date just quietly moves forward by another month.</p><p>This bug has been discussed by hundreds of users&nbsp;<a href="https://github.com/orgs/community/discussions/149578?ref=danielraffel.me">in the GitHub Community forum</a>, some for almost a year now. 
The pattern is the same every time: free users hit the supposed “monthly limit,” wait patiently for the reset, and then find themselves still locked out. No usage, no code completions, no reset.</p><hr><h3 id="why-it-matters"><strong>Why It Matters</strong></h3><p>GitHub markets a <a href="https://docs.github.com/en/copilot/concepts/billing/individual-plans?ref=danielraffel.me#github-copilot-free" rel="noreferrer">free tier for Copilot</a>, and the documentation promises monthly resets. That’s a fair deal. But when those resets never occur — even if you haven’t used the service — it crosses from “technical bug” into “broken promise.”</p><p>To be clear, I don’t believe this is malicious or a sneaky way to drive upgrades. It’s most likely a low-priority bug that’s slipped through the cracks because <a href="https://docs.github.com/en/support/contacting-github-support/creating-a-support-ticket?ref=danielraffel.me#about-support-tickets" rel="noreferrer">free-tier users don’t get formal support</a>. Still, this bug sends the wrong message: that “free” users’ experiences don’t matter enough to fix something this visible and easily verified.</p><hr><h3 id="a-simple-ask"><strong>A Simple Ask</strong></h3><p>GitHub, if you really intend to offer a <a href="https://docs.github.com/en/copilot/concepts/billing/individual-plans?ref=danielraffel.me#github-copilot-free" rel="noreferrer">free Copilot tier</a>, please take a look at <a href="https://github.com/orgs/community/discussions/149578?ref=danielraffel.me" rel="noreferrer">this bug</a>&nbsp;— it’s been raised repeatedly by users&nbsp;(<a href="https://github.com/orgs/community/discussions/171815?ref=danielraffel.me" rel="noreferrer">here</a>, <a href="https://github.com/orgs/community/discussions/174293?ref=danielraffel.me" rel="noreferrer">here</a>, and elsewhere). If it’s been quietly deprecated, just say so. I trust this isn’t malice — it’s likely just low priority and nobody’s bothered to investigate. 
Still, it wastes people’s time, gives the wrong impression, and comes off as very sloppy. Just fix it, or say it’s no longer supported. I trust this isn’t meant as a subtle push to get people like me to reactivate our paid accounts.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Fixed Peloton’s &quot;Some information may not load&quot; Warning ]]></title>
        <description><![CDATA[ Back in August, I began noticing that whenever I started a workout, I received a warning saying, “Some information may not load.” Embarrassingly, the cause turned out to be AdGuard Home. One of my blocklists was preventing the device from connecting properly. ]]></description>
        <link>https://danielraffel.me/til/2025/10/17/how-i-fixed-pelotons-some-information-may-not-load-warning/</link>
        <guid isPermaLink="false">68f14149218e37036041c7aa</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 16 Oct 2025 20:11:11 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/10/BFAAF398-1BF4-4CD0-AA3C-277CE7E3AA5B.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Back in August, I began noticing that whenever I started a workout, I received a warning saying, “Some information may not load.” The result was that several features stopped working — cadence and resistance ranges, auto-resistance, Apple Watch heart rate syncing, workout progress, the leaderboard, and even the ability to pause the class. I tried restarting, resetting the network connection, and performing every troubleshooting step, including a full factory reset of the machine.</p><p><strong>Embarrassingly, the cause turned out to be much simpler: I was running AdGuard Home, and one of my blocklists was preventing the device from connecting properly.</strong> I’m always amazed that when I run into weird network behavior, my brain doesn’t immediately shout, “Try turning off AdGuard and see if that fixes it!” Fortunately, when it happened again last night, I actually remembered this time — I temporarily disabled AdGuard, and everything worked perfectly. </p><p>While Peloton provides a list of <a href="https://support.onepeloton.com/s/article/6428775274900-Bike-and-BikePlus-Corporate-Firewalls-Whitelist-Addresses?language=en_US&ref=danielraffel.me" rel="noreferrer">domains that need to be allowlisted</a> for the Bike to work, I found it wasn’t accurate. At the time of writing, they don’t list <code>pelotime.com</code>, which seems to be required if you want to sync your heart rate data from your Apple Watch during a class. <br><br>After some trial and error, I allowlisted just the following domains in AdGuard, and everything on my Peloton seems to work.</p><pre><code class="language-Bash">@@||onepeloton.com^$important
@@||pelotoncycle.com^$important
@@||pelotime.com^$important</code></pre> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Got My Apogee Symphony MKI Working on macOS Tahoe 26.1 ]]></title>
        <description><![CDATA[ After updating to macOS Tahoe 26.1, my Apogee Symphony I/O MK I interface stopped working over USB Audio. Using a Thunderbridge with a Thunderbolt-to-USB-C adapter got it working again. ]]></description>
        <link>https://danielraffel.me/til/2025/10/16/how-i-got-my-apogee-symphony-mki-working-on-macos-tahoe-26-1/</link>
        <guid isPermaLink="false">68f03aeb218e37036041c741</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 15 Oct 2025 17:41:27 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/10/4981172E-2B39-4C9B-91BA-364504179561.png" medium="image"/>
        <content:encoded><![CDATA[ <p>A few years ago, Apogee officially stopped supporting my <a href="https://knowledge.apogeedigital.com/legacy-symphony-i/o-mk-i-guide-for-intel-and-apple-silicon-macs?ref=danielraffel.me" rel="noreferrer">Apogee Symphony I/O MK I</a> audio interface. Thankfully, they’ve continued to respond to my occasional emails about it — they’re genuinely good people who just had to make a tough business decision.</p><p>After updating to macOS Tahoe 26.1, I found that the interface stopped working over USB Audio. For some reason, I could get audio <em>in</em>&nbsp;but not&nbsp;<em>out</em>. I was pretty bummed and tried everything I could think of to fix it. Eventually, a helpful note from Apogee support mentioned that they’d found better results using Thunderbolt instead of USB Audio, and suggested trying a <a href="https://knowledge.apogeedigital.com/symphony64-pcie-thundebridge-users-guide-documentation-and-downloads?ref=danielraffel.me" rel="noreferrer">Thunderbridge</a>, which I happen to own.</p><p>That turned out to be great advice. Yesterday, I tried setting it up but got stuck — I didn’t have a way to connect a Thunderbolt cable. Today, I remembered I actually own a <a href="https://www.apple.com/shop/product/MYH93AM/A/thunderbolt-3-usb%E2%80%91c-to-thunderbolt-2-adapter?ref=danielraffel.me" rel="noreferrer">USB-C to Thunderbolt adapter</a>. I was relieved to find that once I plugged everything in and switched the Apogee to Symphony mode, both audio input and output were working again.</p><p>Huge thanks to Apogee for taking the time to test this, share that tip, and help me squeeze a bit more life out of this very expensive, end-of-life gear. Really appreciate it.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Discovered What Was Eating My Mac’s Storage ]]></title>
        <description><![CDATA[ I ran into this problem: my 2TB SSD kept filling up at an impossible rate. No matter how much I deleted, the free space would vanish again. After clearing out every obvious file and folder I could find, I finally decided it was time to dig deeper. ]]></description>
        <link>https://danielraffel.me/til/2025/10/13/how-i-discovered-what-was-eating-my-macs-storage/</link>
        <guid isPermaLink="false">68ed4fb758969e035a380091</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 13 Oct 2025 12:35:20 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/10/CEBFEB0D-DA93-4568-8237-A3AFFE583D24.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I ran into this problem: my 2TB SSD kept filling up at an impossible rate. No matter how much I deleted, the free space would vanish again. After clearing out every obvious file and folder I could find, I finally decided it was time to dig deeper.</p><p>I started poking around in&nbsp;<a href="https://support.apple.com/guide/activity-monitor/welcome/mac?ref=danielraffel.me" rel="noreferrer"><strong>Activity Monitor</strong></a>, and in the&nbsp;<a href="https://support.apple.com/guide/activity-monitor/view-network-activity-actmntr1006/mac?ref=danielraffel.me" rel="noreferrer"><strong>Network</strong></a>&nbsp;tab I noticed something suspicious —&nbsp;<em>Data received: 22.93 GB</em>. That seemed unusually high. Based on a few online suggestions, I downloaded&nbsp;<a href="https://daisydiskapp.com/?ref=danielraffel.me" rel="noreferrer"><strong>DaisyDisk</strong></a>&nbsp;to get a clearer picture of what was going on.</p><p>Seconds after scanning, I found the culprit: the <a href="http://overcast.fm/?ref=danielraffel.me" rel="noreferrer"><strong>Overcast</strong></a>&nbsp;podcast app. </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/10/image-8.png" class="kg-image" alt="" loading="lazy" width="866" height="950" srcset="https://danielraffel.me/content/images/size/w600/2025/10/image-8.png 600w, https://danielraffel.me/content/images/2025/10/image-8.png 866w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">I immediately changed to Download Manually after noticing Overcast was the culprit.</span></figcaption></figure><p>It had automatically downloaded&nbsp;<em>every</em>&nbsp;podcast I was subscribed to, quietly consuming&nbsp;<strong>912.7 GB</strong>&nbsp;of space. 
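</p><p><em>Aside:</em> if you’d rather hunt for this kind of space hog from the terminal, a rough sketch with <code>du</code> (standard macOS container paths; DaisyDisk is what actually surfaced it in my case) looks like this:</p>

```shell
#!/bin/sh
# Rank per-app container folders by size, largest first.
# ~/Library/Containers and ~/Library/Group Containers are the
# usual homes for app-managed data (e.g. podcast downloads).
for dir in "$HOME/Library/Containers" "$HOME/Library/Group Containers"; do
  [ -d "$dir" ] || continue
  du -sk "$dir"/* 2>/dev/null | sort -rn | head -n 10 |
  while read -r kb path; do
    printf '%8d MB  %s\n' $((kb / 1024)) "$path"
  done
done
```

<p>It won’t match DaisyDisk’s visuals, but it’s enough to spot a near-terabyte folder instantly.</p><p>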
I immediately deleted the app and the&nbsp;<strong>Overcast Container</strong>&nbsp;folder, reclaiming nearly a terabyte.</p><p>I like to occasionally use Overcast on desktop, so I’ll reinstall it — but next time, I’ll switch the default download option from automatic to manual. Definitely user error on my part, but wow. A startup alert in the app about massive disk usage would’ve been appreciated.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Use Live Translation with Older AirPods (Yes, It Works!) ]]></title>
        <description><![CDATA[ Apple recently added an impressive new feature to AirPods — Live Translation — letting you hear real-time translations of in-person conversations straight through your earbuds. Even better, it also works on older AirPods Pro models (2nd and 3rd generation). ]]></description>
        <link>https://danielraffel.me/til/2025/10/13/how-to-use-live-translation-with-older-airpods-yes-it-works/</link>
        <guid isPermaLink="false">68ed397258969e035a38006f</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 13 Oct 2025 10:51:21 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/10/image-7.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Apple recently added a pretty amazing feature to AirPods —&nbsp;<a href="https://support.apple.com/en-us/123185?ref=danielraffel.me" rel="noreferrer"><strong>Live Translation</strong></a>&nbsp;— which lets you hear real-time translations of in-person conversations straight through your earbuds. Officially, Apple introduced this with the&nbsp;<a href="https://www.apple.com/airpods-4/?ref=danielraffel.me" rel="noreferrer"><strong>AirPods 4</strong></a>.</p><p>But here’s the best part: it also works on some&nbsp;<a href="https://www.apple.com/airpods/compare/?modelList=airpods-2nd-gen%2Cairpods-pro&ref=danielraffel.me" rel="noreferrer"><strong>older AirPods Pro models</strong></a>&nbsp;— specifically the&nbsp;<strong>AirPods Pro (2nd generation)</strong>&nbsp;and&nbsp;<strong>AirPods Pro (3rd generation)</strong>&nbsp;— when paired with an&nbsp;<strong>iPhone 15 Pro or later</strong>&nbsp;running the latest firmware. It’s not heavily advertised (likely because Apple wants to promote it as a new-model feature).</p><p><a href="https://support.apple.com/en-bw/guide/airpods/dev9c215ca94/web?ref=danielraffel.me">Learn more about how to set it up →</a></p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Restore Broken Geofence Automations in Apple Home ]]></title>
        <description><![CDATA[ After updating iOS, my geofence automations in the Home app stopped working. Here’s what I did to get them triggering again from my iPhone. ]]></description>
        <link>https://danielraffel.me/til/2025/10/07/how-to-restore-broken-geofence-automations-in-apple-home/</link>
        <guid isPermaLink="false">68e5719b496056035b94c2fb</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 07 Oct 2025 15:09:00 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/10/3D2B906E-512B-45F1-B485-D971888F6A43.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I use the Apple Home app for <a href="https://support.apple.com/en-us/102313?ref=danielraffel.me" rel="noreferrer">geofence automations</a>—mainly to <a href="https://danielraffel.me/2024/12/06/how-to-set-up-your-honeywell-alarm-with-apple-home-for-push-notifications-and-geofence-based-arming-disarming/" rel="noreferrer">arm and disarm my alarm</a>, among other things. After updating to iOS 26, I was disappointed to find that my automations relying on geofence triggers (like “first person home” or “last person left”) stopped working. Oddly, my partner’s device continued triggering them, but mine didn’t.</p><p>I tried a range of fixes—recreating automations, toggling Home’s location permissions, upgrading <a href="https://homebridge.io/?ref=danielraffel.me" rel="noreferrer">Homebridge</a>, and updating several third-party plugins—without success. The following steps show how I fixed Home app location automations. They almost certainly involve more toggling than needed, but this approach worked for me:</p><ol><li><strong>Go to:</strong>&nbsp;Privacy &amp; Security → Location Services → System Services<br><strong>Disable:</strong></li></ol><ul><li>Alerts &amp; Shortcuts Automations</li><li>Device Management</li><li>Home Accessories</li><li>Significant Locations &amp; Routes</li><li>Improve Location Accuracy</li></ul><ol start="2"><li><strong>Then go to:</strong>&nbsp;Privacy &amp; Security → Location Services → Home<br><strong>Disable:</strong></li></ol><ul><li>Home Location access (for both app setup and Home app use)</li></ul><ol start="3"><li><strong>Reboot your iPhone.</strong></li><li><strong>Return to:</strong>&nbsp;Privacy &amp; Security → Location Services → System Services<br><strong>Re-enable:</strong></li></ol><ul><li>Alerts &amp; Shortcuts Automations</li><li>Device Management</li><li>Home Accessories</li><li>Significant Locations &amp; Routes</li><li>Improve Location Accuracy</li></ul><figure class="kg-card 
kg-image-card"><img src="https://danielraffel.me/content/images/2025/10/image-2.png" class="kg-image" alt="" loading="lazy" width="1179" height="2556" srcset="https://danielraffel.me/content/images/size/w600/2025/10/image-2.png 600w, https://danielraffel.me/content/images/size/w1000/2025/10/image-2.png 1000w, https://danielraffel.me/content/images/2025/10/image-2.png 1179w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/10/image-3.png" class="kg-image" alt="" loading="lazy" width="1179" height="2556" srcset="https://danielraffel.me/content/images/size/w600/2025/10/image-3.png 600w, https://danielraffel.me/content/images/size/w1000/2025/10/image-3.png 1000w, https://danielraffel.me/content/images/2025/10/image-3.png 1179w" sizes="(min-width: 720px) 720px"></figure><ol start="5"><li><strong>Go back to:</strong>&nbsp;Privacy &amp; Security → Location Services → Home<br><strong>Re-enable:</strong></li></ol><ul><li>Home Location access (for both app setup and Home app use)</li></ul><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/10/image-1.png" class="kg-image" alt="" loading="lazy" width="1179" height="666" srcset="https://danielraffel.me/content/images/size/w600/2025/10/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2025/10/image-1.png 1000w, https://danielraffel.me/content/images/2025/10/image-1.png 1179w" sizes="(min-width: 720px) 720px"></figure><ol start="6"><li><strong>Reboot once more.</strong></li><li><strong>Go to:</strong>&nbsp;Privacy &amp; Security → Location Services → Home<br><strong>Re-enable:</strong></li></ol><ul><li>Home Location access (for both app setup and Home app use)</li></ul><ol start="8"><li><strong>One more iPhone reboot!</strong></li></ol><hr><blockquote>💡&nbsp;<strong>Additional Discovery: </strong>things were more mixed up than I realized</blockquote><ul><li>While I didn't change this setting, I found that my automations 
in the Home app had switched to using my&nbsp;<strong>iPad</strong>&nbsp;for location instead of my&nbsp;<strong>iPhone</strong>.</li><li>To fix it, on my iPhone I went to&nbsp;<strong>Settings → [My Apple Account] → Find My → My Location</strong>&nbsp;and changed it to&nbsp;<strong>This Device</strong>.</li></ul><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/10/image-4.png" class="kg-image" alt="" loading="lazy" width="1179" height="1371" srcset="https://danielraffel.me/content/images/size/w600/2025/10/image-4.png 600w, https://danielraffel.me/content/images/size/w1000/2025/10/image-4.png 1000w, https://danielraffel.me/content/images/2025/10/image-4.png 1179w" sizes="(min-width: 720px) 720px"></figure><p>Afterwards, I confirmed that the Home app was again using my iPhone for location in my geofence automations.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/10/image-5.png" class="kg-image" alt="" loading="lazy" width="1179" height="1015" srcset="https://danielraffel.me/content/images/size/w600/2025/10/image-5.png 600w, https://danielraffel.me/content/images/size/w1000/2025/10/image-5.png 1000w, https://danielraffel.me/content/images/2025/10/image-5.png 1179w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Restored NYTimes iOS Content Updates Blocked by AdGuard Home ]]></title>
        <description><![CDATA[ I use AdGuard Home with several block lists. Recently, the NY Times iOS app stopped updating content. After checking the logs, I noticed that samizdat-graphql.nytimes.com was being blocked. Once I unblocked it, content updates in the iOS and iPadOS apps started working again. ]]></description>
        <link>https://danielraffel.me/til/2025/09/16/how-i-restored-nytimes-ios-content-updates-blocked-by-adguard-home/</link>
        <guid isPermaLink="false">68c9db8c8db720035f60293a</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 16 Sep 2025 15:04:01 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/09/image.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I use AdGuard Home with several block lists. Recently, the NY Times iOS app stopped updating and refreshing content. The only way to fix this was to temporarily disable AdGuard. After checking the logs, I noticed that&nbsp;<code>samizdat-graphql.nytimes.com</code>&nbsp;was being blocked. Once I unblocked it, content in the iOS and iPadOS apps started updating again. <em>Sharing in case it’s useful to others.</em></p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Ghost GCP Installer and Updater Scripts Now Support Ghost v6 ]]></title>
        <description><![CDATA[ I’ve updated my Ghost updater script to fully support migrating from Ghost 5 to Ghost 6, assuming your Ghost instance was installed using the companion Ghost Google Cloud installer. ]]></description>
        <link>https://danielraffel.me/2025/08/11/ghost-gcp-updater-script-now-supports-migration-from-v5-to-v6/</link>
        <guid isPermaLink="false">689903631222b921fb42c07f</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 11 Aug 2025 14:43:28 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/08/ghost.png" medium="image"/>
        <content:encoded><![CDATA[ <blockquote>TL;DR: This lets technically savvy users self-host <a href="http://ghost.org/?ref=danielraffel.me" rel="noreferrer">Ghost</a> v6 on GCP for free, with straightforward backups and updates.</blockquote><p>I’ve updated my Ghost updater script to fully support migrating from Ghost 5 to Ghost 6, for instances installed with the companion <a href="https://github.com/danielraffel/gcloud_ghost_instancer/?ref=danielraffel.me" rel="noreferrer">Ghost Google Cloud installer</a> — which has also been updated for v6 compatibility.</p><p>The new release streamlines configuration, improves SSH handling, and ensures Node.js version selection matches Ghost’s requirements. It will back up and update your instance, and you can continue using it for future updates to Ghost v6.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/danielraffel/gCloud-Ghost-Updater?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - danielraffel/gCloud-Ghost-Updater: A user-friendly script designed to reduce downtime while updating a Ghost instance hosted on Google Cloud.</div><div class="kg-bookmark-description">A user-friendly script designed to reduce downtime while updating a Ghost instance hosted on Google Cloud. - danielraffel/gCloud-Ghost-Updater</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/pinned-octocat-093da3e6fa40-10.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">danielraffel</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/gCloud-Ghost-Updater" alt="" onerror="this.style.display = 'none'"></div></a></figure><p>While running my own upgrade, I also discovered a few changes were needed for both my nginx config and my Ghost config. I’ve documented these below. 
<strong>They require manual edits and are not included in the update script.</strong></p><p><em>Note: I've only lightly tested the updater script on macOS.</em></p><hr><h3 id="1-script-enhancements"><strong>1. Script Enhancements</strong></h3><ul><li><strong><code>.env</code>&nbsp;overrides file</strong><ul><li>Added support for&nbsp;<code>.env</code>&nbsp;overrides with sensible defaults.</li><li>Environment variables now control key parameters.</li></ul></li><li><strong>Node.js version selection</strong><ul><li>Prefers&nbsp;<code>engines.node</code>&nbsp;(major) from&nbsp;<code>current/package.json</code>&nbsp;if available.</li><li>Falls back to the Ghost-supported major from the helper script if not.</li></ul></li><li><strong>SSH improvements</strong><ul><li><code>SSH_KEY_PATH</code>&nbsp;is optional — if it doesn’t exist, SSH falls back to your agent/default identities.</li><li>All SSH/SCP calls now consistently use&nbsp;<code>${SSH_USER}</code>&nbsp;and&nbsp;<code>$SSH_IDENTITY_OPTS</code>.</li></ul></li><li><strong>New environment variables</strong><ul><li><code>SSH_USER</code>,&nbsp;<code>SSH_KEY_PATH</code>,&nbsp;<code>LOCAL_NODE_VERSION_SCRIPT</code>,&nbsp;<code>RESOURCE_POLICY_NAME</code></li><li><code>SPEEDY_MACHINE_TYPE</code>,&nbsp;<code>NORMAL_MACHINE_TYPE</code></li><li><code>SSH_CONNECT_TIMEOUT</code>,&nbsp;<code>SSH_MAX_ATTEMPTS</code>,&nbsp;<code>SSH_RETRY_SLEEP</code></li></ul></li></ul><hr><h3 id="2-nginx-configuration-changes"><strong>2. Nginx Configuration Changes</strong></h3><p>While upgrading, I could not access the new Network tab so I had to manually update my&nbsp;<strong><code>/etc/nginx/sites-available/</code></strong>&nbsp;configs for&nbsp;<strong>ActivityPub</strong>&nbsp;and&nbsp;<strong>federation endpoints</strong>. These need to be applied in both SSL and non-SSL server blocks.</p><h4 id="non-ssl-example"><strong>Non-SSL Example</strong></h4><pre><code class="language-nginx"># ActivityPub proxy (HTTP)
location ~ ^/\.ghost/activitypub/ {
    proxy_set_header X-Forwarded-For   $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Real-IP         $remote_addr;
    proxy_set_header Host              $host;
    add_header X-Content-Type-Options  nosniff;
    proxy_ssl_server_name on;
    proxy_pass https://ap.ghost.org;
}

# Well-known federation endpoints
location ~ ^/\.well-known/(webfinger|nodeinfo)$ {
    proxy_set_header X-Forwarded-For   $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Real-IP         $remote_addr;
    proxy_set_header Host              $host;
    add_header X-Content-Type-Options  nosniff;
    proxy_ssl_server_name on;
    proxy_pass https://ap.ghost.org;
}
</code></pre><h4 id="ssl-example"><strong>SSL Example</strong></h4><pre><code class="language-nginx"># ActivityPub endpoints (proxy to Ghost AP service)
location ~ ^/\.ghost/activitypub/ {
    proxy_set_header X-Forwarded-For   $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Real-IP         $remote_addr;
    proxy_set_header Host              $host;
    add_header X-Content-Type-Options  nosniff;
    proxy_ssl_server_name on;
    proxy_pass https://ap.ghost.org;
}

# Well-known federation endpoints
location ~ ^/\.well-known/(webfinger|nodeinfo)$ {
    proxy_set_header X-Forwarded-For   $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Real-IP         $remote_addr;
    proxy_set_header Host              $host;
    add_header X-Content-Type-Options  nosniff;
    proxy_ssl_server_name on;
    proxy_pass https://ap.ghost.org;
}
</code></pre><p>I <a href="https://github.com/TryGhost/Ghost-CLI/pull/1963/files?ref=danielraffel.me" rel="noreferrer">discovered these nginx conf changes</a> were added by the Ghost team <a href="https://forum.ghost.org/t/activitypub-503-and-then-404-and-401-errors-with-ghost-cli/59206/8?ref=danielraffel.me" rel="noreferrer">via</a> a post on the Ghost forum.</p><hr><h3 id="3-temporary-ghost-config-fix"><strong>3. Temporary Ghost Config Fix</strong></h3><p>During the upgrade, I discovered my install didn’t have an email service configured — which blocked staff login. As a temporary workaround <a href="https://forum.ghost.org/t/activitypub-503-and-then-404-and-401-errors-with-ghost-cli/59206/8?ref=danielraffel.me" rel="noreferrer">Cathy Sarisky</a> kindly suggested disabling staff device verification in <code>config.production.json</code>:</p><pre><code class="language-json">"security": {
  "staffDeviceVerification": false
}
</code></pre><p><strong>Note:</strong>&nbsp;This is&nbsp;<strong>not</strong>&nbsp;a long-term solution. The proper fix is to configure a working mail service.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Automate Git Commits and Checkpoints in Alex Sidebar ]]></title>
        <description><![CDATA[ I&#39;ve been using Alex Sidebar for a while for developing macOS and iOS apps. They recently added support for Markdown files. I created a file that helps ensure that checkpoints and commits occur automatically and feature development occurs on branches. ]]></description>
        <link>https://danielraffel.me/til/2025/07/24/how-to-automate-git-commits-and-checkpoints-in-alex-sidebar/</link>
        <guid isPermaLink="false">688295c4abc98b035df409b3</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 24 Jul 2025 13:37:14 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/07/ChatGPT-Image-Jul-24--2025-at-01_36_38-PM.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I've been using <a href="https://www.alexcodes.app/?ref=danielraffel.me" rel="noreferrer">Alex Sidebar</a> for a while for developing macOS and iOS apps. They recently added support for Markdown files, so I ported over some biz logic I've been using in other coding tools, and figured I'd share. This helps ensure that checkpoints and commits occur automatically and feature development occurs on branches.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://gist.github.com/danielraffel/3cb309af0d3d90108e43bccaedb78536?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Created a simple Alex Markdown that helps ensure that checkpoints and commits occur while working and feature development occurs on branches.</div><div class="kg-bookmark-description">Created a simple Alex Markdown that helps ensure that checkpoints and commits occur while working and feature development occurs on branches. - ALEX.md</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/pinned-octocat-093da3e6fa40-9.svg" alt=""><span class="kg-bookmark-author">Gist</span><span class="kg-bookmark-publisher">262588213843476</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/gist-og-image-54fd7dc0713e.png" alt="" onerror="this.style.display = 'none'"></div></a></figure><p>Note: This was inspired by <a href="https://gist.github.com/jacksondc/56e3a789e068eec9d834e96d1c44d840?ref=danielraffel.me" rel="noreferrer">Chorus.md</a></p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Run macOS 26 Tahoe Beta in a VM on Your Mac using UTM ]]></title>
        <description><![CDATA[ Apple just released the macOS 26 beta at WWDC, and if you&#39;re doing any kind of iOS or macOS development, it&#39;s a great time to spin up a test environment without risking your main system. ]]></description>
        <link>https://danielraffel.me/2025/06/12/how-to-run-macos-26-tahoe-beta-in-a-vm-on-your-mac-using-utm/</link>
        <guid isPermaLink="false">684b0c42e3faf3034ec37c3d</guid>
        <category><![CDATA[ 🧰 How To Guide ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 12 Jun 2025 10:46:39 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/06/ChatGPT-Image-Jun-12--2025-at-10_42_50-AM.png" medium="image"/>
        <content:encoded><![CDATA[ <figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/06/ChatGPT-Image-Jun-12--2025-at-10_42_50-AM-1.png" class="kg-image" alt="" loading="lazy" width="1024" height="1024" srcset="https://danielraffel.me/content/images/size/w600/2025/06/ChatGPT-Image-Jun-12--2025-at-10_42_50-AM-1.png 600w, https://danielraffel.me/content/images/size/w1000/2025/06/ChatGPT-Image-Jun-12--2025-at-10_42_50-AM-1.png 1000w, https://danielraffel.me/content/images/2025/06/ChatGPT-Image-Jun-12--2025-at-10_42_50-AM-1.png 1024w" sizes="(min-width: 720px) 720px"></figure><p>Apple just released the macOS 26 beta at WWDC, and if you're doing any kind of iOS or macOS development, it's a great time to spin up a test environment without risking your main system.</p><p>A while back, I shared a guide on <a href="https://danielraffel.me/2025/05/13/running-macos-in-a-vm-on-macos-with-utm/" rel="noreferrer">how to install macOS on macOS in a virtual machine</a> using <a href="https://mac.getutm.app/?ref=danielraffel.me">UTM</a>&nbsp;on macOS. That guide still works perfectly for macOS 26—there’s just one small extra step if you want to install the beta version.</p><p>After setting up your VM with a working version of macOS:</p><ol><li><strong>Sign in to your Apple ID using your developer account</strong>&nbsp;inside the VM.</li><li>Head to&nbsp;<strong>System Settings &gt; General &gt; Software Update</strong>.</li><li>Enable&nbsp;<strong>Beta Updates</strong>&nbsp;and choose&nbsp;<strong>macOS Developer Beta</strong>.</li></ol><p>From there, the system will detect and install the latest macOS 26 beta (a.k.a. macOS Tahoe). I’ve updated the original guide with this tweak, but wanted to call it out here separately since it’s the only real change.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Get Sweetpad to Recognize Xcode Project Files in Build Directories ]]></title>
        <description><![CDATA[ I encountered an issue using Sweetpad with an Xcode project that CMake was generating into a `build/` directory, but I resolved it by adding a simple settings.json file. ]]></description>
        <link>https://danielraffel.me/til/2025/05/31/how-to-get-sweetpad-to-recognize-xcode-project-files-in-build-directories/</link>
        <guid isPermaLink="false">683b789e8c1810034cd139bf</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sat, 31 May 2025 15:04:22 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/05/D49556B2-C501-4339-A977-CC1E5FA4BA42.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I ran into an issue while trying to use&nbsp;<a href="https://marketplace.visualstudio.com/items?itemName=sweetpad.sweetpad&ref=danielraffel.me">Sweetpad</a>&nbsp;with an Xcode project that was being generated into a&nbsp;build/&nbsp;directory by CMake.</p><p>I wanted to be able to hit&nbsp;Cmd+Shift+B&nbsp;in Sweetpad and have it run the project using one of the defined Xcode schemes.</p><p>But because the&nbsp;.xcodeproj&nbsp;was being placed inside&nbsp;build/&nbsp;— not the root folder — Sweetpad kept failing with errors like:</p><pre><code>ENOENT: no such file or directory, open '...project.xcworkspace/contents.xcworkspacedata'</code></pre><h3 id="%E2%9C%85-here%E2%80%99s-how-i-got-it-working"><strong>✅ Here’s how I got it working:</strong></h3><ol><li><strong>Create a .vscode folder at the root of the source project (not in build/)</strong>.</li><li>Inside that, add a&nbsp;settings.json&nbsp;file with this content:</li></ol><pre><code>{
  "sweetpad.workspacePath": "build/PlunderTube.xcodeproj",
  "sweetpad.build.configuration": "Debug",
  "sweetpad.build.scheme": "PlunderTube",
  "sweetpad.build.sdk": "macosx"
}</code></pre><ol start="3"><li><strong>Reset Sweetpad’s cache</strong>&nbsp;via the command palette (Cmd+Shift+P → Sweetpad: Reset Extension Cache).</li><li>Re-run CMake to regenerate the project.</li><li>After that, Cmd+Shift+B&nbsp;built the project as expected.</li></ol><p>Now Sweetpad recognizes the generated&nbsp;.xcodeproj&nbsp;and lets me build and run the Xcode scheme as expected, directly from the VS Code interface — even though everything lives inside a&nbsp;build/&nbsp;directory.</p> ]]></content:encoded>
    </item>
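As a quick follow-up to the Sweetpad fix above: before resetting Sweetpad's cache, it can help to confirm CMake actually generated the workspace file Sweetpad was failing to open. This is a minimal sketch using the example build/PlunderTube.xcodeproj path from the post; the CMake invocation mentioned in the comment is an assumption about how the project is generated, so adjust both for your setup.

```shell
# Sanity check: does the generated project exist where settings.json points?
# "build/PlunderTube.xcodeproj" matches the example above; adjust for your project.
PROJ="build/PlunderTube.xcodeproj"
if [ -f "$PROJ/project.xcworkspace/contents.xcworkspacedata" ]; then
  echo "project found"
else
  # The exact generator flags depend on your CMakeLists; this is an example.
  echo "missing: re-run CMake (e.g. cmake -B build -G Xcode) first"
fi
```

If this prints "missing", Sweetpad's ENOENT error is expected and no amount of cache-resetting will help until the project is regenerated.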
    <item>
        <title><![CDATA[ How to Start Developing Audio Plugins on macOS ]]></title>
        <description><![CDATA[ Recently I&#39;ve been playing around with making audio plugins on macOS using JUCE. I ended up putting together a few simple scripts to make it easier to quickly spin up new audio projects. ]]></description>
        <link>https://danielraffel.me/2025/05/30/how-to-start-developing-audio-plugins-on-macos/</link>
        <guid isPermaLink="false">68394a8c8c1810034cd1316a</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 29 May 2025 23:30:56 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/05/F3D01CDE-1C31-4B7F-8C9D-E521F99C0C01.png" medium="image"/>
        <content:encoded><![CDATA[ <blockquote><strong>Update 3/5/26:</strong> I've since built a Claude Code Plugin called <a href="https://www.generouscorp.com/generous-corp-marketplace/plugins/juce-dev/?ref=danielraffel.me" rel="noreferrer">juce-dev</a> that turns this template workflow into an interactive command. It handles the full&nbsp;setup, including optional GPU UI integration. <a href="https://danielraffel.me/2026/03/06/a-claude-code-plugin-for-building-juce-audio-plugins/" rel="noreferrer">Learn more</a>.</blockquote><p>Recently I've been playing around with making audio plugins on macOS using <a href="http://juce.com/?ref=danielraffel.me" rel="noreferrer">JUCE</a>. I ended up putting together a few simple scripts to make it easier to quickly spin up new audio projects.</p><p>The <a href="https://github.com/danielraffel/JUCE-Plugin-Starter?ref=danielraffel.me" rel="noreferrer">JUCE Plugin Starter repo</a> includes:</p><ul><li>A <a href="https://github.com/danielraffel/JUCE-Plugin-Starter/blob/main/dependencies.sh?ref=danielraffel.me" rel="noreferrer">script to install all required dependencies</a> outside of Xcode (like CMake and PluginVal).</li><li>A <a href="https://github.com/danielraffel/JUCE-Plugin-Starter/blob/main/README.md?ref=danielraffel.me#1-clone-the-juce-plugin-starter-template" rel="noreferrer">quick setup flow</a> that clones the repo with the template and scripts.</li><li>A <a href="https://github.com/danielraffel/JUCE-Plugin-Starter/blob/main/init_plugin_project.sh?ref=danielraffel.me" rel="noreferrer">script that walks through reinitializing and renaming the repo</a> so you can start a fresh plugin project, configure&nbsp;<code>.env</code>&nbsp;variables, and publish it to your own GitHub account.</li><li>A <a href="https://github.com/danielraffel/JUCE-Plugin-Starter/blob/main/generate_and_open_xcode.sh?ref=danielraffel.me" rel="noreferrer">script to generate the Xcode project and open it</a>, which has been helpful when making changes 
that require rebuilding the project file.</li></ul><blockquote>The simplest way to get everything running is to just copy and paste the following into your terminal. That said, these commands execute several scripts and install multiple components, so <strong>it’s wise to review the full </strong><a href="https://github.com/danielraffel/JUCE-Plugin-Starter?ref=danielraffel.me" rel="noreferrer"><strong>README</strong></a><strong> first to understand what’s happening and avoid any surprises</strong>.</blockquote><pre><code class="language-bash"># Install required tools (Xcode CLT, Homebrew, CMake, PluginVal, etc.)
bash &lt;(curl -fsSL https://raw.githubusercontent.com/danielraffel/JUCE-Plugin-Starter/main/dependencies.sh)

# Clone the starter project and set up environment
git clone https://github.com/danielraffel/JUCE-Plugin-Starter.git
cd JUCE-Plugin-Starter
cp .env.example .env

# Run the first-time setup to configure your plugin project
chmod +x ./init_plugin_project.sh
./init_plugin_project.sh

# Generate and open the Xcode project (downloads JUCE on first run)
chmod +x ./generate_and_open_xcode.sh
./generate_and_open_xcode.sh</code></pre><p>Once your plugin is built and tested, you can package it for safe, notarized distribution using the <a href="https://github.com/danielraffel/JUCE-Plugin-Starter/blob/main/scripts/sign_and_package_plugin.sh?ref=danielraffel.me" rel="noreferrer"><code>sign_and_package_plugin.sh</code></a>&nbsp;script. </p><ul><li>The script signs, notarizes, and staples each format</li><li>All formats are bundled into a&nbsp;<strong>single&nbsp;</strong><code>.pkg</code><strong>&nbsp;installer</strong></li><li>The&nbsp;<code>.pkg</code>&nbsp;is signed and notarized</li><li>Finally, the&nbsp;<code>.pkg</code>&nbsp;is included in a&nbsp;<strong>ready-to-share&nbsp;</strong><code>.dmg</code></li></ul><p>Maybe this is only useful to me as a way to prototype new ideas, but if it helps others get started, awesome. The Xcode project template the repository generates sets you up to build a <strong>Standalone App</strong>, <strong>AU plugin</strong> (for Logic, GarageBand), and <strong>VST3 plugin</strong> (for Reaper, Ableton Live, etc.).</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/danielraffel/JUCE-Plugin-Starter?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - danielraffel/JUCE-Plugin-Starter</div><div class="kg-bookmark-description">Contribute to danielraffel/JUCE-Plugin-Starter development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/pinned-octocat-093da3e6fa40-5.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">danielraffel</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://opengraph.githubassets.com/4204bc5327e9e253338877e9edd391b2ec6b072b5be8b465ba210fed6fb07e53/danielraffel/JUCE-Plugin-Starter" alt="" onerror="this.style.display = 
'none'"></div></a></figure><hr><p><em><strong>Update:</strong>&nbsp;After creating this, someone pointed me to a more feature-rich template called&nbsp;</em><a href="https://github.com/sudara/pamplejuce?ref=danielraffel.me"><em>pamplejuce</em></a><em>.</em></p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Setup macOS in a VM on macOS with UTM ]]></title>
        <description><![CDATA[ I&#39;ve been experimenting with different autonomous agents lately and I’ve started to sandbox them. When something needs to run on macOS, I’ve found UTM to be the easiest way to spin up a lightweight, isolated macOS instance on Apple Silicon. ]]></description>
        <link>https://danielraffel.me/2025/05/13/running-macos-in-a-vm-on-macos-with-utm/</link>
        <guid isPermaLink="false">682372ed4f521d035344f94c</guid>
        <category><![CDATA[ 🧰 How To Guide ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 13 May 2025 10:38:38 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/05/ChatGPT-Image-May-13--2025-at-10_37_22-AM.png" medium="image"/>
        <content:encoded><![CDATA[ <p><em>If you're a developer—or just curious about new macOS releases—this guide also works great for testing beta versions of macOS (like the newly released macOS 26 "Tahoe") inside a VM. It's a safe way to explore without touching your main system. See the bottom of this guide for instructions on enabling beta updates after setup.</em></p><ul><li><a href="#running-autonomous-agents-in-isolated-macos-environments" rel="noreferrer">Running Autonomous Agents in Isolated macOS Environments</a> </li><li><a href="#how-to-setup-macos-in-utm" rel="noreferrer">How to install macOS in UTM</a></li><li><a href="#enabling-beta-versions-of-macos-in-your-vm" rel="noreferrer">Enabling Beta Versions of macOS in Your VM</a></li><li><a href="#lessons-learned" rel="noreferrer">Lessons Learned</a></li></ul><h2 id="running-autonomous-agents-in-isolated-macos-environments">Running Autonomous Agents in Isolated macOS Environments</h2><p>Lately I’ve been experimenting with a range of autonomous agents, some of which run locally on my Mac. To keep these processes sandboxed, I’ve started isolating each one in its own containerized environment. One of the main reasons I started isolating agents in VMs is that they sometimes hijack the mouse and keyboard—making my system nearly unusable while they're running. When I need to run something on macOS in isolation, <a href="https://mac.getutm.app/?ref=danielraffel.me" rel="noreferrer">UTM</a> has been the simplest way to set up a lightweight macOS virtual machine. Just a heads-up: it only works on Apple Silicon.</p><p>UTM wraps around <a href="https://www.qemu.org/docs/master/about/index.html?ref=danielraffel.me" rel="noreferrer">QEMU</a> (a powerful, open-source emulator) but hides most of the complexity. 
You also have the option to use <a href="https://developer.apple.com/documentation/virtualization?ref=danielraffel.me" rel="noreferrer">Apple’s Virtualization framework</a>, which tends to run faster and integrates better with Apple Silicon. For macOS VMs, that’s what I use.</p><p>What I like about this setup is how quick it is to get started. UTM can boot from an existing macOS recovery partition, which means you don’t have to track down an installer unless you really want or need to. You can switch between your main system and a clean VM environment in seconds. It’s fast enough to be practical, and it doesn’t mess with the host.</p><h3 id="a-note-on-licensing">A note on licensing</h3><p>Apple allows you to run up to two virtual instances of macOS on a Mac, in addition to the main OS. As long as you’re using an official&nbsp;<code>.app</code>&nbsp;or IPSW installer and staying within that limit, you’re within the bounds of their terms. This seems aimed at supporting developers, researchers, and similar use cases—while also drawing a clear line to prevent anyone from spinning up macOS as a hosted service.</p><hr><h2 id="how-to-setup-macos-in-utm">How to Setup macOS in UTM</h2><p>If you're curious, here’s a quick overview of the setup process using <a href="https://mac.getutm.app/?ref=danielraffel.me" rel="noreferrer">UTM</a> on macOS. The screenshots below show what you'll see along the way:</p><h3 id="1-start-screen">1.&nbsp;<strong>Start Screen</strong></h3><p>When you open UTM and click&nbsp;<em>Create a New Virtual Machine (eg + UTM)</em>, you’ll see the choice between&nbsp;<strong>Virtualize</strong>&nbsp;(for native CPU architecture, faster) and&nbsp;<strong>Emulate</strong>&nbsp;(for other CPU types, slower). 
For macOS on Apple Silicon, stick with&nbsp;<strong>Virtualize</strong>.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/image-2.png" class="kg-image" alt="" loading="lazy" width="2000" height="1510" srcset="https://danielraffel.me/content/images/size/w600/2025/05/image-2.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/image-2.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/image-2.png 1600w, https://danielraffel.me/content/images/2025/05/image-2.png 2024w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="2-choose-the-os">2.&nbsp;<strong>Choose the OS</strong></h3><p>UTM offers preconfigured options. Pick&nbsp;<strong>macOS 12+</strong>.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/image-3.png" class="kg-image" alt="" loading="lazy" width="2000" height="1510" srcset="https://danielraffel.me/content/images/size/w600/2025/05/image-3.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/image-3.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/image-3.png 1600w, https://danielraffel.me/content/images/2025/05/image-3.png 2024w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="3-import-or-download-the-macos-installer">3.&nbsp;<strong>Import or Download the macOS Installer</strong></h3><p>Here you can either point to an IPSW file or let UTM fetch it from Apple automatically. 
If you have the option to continue without selecting a file, UTM will use the installer on your boot partition.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.26-AM.png" class="kg-image" alt="" loading="lazy" width="2000" height="1510" srcset="https://danielraffel.me/content/images/size/w600/2025/05/Screenshot-2025-05-13-at-9.42.26-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/Screenshot-2025-05-13-at-9.42.26-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/Screenshot-2025-05-13-at-9.42.26-AM.png 1600w, https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.26-AM.png 2024w" sizes="(min-width: 720px) 720px"></figure><blockquote>If you want or need an IPSW file, you can look here:<a href="https://ipsw.me/product/Mac?ref=danielraffel.me">https://ipsw.me/product/Mac</a>. These files usually point to Apple’s CDN, but it’s worth double-checking the URL to ensure it hasn’t been hosted and/or potentially modified by a third party.</blockquote><hr><h3 id="4-set-resources-ram-and-cpu">4.&nbsp;<strong>Set Resources (RAM and CPU)</strong></h3><p>This screen lets you assign memory and CPU cores to your VM.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.31-AM.png" class="kg-image" alt="" loading="lazy" width="2000" height="1510" srcset="https://danielraffel.me/content/images/size/w600/2025/05/Screenshot-2025-05-13-at-9.42.31-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/Screenshot-2025-05-13-at-9.42.31-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/Screenshot-2025-05-13-at-9.42.31-AM.png 1600w, https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.31-AM.png 2024w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="5-set-disk-size">5.&nbsp;<strong>Set Disk 
Size</strong></h3><p>Specify how much storage to allocate.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.35-AM.png" class="kg-image" alt="" loading="lazy" width="2000" height="1510" srcset="https://danielraffel.me/content/images/size/w600/2025/05/Screenshot-2025-05-13-at-9.42.35-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/Screenshot-2025-05-13-at-9.42.35-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/Screenshot-2025-05-13-at-9.42.35-AM.png 1600w, https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.35-AM.png 2024w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="6-summary-screen">6.&nbsp;<strong>Summary Screen</strong></h3><p>Here’s where you review the configuration. You’ll see the engine (Apple Virtualization), storage size, RAM, and the path to the IPSW installer if one is selected. <em>I recently had to have my Mac </em><a href="https://danielraffel.me/2025/04/16/when-your-mac-feels-broken-but-isnt/" rel="noreferrer"><em>re-imaged</em></a><em> at the Apple Store, so I now have a much newer installer than the one it originally shipped with.</em></p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.39-AM.png" class="kg-image" alt="" loading="lazy" width="2000" height="1510" srcset="https://danielraffel.me/content/images/size/w600/2025/05/Screenshot-2025-05-13-at-9.42.39-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/Screenshot-2025-05-13-at-9.42.39-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/Screenshot-2025-05-13-at-9.42.39-AM.png 1600w, https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.39-AM.png 2024w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="7-vm-appears-in-utm">7.&nbsp;<strong>VM Appears in UTM</strong></h3><p>Once saved, your VM
shows up in the sidebar. You can launch it any time from here. It will take a little time to install.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.45-AM.png" class="kg-image" alt="" loading="lazy" width="2000" height="1510" srcset="https://danielraffel.me/content/images/size/w600/2025/05/Screenshot-2025-05-13-at-9.42.45-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/Screenshot-2025-05-13-at-9.42.45-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/Screenshot-2025-05-13-at-9.42.45-AM.png 1600w, https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.45-AM.png 2024w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="8-macos-installer-downloading">8.&nbsp;<strong>macOS Installer Downloading</strong></h3><p>If you skipped adding an IPSW earlier, UTM begins downloading it now. The terminology might say "downloading" but it’s really copying from the boot partition. You can cancel it at any time.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.53-AM.png" class="kg-image" alt="" loading="lazy" width="2000" height="1510" srcset="https://danielraffel.me/content/images/size/w600/2025/05/Screenshot-2025-05-13-at-9.42.53-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/Screenshot-2025-05-13-at-9.42.53-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/Screenshot-2025-05-13-at-9.42.53-AM.png 1600w, https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.42.53-AM.png 2024w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="9-launch-the-vm"><strong>9.&nbsp;Launch the VM</strong></h3><p>Once the installer has fully copied over, your new macOS VM will appear in the left-hand sidebar. 
Just hit the&nbsp;<strong>play button</strong>&nbsp;to launch it.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.56.02-AM.png" class="kg-image" alt="" loading="lazy" width="2000" height="1510" srcset="https://danielraffel.me/content/images/size/w600/2025/05/Screenshot-2025-05-13-at-9.56.02-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/Screenshot-2025-05-13-at-9.56.02-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/Screenshot-2025-05-13-at-9.56.02-AM.png 1600w, https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.56.02-AM.png 2024w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="10-macos-boots-in-a-new-window"><strong>10.&nbsp;macOS Boots in a New Window</strong></h3><p>The VM will start in a separate window. From here, you can go full screen and walk through the usual macOS setup process. Once it’s up and running, you’ve got an isolated, fast macOS environment to experiment with. </p><p>Although you can enable file sharing to <a href="https://support.apple.com/guide/mac-help/set-up-file-sharing-on-mac-mh17131/mac?ref=danielraffel.me" rel="noreferrer">share folders</a> between the host and guest, direct drag-and-drop file transfer between windows is not supported. 
😔</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/05/Screenshot-2025-05-13-at-9.56.12-AM.png" class="kg-image" alt="" loading="lazy" width="2000" height="1368" srcset="https://danielraffel.me/content/images/size/w600/2025/05/Screenshot-2025-05-13-at-9.56.12-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2025/05/Screenshot-2025-05-13-at-9.56.12-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/05/Screenshot-2025-05-13-at-9.56.12-AM.png 1600w, https://danielraffel.me/content/images/size/w2400/2025/05/Screenshot-2025-05-13-at-9.56.12-AM.png 2400w" sizes="(min-width: 720px) 720px"></figure><hr><h2 id="enabling-beta-versions-of-macos-in-your-vm">Enabling Beta Versions of macOS in Your VM</h2><p>Want to run the latest macOS betas—like macOS 26 "Tahoe"—in your VM? Once you've completed the standard macOS installation and setup steps above, it's just a matter of opting into beta updates:</p><ol><li><strong>Sign in to your Apple Developer account from within the VM:</strong><ol><li>On the Mac inside the VM, go to &nbsp;<strong>Apple menu</strong>&nbsp;&gt;&nbsp;<strong>System Settings</strong>.</li><li>Click your name at the top of the sidebar.<ol><li>If you don’t see your name, click&nbsp;<strong>Sign in</strong>&nbsp;and enter your Apple Account email or phone number, then your password.</li></ol></li><li>Complete the process by clicking&nbsp;<strong>Sign in with Apple</strong>&nbsp;to authenticate your Developer account.</li></ol></li><li>Open&nbsp;<strong>System Settings &gt; General &gt; Software Update</strong>.</li><li>Click&nbsp;<strong>Beta Updates</strong>&nbsp;and select&nbsp;<strong>macOS Developer Beta</strong>.</li></ol><p>From there, the system will prompt you to download and install the latest available macOS beta. This is especially handy for developers working on macOS or iOS software who want to test against upcoming changes without risking their main environment. 
</p><p>✅&nbsp;<em>Note: This step is completely optional. If you're not working with beta software, you can skip it entirely and stick with the stable macOS release you originally installed. But, now you know how to run macOS 26 Tahoe Beta in a VM on Your Mac using UTM.</em></p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/06/1a.png" class="kg-image" alt="" loading="lazy" width="781" height="528" srcset="https://danielraffel.me/content/images/size/w600/2025/06/1a.png 600w, https://danielraffel.me/content/images/2025/06/1a.png 781w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/06/2a.png" class="kg-image" alt="" loading="lazy" width="1093" height="739" srcset="https://danielraffel.me/content/images/size/w600/2025/06/2a.png 600w, https://danielraffel.me/content/images/size/w1000/2025/06/2a.png 1000w, https://danielraffel.me/content/images/2025/06/2a.png 1093w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/06/3a.png" class="kg-image" alt="" loading="lazy" width="1093" height="739" srcset="https://danielraffel.me/content/images/size/w600/2025/06/3a.png 600w, https://danielraffel.me/content/images/size/w1000/2025/06/3a.png 1000w, https://danielraffel.me/content/images/2025/06/3a.png 1093w" sizes="(min-width: 720px) 720px"></figure><hr><p>That’s it. Once the setup is done, you’ll have a fully functional macOS environment you can run in parallel to your main system. Useful for dev testing, isolating risky processes, or just giving agents a safe sandbox to play in.</p><hr><h2 id="lessons-learned">Lessons Learned</h2><p>Here are a few tips and workarounds that have made it easier to do Xcode development inside the client VM:</p><ol><li><s>Apple prevents logging in to the App Store from virtualized VMs. 
If you need apps you’ve purchased, download them on the host and copy them over.</s> When I first tried this, I hadn't signed in to my <strong>Apple Account</strong> under <strong>&nbsp;Apple Menu&nbsp;&gt;&nbsp;System Settings</strong>. Once I did that, app downloads worked—classic oversight! 😅</li><li>Drag and drop doesn’t work. Set up shared folders in UTM. Just be thoughtful about what folders you share—if your goal is to sandbox AI tools safely, limit access on your host machine to only what’s absolutely necessary.</li><li>Clipboard copying doesn’t work. I use an app called&nbsp;<a href="https://www.ntwind.com/cross-platform/clipboard-remote.html?ref=danielraffel.me">Clipboard Remote</a>&nbsp;to transfer clipboard content between the host and the VM over the local network.</li><li>Obviously, SSH keys don’t automatically carry over. Either copy them from your host machine or generate scoped ones for the VM:</li></ol><pre><code class="language-bash">scp -rp ~/.ssh/* username@MACOS_UTM_CLIENT_IP:~/.ssh/</code></pre><ol start="5"><li>For Xcode and Command Line Tools on the client VM, I downloaded them via <a href="https://developer.apple.com/download/all/?ref=danielraffel.me">Apple’s developer site</a>.</li><li><a href="https://www.trycua.com/?ref=danielraffel.me" rel="noreferrer">c/ua</a> looks like a promising alternative to hosting macOS in UTM, but it’s still early days—I’m not quite ready to recommend it just yet unless you're a developer.</li></ol> ]]></content:encoded>
    </item>
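The lessons-learned list above mentions generating scoped SSH keys for the VM rather than copying host keys over, but only shows the scp route. A minimal sketch of the scoped alternative follows; the utm_vm filename and comment are arbitrary examples, and username@MACOS_UTM_CLIENT_IP is the same placeholder used in the post.

```shell
# Generate a key pair used only for the VM, instead of copying host keys over.
# The filename and comment below are arbitrary examples.
KEYDIR="${KEYDIR:-$HOME/.ssh}"
mkdir -p "$KEYDIR" && chmod 700 "$KEYDIR"
ssh-keygen -t ed25519 -f "$KEYDIR/utm_vm" -C "utm-vm" -N "" -q
# Install only the public half on the guest (placeholder host from the post):
# ssh-copy-id -i "$KEYDIR/utm_vm.pub" username@MACOS_UTM_CLIENT_IP
# Then connect with: ssh -i "$KEYDIR/utm_vm" username@MACOS_UTM_CLIENT_IP
```

This keeps the VM's access revocable: if the sandboxed agent misbehaves, you delete one key pair instead of rotating your main identity.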
    <item>
        <title><![CDATA[ How to Begin Piloting Autonomous AI Developer Tools ]]></title>
        <description><![CDATA[ I recently encouraged a CEO friend to try an autonomous AI coding service—the kind of tool that would’ve seemed like science fiction not long ago. What stood out in our conversation was the complexity of introducing it to a team. ]]></description>
        <link>https://danielraffel.me/2025/05/13/how-to-begin-piloting-autonomous-ai-developers/</link>
        <guid isPermaLink="false">68228b134f521d035344f487</guid>
        <category><![CDATA[ 🤔 Thinking about ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 12 May 2025 18:15:24 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/05/BAEB2A91-7608-4687-BE6C-A498ECF13280.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I recently encouraged a CEO friend to try an autonomous AI coding service—the kind of tool that would’ve seemed like science fiction not long ago.</p><p>It’s a Slack-native bot that can fully own engineering tasks. You can ask it to do things like build a feature from a Figma file, implement a product spec, or fix a bug. It handles the entire workflow: writing code, running unit tests, making revisions, and ultimately committing to your repo for human review.</p><p>The software is still in alpha and a bit rough around the edges, but it’s surprisingly capable. And while the tech itself is impressive, what stood out in our conversation was the complexity of introducing it to a team.</p><p>Unlike most developer tools that support engineers, this one can independently ship full features—even tackle complex projects without human intervention. That shift raises thorny questions: Who owns the output? Could it replace someone’s role? How should performance be measured? And how do you introduce it without triggering fear or resistance?</p><p>My suggestion:&nbsp;Treat this as a series of structured experiments on the path to potential adoption—and be transparent throughout. Start with a pilot project co-led by members of your product, design, and engineering teams. Pick something that’s not on the active roadmap: ideally, work that’s difficult to prioritize or too disruptive to fold into active development. Avoid using a marquee feature the team is already excited to build.</p><p>This lets you complete lower priority work while evaluating the tool’s value. Make it clear this isn’t a stealth effort to replace the team. 
It’s a structured trial to assess the tech’s maturity, identify what kinds of projects (if any) it's best suited for, and define roles clearly—using your preferred RACI framework or equivalent—so no one is caught off guard.</p><p>I also highlighted a near-term shift: while engineers have already embraced AI coding assistants, this exploration isn’t just about encouraging their adoption of autonomous tools. It’s about evaluating whether certain technical work—like feature development or bug fixes—can proceed with less direct involvement from engineers than has traditionally been required.</p><p>Engineers will remain essential, especially for high-complexity tasks, oversight, and code review. But their role may increasingly involve responding to work initiated by AI tools and orchestrated by team members outside of engineering. That’s a meaningful shift—one that could reshape team structure, ownership, and how work flows across the organization.</p><p>If an autonomous coding tool proves valuable and you decide to bring it into regular use, mark that moment clearly. Recognize the shift—acknowledge what’s changing, reaffirm team roles, and make space for discussion. This helps ensure the transition feels intentional, not incidental.</p><p>Eventually, I think most of us will work alongside AI teammates. For tech companies, that might look like Slack bots taking tickets, writing code, and shipping updates. That future isn’t fully here yet for most companies to adopt—but it’s getting close. Forward-thinking teams should start shaping how they evaluate and integrate these tools now. Soon, it will be hard to compete with teams that are using AI across more phases of development, and all too easy to fall behind.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Where Does the Phrase &quot;From Soup to Nuts&quot; Come From? ]]></title>
        <description><![CDATA[ I was watching a show recently when someone used the phrase &quot;soup to nuts&quot;. And even though I understood what it meant — everything from beginning to end — I suddenly realized I had no idea why it meant that. ]]></description>
        <link>https://danielraffel.me/til/2025/05/13/where-does-the-phrase-from-soup-to-nuts-come-from/</link>
        <guid isPermaLink="false">682293b44f521d035344f514</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 12 May 2025 17:47:39 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/05/7382bc159efc067e18be210c2c626cfdbe495238140e7f61f21a82c616552750.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I was watching a show recently when someone used the phrase&nbsp;<em>"soup to nuts"</em>. And even though I understood what it meant —&nbsp;<em>everything from beginning to end</em>&nbsp;— I suddenly realized I had no idea&nbsp;<strong>why</strong>&nbsp;it meant that. What were soup and nuts doing together in the same sentence?</p><p>Curious, I looked it up.</p><p>As it turns out, the expression "<a href="https://en.wiktionary.org/wiki/from_soup_to_nuts?ref=danielraffel.me#English" rel="noreferrer">soup to nuts</a>" dates back to&nbsp;<strong>formal American and British dining customs</strong>&nbsp;in the 19th and early 20th centuries. In those multi-course meals:</p><ul><li><strong>Soup</strong>&nbsp;was typically served first</li><li><strong>Nuts</strong>&nbsp;came last, often offered with port or sherry after dessert</li></ul><p>So the phrase literally referred to the&nbsp;<strong>entire meal experience</strong>, from the very first spoonful to the final nibble. Over time, it became a popular idiom meaning&nbsp;"<strong>the whole thing"</strong>&nbsp;or&nbsp;"<strong>start to finish."</strong></p><p>The phrase began showing up in print in the&nbsp;<strong>early 1900s</strong>, gradually shedding its culinary roots and finding new life in business, storytelling, and tech — anywhere someone wanted to talk about something being handled comprehensively.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ A macOS Shortcut I Somehow Missed ]]></title>
        <description><![CDATA[ I’ve been using macOS for years, but only today learned that Command + backtick lets you switch between windows within the same app. Add Shift to go the other direction. ]]></description>
        <link>https://danielraffel.me/til/2025/05/06/a-macos-shortcut-i-somehow-missed/</link>
        <guid isPermaLink="false">681905b6f0cf3f034e9890ae</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 06 May 2025 09:58:05 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/05/ChatGPT-Image-May-6--2025-at-09_50_57-AM.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I’ve used macOS for years and rely heavily on&nbsp;<code>Command</code> + <code>Tab</code>&nbsp;to switch between apps. But today I realized something simple that I expect will boost my window management productivity: within a single app—like Safari—you can cycle between open&nbsp;<strong>windows</strong>&nbsp;(not tabs) using:</p><ul><li><code>Command</code> + <code>`</code> — switch to the next window</li><li><code>Command</code> + <code>Shift</code> + <code>`</code> — switch to the previous window</li></ul><p>It works in Safari, Finder, Notes—basically any app with more than one window open. Super handy when juggling multiple Safari windows. Just wish I’d learned it earlier.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ When Your Mac Feels Broken—But Isn’t ]]></title>
        <description><![CDATA[ Recently, I started noticing that every time I rebooted my Apple Silicon MacBook Pro, I’d get stuck with the dreaded beachball. It wasn’t just a one-off. The issue persisted—even when I booted into Safe Mode. ]]></description>
        <link>https://danielraffel.me/2025/04/16/when-your-mac-feels-broken-but-isnt/</link>
        <guid isPermaLink="false">68001471f645c703533d460d</guid>
        <category><![CDATA[ 🪩 Reflecting on ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 16 Apr 2025 14:04:03 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/04/ChatGPT-Image-Apr-16--2025-at-02_02_45-PM.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Recently, I started noticing that every time I rebooted my Apple Silicon MacBook Pro, I’d get stuck with the dreaded beachball. It wasn’t just a one-off. The issue persisted—even when I booted into Safe Mode. I ran Disk Utility, which flagged and claimed to fix some issues, but the problem came back almost immediately.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/04/IMG_6457.JPG" class="kg-image" alt="" loading="lazy" width="2000" height="1500" srcset="https://danielraffel.me/content/images/size/w600/2025/04/IMG_6457.JPG 600w, https://danielraffel.me/content/images/size/w1000/2025/04/IMG_6457.JPG 1000w, https://danielraffel.me/content/images/size/w1600/2025/04/IMG_6457.JPG 1600w, https://danielraffel.me/content/images/size/w2400/2025/04/IMG_6457.JPG 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Disk utility errors on data partition</span></figcaption></figure><p>I decided to take a more drastic step. I wiped the machine completely and reinstalled macOS from scratch. Still no luck.</p><p>This MacBook Pro is four years old, maxed out with RAM and GPUs, and originally cost over $5k. The idea of needing to replace my computer right as a new round of U.S. tariffs kicked in was less than ideal. I ran Apple’s built-in diagnostics at home, and to my surprise, there were no physical hardware errors. 
But something still wasn’t right.</p><hr><h2 id="mysterious-disk-errors"><strong>Mysterious Disk Errors</strong></h2><p>I booted into Recovery Mode and ran fsck_apfs, which started throwing a bunch of low-level errors like:</p><pre><code>error: doc-id tree: record exists for doc-id #####, file-id ####### but no inode references this doc-id</code></pre><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/04/IMG_6461.JPG" class="kg-image" alt="" loading="lazy" width="2000" height="1500" srcset="https://danielraffel.me/content/images/size/w600/2025/04/IMG_6461.JPG 600w, https://danielraffel.me/content/images/size/w1000/2025/04/IMG_6461.JPG 1000w, https://danielraffel.me/content/images/size/w1600/2025/04/IMG_6461.JPG 1600w, https://danielraffel.me/content/images/size/w2400/2025/04/IMG_6461.JPG 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">fsck_apfs errors in recovery mode</span></figcaption></figure><p>These messages essentially mean that the file system had become partially corrupted at a structural level—likely deep in the APFS (Apple File System) metadata. These aren’t your everyday permission errors; they’re signs of serious inconsistencies.</p><p>I did everything I could think of: Safe Mode, Disk Utility, Terminal repairs, a full wipe. Nothing worked.</p><hr><h2 id="genius-bar-to-the-rescue"><strong>Genius Bar to the Rescue</strong></h2><p>As a last resort, I made a Genius Bar appointment at the Stonestown Apple Store. When I brought it in, the initial diagnostics (Mac Resource Inspector) showed no hardware failures. But I could still reproduce the issues—and Disk Utility errors kept showing up at home, even if the in-store tools didn’t catch them.</p><p>That’s when the Apple tech offered something I hadn’t tried (and am unsure I could do myself): they would keep the machine for 24 hours and run more advanced diagnostics. 
More importantly, they would use <strong>Apple Configurator</strong>&nbsp;to perform a&nbsp;<strong>deep-level wipe and reinstallation</strong>, which not only restores macOS but can also update low-level firmware and reset the entire APFS container from scratch.</p><p>In other words: this wasn’t just reinstalling the OS—it was a true factory reset, deeper than any tool I had access to.</p><hr><h2 id="the-result-a-brand-new-mac-almost"><strong>The Result? A Brand New Mac (Almost)</strong></h2><p>They not only reimaged the machine with the latest version of macOS Sequoia, but also wiped and rebuilt the system partitions in a way I couldn’t have done myself. After I picked it up, the machine felt brand new. The beachballs were gone. The disk errors? Completely vanished.</p><p>I’m not entirely sure how much of that process I could’ve replicated on my own—even with Terminal or booting into DFU mode—but I’m grateful they did it. Even more so because&nbsp;<strong>I’m out of warranty</strong>, and they did it all&nbsp;<strong>for free</strong>.</p><hr><h2 id="bonus-a-ride-to-remember"><strong>Bonus: A Ride to Remember</strong></h2><p>While they worked on my Mac, I took the liberty of going for a 62-mile bike ride that day. </p>
<!--kg-card-begin: html-->
<div class="strava-embed-placeholder" data-embed-type="activity" data-embed-id="14122624811" data-style="standard" data-from-embed="false"></div><script src="https://strava-embeds.com/embed.js"></script>
<!--kg-card-end: html-->
<p>The day turned out to be unexpectedly great on two counts—<strong>I got my machine back in working shape</strong>, and&nbsp;<strong>I got a ride in I might not have otherwise made time for</strong>. I picked up my Mac in my cycling clothes.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/04/image-9.png" class="kg-image" alt="" loading="lazy" width="2000" height="1347" srcset="https://danielraffel.me/content/images/size/w600/2025/04/image-9.png 600w, https://danielraffel.me/content/images/size/w1000/2025/04/image-9.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/04/image-9.png 1600w, https://danielraffel.me/content/images/2025/04/image-9.png 2124w" sizes="(min-width: 720px) 720px"></figure><p>Big thanks to the folks at the&nbsp;<a href="https://www.apple.com/retail/stonestown/?ref=danielraffel.me" rel="noreferrer">Stonestown Apple Store</a>. You took care of me and my machine when I really needed it.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ ☀️ Solar Showdown: A Friendly Energy Face-off ]]></title>
        <description><![CDATA[ Steve—my neighbor and longtime friend—and I have occasionally swapped notes on our energy use. We’ve turned our solar production and energy usage into a daily, automated solar showdown between our homes. ]]></description>
        <link>https://danielraffel.me/2025/04/16/solar-showdown-a-friendly-energy-face-off/</link>
        <guid isPermaLink="false">67ff37e3f645c703533d458b</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 16 Apr 2025 10:22:54 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/04/solarshowdown.png" medium="image"/>
        <content:encoded><![CDATA[ <figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/04/solarshowdown_640.png" class="kg-image" alt="" loading="lazy" width="640" height="640" srcset="https://danielraffel.me/content/images/size/w600/2025/04/solarshowdown_640.png 600w, https://danielraffel.me/content/images/2025/04/solarshowdown_640.png 640w"></figure><p>Steve — my longtime friend and neighbor — and I used to occasionally compare notes on our energy use. After both installing nearly identical solar setups, we no longer need to trade utility bills. Instead, we’ve turned our solar production and energy consumption into a <a href="https://www.generouscorp.com/solarshowdown-data/?ref=danielraffel.me" rel="noreferrer">daily, automated solar showdown</a> between our homes — a bit of friendly competition that not only keeps things fun, but also gets us checking our own stats more often and thinking regularly about how to be more efficient and environmentally mindful each day.</p><h3 id="%F0%9F%8E%AF-how-it-works">🎯 How It Works</h3><p>Every hour, our solar inverters send stats to a shared&nbsp;<a href="https://github.com/danielraffel/solarshowdown-data?ref=danielraffel.me">GitHub data repo</a>, via our home servers (both of us run&nbsp;<a href="https://www.proxmox.com/?ref=danielraffel.me">Proxmox</a>).</p><p>At the heart of it is the&nbsp;<a href="https://github.com/skrul/solarshowdown-api?ref=danielraffel.me">solarshowdown-api</a>&nbsp;— a lightweight Go service that pulls data from&nbsp;<a href="https://www.influxdata.com/?ref=danielraffel.me" rel="noreferrer">InfluxDB</a>, packages it up as a JSON file, and posts it directly to the data repo.</p><p>The frontend — a simple, responsive page built with vanilla JavaScript and CSS hosted on GitHub pages — reads from these JSON files and displays a live&nbsp;head-to-head dashboard&nbsp;showing:</p><ul><li>🌞 Solar Power Generated</li><li>🌿 Energy Consumed</li><li>⚡ Energy Sold to the 
Grid</li><li>🔌 Energy Imported from the Grid</li><li>🔋 Battery Discharge</li><li>⚡ Peak Solar Array Power Generation</li></ul><h3 id="%F0%9F%8F%86-daily-solar-champion">🏆 Daily Solar Champion</h3><p>Each day, the page calculates a&nbsp;Net Score&nbsp;and crowns a champion based on performance. Bonus categories add a twist:</p><ul><li><strong>Solar MVP</strong>&nbsp;(most generated)</li><li><strong>Grid Hustler</strong>&nbsp;(most sold)</li><li><strong>Energy Vampire</strong>&nbsp;(most consumed 😅)</li><li><strong>Battery Boss</strong>&nbsp;(most discharged)</li><li><strong>Peak Performer</strong>&nbsp;(highest peak)</li><li><strong>Net Champion</strong>&nbsp;(overall winner)</li></ul><p>And, if you share the link to the page, the social preview will update daily via JS — so whoever’s on top gets the spotlight!</p><h3 id="%F0%9F%93%88-about-the-project">📈 About the Project</h3><p>Want to start your own showdown? Everything is open source; the <a href="https://github.com/skrul/solarshowdown-api?ref=danielraffel.me" rel="noreferrer">backend API</a> and <a href="https://github.com/danielraffel/solarshowdown-data?ref=danielraffel.me" rel="noreferrer">frontend</a> are hosted on GitHub.</p><hr><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/04/screencapture-generouscorp-solarshowdown-data-2025-04-16-20_11_27.png" class="kg-image" alt="" loading="lazy" width="2000" height="1718" srcset="https://danielraffel.me/content/images/size/w600/2025/04/screencapture-generouscorp-solarshowdown-data-2025-04-16-20_11_27.png 600w, https://danielraffel.me/content/images/size/w1000/2025/04/screencapture-generouscorp-solarshowdown-data-2025-04-16-20_11_27.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/04/screencapture-generouscorp-solarshowdown-data-2025-04-16-20_11_27.png 1600w, https://danielraffel.me/content/images/size/w2400/2025/04/screencapture-generouscorp-solarshowdown-data-2025-04-16-20_11_27.png 
2400w" sizes="(min-width: 720px) 720px"><figcaption><a href="https://www.generouscorp.com/solarshowdown-data/?ref=danielraffel.me"><span style="white-space: pre-wrap;">Solar Showdown desktop web page</span></a></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/04/image-6.png" class="kg-image" alt="" loading="lazy" width="640" height="826" srcset="https://danielraffel.me/content/images/size/w600/2025/04/image-6.png 600w, https://danielraffel.me/content/images/2025/04/image-6.png 640w"><figcaption><span style="white-space: pre-wrap;">Sharing the site will display that day's champion</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ 🎛️ How I Installed My Apogee Symphony I/O MKI Without Using Rosetta on My Apple Silicon Mac ]]></title>
        <description><![CDATA[ After setting up a fresh Apple Silicon MacBook Pro, I decided to take a more intentional approach this time—no Rosetta 2. To get my now-sunset Apogee Symphony I/O MKI working without relying on the official installer, I built an unofficial installation script. ]]></description>
        <link>https://danielraffel.me/2025/04/10/how-i-installed-my-apogee-symphony-i-o-mki-without-using-rosetta-on-my-apple-silicon-mac/</link>
        <guid isPermaLink="false">67f82003d043df03506b9f39</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 10 Apr 2025 13:14:21 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/04/5D1461AD-CAB7-49A7-A796-C145F91C57E7.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I recently set up a fresh Apple Silicon MacBook Pro and decided to take a stricter approach this time around:&nbsp;<strong>no Rosetta 2</strong>. That meant avoiding anything that requires Intel emulation.</p><p>One particular roadblock? My&nbsp;<strong>Apogee Symphony I/O MKI</strong>, which has been sunset by Apogee. While the last software installer&nbsp;<em>technically</em>&nbsp;supports Apple Silicon, the&nbsp;<strong>official installer still requires Rosetta</strong>&nbsp;to run. And since&nbsp;<strong>there’s no way to uninstall Rosetta</strong>&nbsp;once it’s on your system, I didn’t want to install it just for the sake of running a legacy&nbsp;.pkg&nbsp;file.</p><p>So… I didn’t.</p><hr><h4 id="%F0%9F%9B%A0%EF%B8%8F-what-i-did-instead"><strong>🛠️ What I Did Instead</strong></h4><p>I used&nbsp;<a href="https://www.mothersruin.com/software/SuspiciousPackage/?ref=danielraffel.me">Suspicious Package</a>&nbsp;— a fantastic little tool that lets you inspect and extract files from&nbsp;.pkg&nbsp;installers — to manually extract the application files, plug-ins, daemons, and kernel extensions from the Apogee installer DMG.</p><p>Even better, Suspicious Package let me inspect the installer’s&nbsp;<strong>postflight script</strong>, which revealed exactly what the official installer does — like creating the&nbsp;.plist&nbsp;file that registers the Symphony system and I/O components. This gave me confidence that I could fully replicate the install process with a shell script. 
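To illustrate the general shape of that kind of script, here is a minimal dry-run sketch. Every path and file name in it is a hypothetical placeholder, not the actual payload of Apogee's installer; the real logic lives in the scripts linked below.

```shell
#!/bin/sh
# Sketch of a manual installer that mimics a .pkg postflight script.
# NOTE: all file and directory names below are made-up placeholders
# for illustration, not Apogee's real payload.

DRY_RUN=1            # 1 = only print what would happen; 0 = run via sudo
SRC="./extracted"    # folder where Suspicious Package extracted the payload

run() {
  if [ "$DRY_RUN" -eq 1 ]; then
    echo "would run: $*"
  else
    sudo "$@"
  fi
}

# Place the app, kernel extension, and launch daemon where the
# official installer would put them.
run cp -R "$SRC/Apogee Maestro.app" "/Applications/"
run cp -R "$SRC/ApogeeSymphony.kext" "/Library/Extensions/"
run cp "$SRC/com.apogee.symphonyd.plist" "/Library/LaunchDaemons/"

# Nudge macOS to rescan kernel extensions, then start the daemon,
# much as a postflight script would.
run touch "/Library/Extensions"
run launchctl load "/Library/LaunchDaemons/com.apogee.symphonyd.plist"
```

With `DRY_RUN=1` the script only prints the commands, which makes it easy to review every step before anything runs with `sudo`; flipping it to `0` executes them.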
</p><p>From there, I wrote:</p><ul><li>An <a href="https://github.com/danielraffel/symphony-mki-installer-sans-rosetta/blob/main/install_apogee_manual.sh?ref=danielraffel.me" rel="noreferrer">installer script</a> that mimics the official installer (minus Rosetta)</li><li>An <a href="https://github.com/danielraffel/symphony-mki-installer-sans-rosetta/blob/main/uninstall_apogee_manual.sh?ref=danielraffel.me" rel="noreferrer">optional uninstaller</a> that removes all files and settings cleanly</li></ul><p>The process involves a few extra steps, like manually placing extracted files and enabling “Reduced Security” mode to allow the legacy drivers, but it works great — and it avoids installing anything unnecessary.</p><hr><h4 id="%F0%9F%94%8D-for-anyone-else-running-into-this%E2%80%A6"><strong>🔍 For Anyone Else Running Into This…</strong></h4><p>If you’re a fellow&nbsp;<strong>Symphony MKI</strong>&nbsp;user on an Apple Silicon Mac and want to avoid Rosetta too, I documented everything and shared the scripts here:</p><p>👉&nbsp;<a href="https://github.com/danielraffel/symphony-mki-installer-sans-rosetta?ref=danielraffel.me"><strong>GitHub Repo: symphony-mki-installer-sans-rosetta</strong></a></p><p>The repo includes:</p><ul><li>A step-by-step guide</li><li>Download links for the official installer</li><li>Screenshots of what to expect</li><li>Detailed instructions for using Suspicious Package to extract only what you need</li></ul><p>⚠️&nbsp;<strong>Important Note</strong>: This project does not redistribute Apogee software — just scripts and instructions. 
You’ll need to download the official installer yourself and extract files locally.</p><hr><h4 id="%F0%9F%99%8C-credit-where-it%E2%80%99s-due"><strong>🙌 Credit Where It’s Due</strong></h4><p>Big thanks to the developer of&nbsp;<a href="https://www.mothersruin.com/software/SuspiciousPackage/?ref=danielraffel.me">Suspicious Package</a>&nbsp;— it’s one of those tools you don’t realize you need until you&nbsp;<em>really </em>need it. It made this process possible. And thanks to Apogee for enabling Apple Silicon support for this audio interface before sunsetting support. They certainly didn’t need to and I’m very appreciative that they did. 👏</p><hr><h4 id="%E2%9A%A0%EF%B8%8F-disclaimer"><strong>⚠️ Disclaimer</strong></h4><p>This is a community workaround.</p><p>I’m not affiliated with Apogee, and this project isn’t endorsed by them.</p><p>Everything is shared&nbsp;<strong>as-is</strong>, without warranties. Use it if it helps — and use it carefully.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Set Up Ghost on My Mac—With Minimal Global Dependencies ]]></title>
        <description><![CDATA[ Recently, I set up a new Mac and wanted to be more intentional about managing installations, especially with Node packages. Having faced headaches from global dependency conflicts before, particularly when developing Ghost themes for this blog, I aimed to keep my system clean and conflicts minimal. ]]></description>
        <link>https://danielraffel.me/til/2025/04/03/how-i-set-up-ghost-on-my-mac-with-minimal-global-dependencies/</link>
        <guid isPermaLink="false">67ef023831b6e50353bde571</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 03 Apr 2025 15:01:04 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/04/ChatGPT-Image-Apr-3--2025-at-02_58_27-PM2.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Recently, I set up a new Mac and wanted to be more intentional about managing installations, especially with Node packages. Having faced headaches from global dependency conflicts before, particularly when developing Ghost themes for this blog, I aimed to keep my system clean and conflicts minimal.</p><p>This time, I decided to use&nbsp;<code>nvm</code>&nbsp;(Node Version Manager) to locally manage Node.js (v20, recommended by Ghost). Using&nbsp;<code>nvm</code>&nbsp;allows each project directory to have its own specific Node version, avoiding clashes. To streamline this, I created a <a href="https://github.com/danielraffel/ghost-dev-install-macos/blob/main/setup-ghost.sh?ref=danielraffel.me" rel="noreferrer">simple script</a> that sets everything up automatically:</p><ul><li>Installs Node.js v20 with&nbsp;<code>nvm</code></li><li>Sets up Ghost locally</li><li>Adds local development tools like&nbsp;<code>yarn</code>&nbsp;and&nbsp;<code>gscan</code></li></ul><p><a href="https://github.com/danielraffel/ghost-dev-install-macos?tab=readme-ov-file&ref=danielraffel.me#-bonus-auto-use-nvmrc-node-version-on-cd" rel="noreferrer">Optional manual step</a>: Automatically switches to the correct Node version when you enter the directory (enabled via an addition to your&nbsp;<code>.zshrc</code>).</p><p>Now, when I want to remove or reset the setup, it's as simple as deleting the project folder—<a href="https://github.com/danielraffel/ghost-dev-install-macos?tab=readme-ov-file&ref=danielraffel.me#-how-do-i-clean-up-everything" rel="noreferrer">minimal residue left behind to clean up</a>.</p><p>If you’re curious or want to use the same setup, check out the&nbsp;<a href="https://github.com/danielraffel/ghost-dev-install-macos?ref=danielraffel.me">repo on GitHub</a>.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Developing iOS Apps in Cursor with Sweetpad (Yes, It Works) ]]></title>
        <description><![CDATA[ For the past few months, I’ve been using Sweetpad, a macOS extension designed for VS Code-based editors like Cursor. It’s made Apple app development significantly more pleasant by enabling me to build native iOS, watchOS, and macOS apps directly in Cursor—without always having to use Xcode. ]]></description>
        <link>https://danielraffel.me/2025/04/01/developing-ios-apps-in-cursor-with-sweetpad-yes-it-works/</link>
        <guid isPermaLink="false">67ec725831b6e50353bde4df</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 01 Apr 2025 16:34:16 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/04/ChatGPT-Image-Apr-1--2025-at-04_23_54-PM.png" medium="image"/>
        <content:encoded><![CDATA[ <p><strong>Update:</strong> I <a href="https://github.com/danielraffel/SwiftCatalyst?ref=danielraffel.me" rel="noreferrer">recently created Swift Catalyst</a>—a modern SwiftUI template designed to help you start building quickly with an AI coding tool like Cursor. It uses VIPER architecture, supports hot reloading in the simulator, and includes custom Swift coding rules along with a meta-rule for generating new Cursor rules. The project is pre-configured with Sweetpad, XcodeGen, SwiftLint, and works with <a href="https://github.com/johnno1962/InjectionNext?ref=danielraffel.me" rel="noreferrer">InjectionNext</a>—the latest evolution of InjectionIII—to streamline development and maintain consistency. </p><hr><h4 id="why-i-rarely-open-xcode-anymore">Why I Rarely Open Xcode Anymore</h4><p>For the past few months, I’ve been using&nbsp;<a href="https://sweetpad.hyzyla.dev/?ref=danielraffel.me" rel="noreferrer">Sweetpad</a>, a macOS extension that integrates with VS Code-based editors like <a href="https://cursor.com/?ref=danielraffel.me" rel="noreferrer">Cursor</a>. It’s completely changed how I approach Apple app development. I can now build native iOS, watchOS, and macOS apps directly in Cursor—and deploy them from the terminal—without constantly relying on Xcode.</p><p>That’s not to say I never touch Xcode anymore—but I do so far less frequently.</p><hr><h4 id="how-my-workflow-changed">How My Workflow Changed</h4><p>With Sweetpad, I can:</p><ul><li>Select my app target</li><li>Choose a <a href="https://sweetpad.hyzyla.dev/docs/simulators/?ref=danielraffel.me" rel="noreferrer">simulator</a> or <a href="https://sweetpad.hyzyla.dev/docs/devices?ref=danielraffel.me" rel="noreferrer">real device</a></li><li>Build and deploy—all from within Cursor</li></ul><p>This means less context switching and more time using Cursor’s powerful AI code models and smarter completions. 
I've found it to be a productivity win.</p><hr><h4 id="custom-keyboard-shortcuts-in-cursor">Custom Keyboard Shortcuts in Cursor</h4><p>I've mapped a few shortcuts in Cursor’s terminal for everyday tasks:</p><ul><li><code>Cmd+Shift+K</code>&nbsp;→ Clean builds</li><li><code>Cmd+Shift+B</code>&nbsp;→ Build &amp; run</li></ul><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/04/image-2.png" class="kg-image" alt="" loading="lazy" width="884" height="394" srcset="https://danielraffel.me/content/images/size/w600/2025/04/image-2.png 600w, https://danielraffel.me/content/images/2025/04/image-2.png 884w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Custom Keyboard Shortcuts</span></figcaption></figure><p>Once triggered in the terminal, Cursor lets me pick the destination device (simulator or real), streamlining testing.</p><hr><h4 id="hot-reload-with-injectioniii">Hot Reload with InjectionIII</h4><p>For UI iteration, I use&nbsp;<a href="https://github.com/johnno1962/InjectionIII?ref=danielraffel.me">InjectionIII</a>, which adds hot-reloading to SwiftUI views. It lets me see changes without doing a full redeploy—huge time-saver for fiddly frontend tweaks.</p><hr><h4 id="where-i-learned-this">Where I Learned This</h4><p>I picked up these tools from the&nbsp;<a href="https://youtu.be/s7BVmsZSmWQ?t=1402&ref=danielraffel.me" rel="noreferrer">Cursor Masterclass</a>&nbsp;by Rudrank Riyam and Ray Fernando. I expected the Cursor + Sweetpad workflow to feel hacky—but I was genuinely impressed. It’s smooth, modern, and lets me stay inside a developer-friendly, AI-enhanced environment.</p><hr><h4 id="my-swift-rules-for-cursor">My Swift Rules for Cursor</h4><p>I’ve published a&nbsp;<a href="https://github.com/danielraffel/SwiftRules/blob/main/swiftrules.md?ref=danielraffel.me" rel="noreferrer">Cursor rules file</a>&nbsp;(Markdown) for Swift iOS development. 
It includes:</p><ul><li>Sweetpad build support</li><li>InjectionIII hot reload integration</li><li><a href="https://medium.com/@pinarkocak/understanding-viper-pattern-619fa9a0b1f1?ref=danielraffel.me" rel="nofollow">VIPER architecture</a> scaffolding</li><li>SwiftUI best practices</li></ul><p><em>Note: These rules are heavily inspired by the&nbsp;</em><a href="https://www.rayfernando.ai/swift-cursor-rules?ref=danielraffel.me" rel="noreferrer"><em>Swift Cursor Rules</em></a><em>&nbsp;from Ray Fernando &amp; Lou Zell and include some verbatim text with attribution.</em></p><hr><h4 id="when-i-still-use-xcode">When I Still Use Xcode</h4><p>As my projects grow more complex—like handling shared data in app groups across iOS and watchOS—I still turn to Xcode for things like:</p><ul><li>Debugging multiple simulators</li><li>Managing entitlements</li><li>Deep integration testing</li></ul><p>I've also been experimenting with&nbsp;<a href="https://alexcodes.app/?ref=danielraffel.me" rel="noreferrer">Alex Sidebar</a>, a tool that brings AI assistance alongside Xcode. It's a <strong>huge</strong> step up from what Xcode offers out of the box and offers many of the benefits of using Cursor with Sweetpad.</p><hr><h4 id="final-thoughts">Final Thoughts</h4><p>For someone new to Swift, tools like Sweetpad and Cursor have significantly streamlined Apple app development. I’d love to see Apple adopt AI-native workflows that match the quality of these tools. In the meantime, they’ve been instrumental in helping me build quickly and efficiently.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How To Hide &quot;Disable Your Adblocker&quot; Pages with Safari Hide Distracting Items ]]></title>
        <description><![CDATA[ Several years back, I set up AdGuard Home primarily to encrypt my DNS traffic. Along the way, I realized that blocking all those trackers also meant I stopped seeing most ads. Lately, though, more and more websites are catching on and throwing up “please disable your ad blocker” messages. ]]></description>
        <link>https://danielraffel.me/til/2025/04/01/how-to-hide-disable-your-adblocker-pages-with-safari-hide-distracting-items/</link>
        <guid isPermaLink="false">67ec633d31b6e50353bde4ad</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 01 Apr 2025 15:32:06 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/04/ChatGPT-Image-Apr-1--2025-at-03_19_47-PM.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Several years back, I set up AdGuard Home primarily to encrypt my DNS traffic. Along the way, I realized that blocking all those trackers also meant I stopped seeing most ads. I’ve gotten pretty used to that—mixed feelings aside, it definitely makes for a much cleaner and more enjoyable reading experience.</p><p>Lately, though, more and more websites are catching on and throwing up “please disable your ad blocker” messages. Out of curiosity, I tried something today on my iPhone: I opened the page in Mobile Safari, tapped “<a href="https://support.apple.com/en-la/120682?ref=danielraffel.me" rel="noreferrer">Hide Distracting Items</a>,” selected the overlay—and just like that, it too disappeared. While my testing has been limited, the change on iOS and iPadOS appears to be persistent.</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe width="200" height="113" src="https://www.youtube.com/embed/b5p466cgt_Y?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen="" title="April 1, 2025"></iframe><figcaption><p dir="ltr"><span style="white-space: pre-wrap;">iOS experience using </span><a href="https://support.apple.com/en-la/120682?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">Hide Distracting Items</span></a></p></figcaption></figure><p>On macOS, the experience is a bit different. 
When attempting to remove the same content, a disclaimer appears: “Hiding distracting items will not permanently remove ads and other content that update frequently.”</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/04/image-1.png" class="kg-image" alt="" loading="lazy" width="1804" height="844" srcset="https://danielraffel.me/content/images/size/w600/2025/04/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2025/04/image-1.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/04/image-1.png 1600w, https://danielraffel.me/content/images/2025/04/image-1.png 1804w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">macOS experience using </span><a href="https://support.apple.com/en-la/120682?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">Hide Distracting Items</span></a></figcaption></figure><p>The “Hide Distracting Items” feature seems to apply only to the device where it’s used—it isn’t synced across iOS, iPadOS, and macOS. So if you want to hide the same element on a page across all your devices, you’ll need to do it separately on each one.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Learning to Play with the Machines ]]></title>
        <description><![CDATA[ In high school, I experimented with making music on an Akai sampler. I was fascinated by the endless ways to tweak audio, and it felt revolutionary—much like AI does to me today. While I find AI exhilarating, I recognize that playing with it doesn&#39;t sound like fun to everyone. ]]></description>
        <link>https://danielraffel.me/2025/03/27/learning-to-play-with-the-machines/</link>
        <guid isPermaLink="false">67e45c44b08d15034cc298d4</guid>
        <category><![CDATA[ 🤔 Thinking about ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 26 Mar 2025 17:04:14 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/03/ChatGPT-Image-Mar-26--2025-at-05_00_24-PM.png" medium="image"/>
        <content:encoded><![CDATA[ <p>When I was in high school, I first experimented with an Akai sampler, marveling at the multitude of ways to manipulate audio. I learned how to slice a waveform, stretch a note, layer sounds, and apply filters and EQ to replay and drastically change the tonal characteristics. Each tweak had the potential to generate something unique and captivating—a sound never heard before.</p><p>Using samplers to mangle audio reminds me of the creative possibilities available using AI today. Just as it was possible to process sounds into something new, it’s possible to combine data and algorithms using AI to generate images, video, text, and code that not only surprises but has its own distinct essence. However, while these possibilities excite me, for others, venturing into the realm of AI doesn’t sound like much fun.</p><h4 id="current-barriers-to-adoption">Current Barriers to Adoption</h4><p>Let’s start with the obvious: the world is heavy right now. Between political drama, economic instability, and an endless stream of global uncertainty, even people who are doing “fine” are often just trying to get through the day. Their focus is largely divided between managing daily responsibilities and seeking relaxation. Engaging with new technology demands cognitive and emotional resources that are often depleted.</p><h4 id="the-emotional-weight">The Emotional Weight</h4><p>There’s a lot of emotional baggage that comes with AI. For some, it doesn’t feel exciting—it feels dystopian. Instead of being empowering, it feels like it’s eroding the very things that give people purpose, security, or even joy. It can feel like a threat—not just to people’s livelihoods and craft, but to their sense of identity and their legal rights. 
For others, playing with tech just isn’t that interesting—nor is it how they ever wanted to spend their time.</p><h4 id="when-first-impressions-fail">When First Impressions Fail</h4><p>Even when people do try out new AI tools, the initial experience often isn’t great. I’ve demoed tools like <a href="https://www.cursor.sh/?ref=danielraffel.me">Cursor</a>—which I imagine would be an absolute gift for many folks working at tech companies—and discovered some never opened it again. Using an IDE can be daunting if you’re not accustomed to spending your days in one, so it’s perfectly reasonable for people to move on if they don’t quickly connect with it.</p><p>Cursor—and many other AI tools—are evolving at breakneck speed. This rapid development not only introduces genuine innovation but also brings bugs, broken features, and unpredictable changes. Getting the most out of these tools requires frequent use and unconventional thinking, best achieved through playful tinkering. This approach can <a href="https://ghuntley.com/stdlib/?ref=danielraffel.me" rel="noreferrer">uncover new ways to utilize the technology</a> that might not be immediately obvious.</p><p>For those accustomed to stability and who already excel in their current roles, these tools might initially seem like a step backward. Similarly, if you’ve previously struggled with learning technical skills, it’s easy to feel that these tools aren’t suitable—especially now, when time and energy to learn something new are particularly scarce. Fortunately, there is a wealth of tutorials available online, both free and paid, that can help bridge this gap quickly. 
These resources are designed to flatten the learning curve, offering step-by-step guidance that can accelerate your proficiency and confidence in using these tools.</p><h4 id="diverse-reactions">Diverse Reactions</h4><p>In my circles, I’ve noticed a divide: generalists tend to be more inclined to explore and experiment, while specialists tend to be more skeptical—often focused on where the tools fall short.</p><p>For example, most of the experienced engineers I know are used to picking up new technologies in a very deliberate way: reading the docs, diving into the language, working through tutorials. They’re thoughtful and methodical—and usually not interested in engaging with something in a half-baked manner. When AI spits out buggy code in a language they’ve mastered, it’s not just frustrating—it feels amateur and they don't want any part of it.</p><p>On the other hand, I’ve also seen folks who are newer to a field, or simply less technical overall, who just want to jump in and&nbsp;<em>make</em>&nbsp;things. For them, AI is a creative unlock. It allows them to accomplish in a short amount of time what used to take weeks or months—work they didn’t have the time, patience, or ambition to tackle before. They’re not chasing perfection—they’re chasing momentum. For them, AI unlocks capabilities they haven't developed on their own. Whether the code is well-written or not doesn’t matter—they don’t know, and frankly, don’t care.</p><p>Both of these reactions, and a spectrum in between, are valid. One person sees AI as a toy or a threat. Another may see it as the beginning of <a href="https://www.youtube.com/watch?v=rRQCusvG96k&ref=danielraffel.me" rel="noreferrer">Something Big</a>. 
<a href="https://danielraffel.me/2023/10/21/general-purpose-technology/" rel="noreferrer">General purpose technologies</a> reshape entire industrial sectors, so it’s no surprise they spark strong, polarized responses.</p><p>For those with deep experience, choosing not to engage with tools that feel unpolished isn’t wrong—it’s often a sign of discernment and maturity. But this moment also calls for <a href="https://en.wikipedia.org/wiki/Shoshin?ref=danielraffel.me" rel="noreferrer">shoshin</a>—a beginner’s mind. A mindset rooted in curiosity, openness, and a willingness to explore new ways of working, even when things are still rough around the edges.</p><h4 id="how-to-build-intuition">How to Build Intuition</h4><p>No matter where you stand, one thing is clear: these tools aren’t going anywhere. For those expecting plug-and-play solutions, it might be surprising to realize that their real power lies in unlocking entirely new ways of thinking and working.</p><p>Using AI tools effectively often requires reimagining familiar workflows, adapting to shortcuts, and embracing alternative paths to the same outcome—sometimes faster, sometimes messier. Doing your best work in this new landscape requires not just learning how and when to use these tools, but also being open to how they can reshape your creative or professional process entirely. As you navigate this new landscape, don’t be surprised if you find yourself chasing your tail at times.</p><p>That kind of adaptation takes trial and error: testing things out, hitting walls, stepping away, and coming back later. It’s less about judging a tool’s value in the moment, and more about building the muscle to understand where it fits, how it works, and when it truly shines. Have fun with it. 
Be playful.</p><p>Simon Willison’s ongoing experiments prompting vision models to generate images of a “<a href="https://simonwillison.net/tags/pelican-riding-a-bicycle/?ref=danielraffel.me" rel="noreferrer">pelican riding a bicycle</a>” are a great example. He’s using a lighthearted, familiar prompt to consistently test what the latest models can do. This approach to monitoring advancements in state-of-the-art image generation models has been both delightful and engaging.</p><p>Consider staying open—even if your first experience wasn’t great. I’m not suggesting you throw the same problem at every new model or force AI into everything you do. But it’s worth returning with fresh challenges and noticing what works and what doesn’t. Over time, you’ll build intuition—for when to lean in, when to walk away, and which tools are the right fit for the job. More often than not, what you discover will surprise you.</p><h4 id="corporate-resistance">Corporate Resistance</h4><p>AI has the potential to <a href="https://lg.substack.com/p/the-death-of-product-development?ref=danielraffel.me" rel="noreferrer">reshape how we build software products</a>. But embracing that potential isn’t just an individual challenge—it’s a cultural one.</p><p>As one experienced product manager recently put it (paraphrasing):</p><blockquote><em>"</em>AI can and should transform how we build software. But many companies—especially larger ones—aren’t embracing that shift. There’s still a strong preference for specialists, even though AI increasingly rewards generalists who can move quickly, adapt, and experiment. In this new era of software development, I find that especially striking. Over the past two years of interviewing and job searching, I’ve consistently seen employers leaning in the opposite direction—which, to me, feels short-sighted and like a missed opportunity.<em>"</em></blockquote><p>I responded:</p><blockquote><em>"Totally. 
Mature companies tend to suffer from continuity bias, and many employees are incentivized to protect their existing roles. In contrast, startups thrive on risk-taking—there’s a cultural push to embrace change, experiment, and bring others along. In bigger organizations, that kind of mindset often gets crushed by the existing culture before it can gain traction."</em></blockquote><p>The current climate favors curiosity, boldness, and speed over caution and rigidity. Succeeding in this kind of environment isn’t just about mindset—it’s about making things happen. With access to tools that offer real leverage, the advantage goes to those who are willing to experiment, move fast, and turn possibilities into reality. It means getting out of your comfort zone, trying new approaches, taking smart risks, and staying open to change.</p><h4 id="what-lies-ahead">What Lies Ahead</h4><p>I get that this perspective isn’t for everyone—and I don’t expect universal agreement. I generally lean towards optimism, even though I’m aware that I am likely underestimating the potential drawbacks. However, one thing is certain: the transition towards widespread use of AI is already in progress. It’s not slowing down, and I’m genuinely excited about what it enables me to do today and in the near future—even if not everyone feels the same.</p><p>Over the next 1–3 years, I expect coding models will outperform most top engineers when it comes to writing high-quality code. We’re nearing a point where a generalist can prompt their way to fully functional software products—front end, back end, design, marketing, and deployment—without needing deep expertise in any one domain.</p><p>That’s a significant shift. It doesn’t replace the need for experts or devalue craftsmanship—but it&nbsp;<em>does</em>&nbsp;raise the bar for what individuals can build on their own. 
And it’s likely to reshape how we work and where we choose to apply our time, energy, and human judgment.</p><p>For me, it’s an exciting time to adapt—technically, emotionally, and creatively—and I’ve chosen to lean in. I’m enjoying exploring how to use these tools not just to work smarter, but to <a href="https://www.youtube.com/watch?v=nR4JAonAR4g&ref=danielraffel.me" rel="noreferrer">play, experiment, and express myself</a> in new ways. </p><p>That’s my goal—to enjoy the process, remain curious, and explore new ways to play with the machines.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ I Searched for Earthquake Info. Siri Sent Me to a Website Last Updated in 2019. ]]></title>
        <description><![CDATA[ Tonight, I felt an earthquake and wanted to learn more about it. So, I opened Safari and searched for “Bay Area Earthquake.” Siri suggested a website with a seismic tracker that hadn&#39;t been updated in nearly six years. ]]></description>
        <link>https://danielraffel.me/2025/03/18/i-searched-for-an-earthquake-siri-sent-me-to-a-website-from-2019/</link>
        <guid isPermaLink="false">67d8f2f959afc1034baecbfd</guid>
        <category><![CDATA[ 🤔 Thinking about ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 17 Mar 2025 21:54:18 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/03/DALL-E-2025-03-17-21.52.45---A-minimalist--graphic-illustration-in-the-style-of-Paul-Rand-featuring-a-smartphone-screen-displaying-a-search-bar-with-the-text--Bay-Area-Earthquake.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Tonight, I felt an earthquake and wanted to learn more about it. So, I opened Safari and searched for “Bay Area Earthquake.” Siri <a href="https://abc7news.com/bay-area-earthquake-tracker/25012/?ref=danielraffel.me" rel="noreferrer">suggested a website</a>, which I clicked on.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/03/image-1.png" class="kg-image" alt="" loading="lazy" width="1234" height="240" srcset="https://danielraffel.me/content/images/size/w600/2025/03/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2025/03/image-1.png 1000w, https://danielraffel.me/content/images/2025/03/image-1.png 1234w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Siri Suggested Website in Safari on macOS</span></figcaption></figure><p>I expected real-time data, but what I got was a&nbsp;<strong>seismic tracker that hadn’t been updated in nearly six years.</strong></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/03/image-2.png" class="kg-image" alt="" loading="lazy" width="1186" height="1180" srcset="https://danielraffel.me/content/images/size/w600/2025/03/image-2.png 600w, https://danielraffel.me/content/images/size/w1000/2025/03/image-2.png 1000w, https://danielraffel.me/content/images/2025/03/image-2.png 1186w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Bay Area ABC News Earthquake Tracker (from before Covid-19)</span></figcaption></figure><p>Lately, Apple’s AI efforts have faced plenty of criticism, but this highlights an even deeper issue—they’re struggling with the basics. 
Industry leaders mastered search and real-time information retrieval decades ago, yet Apple still fails to return relevant results for simple queries.</p><p>As AI and large language models redefine how we access knowledge, the next wave of consumer AI will require systems that deliver accurate, thoughtful, and context-aware responses. If Apple is already this far behind on foundational tech, competing with AI-first companies—moving at an unprecedented pace—will be an uphill battle.</p><p><strong>Something drastic needs to change.</strong></p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Tracking Accountability in the New SF Mayor’s Office ]]></title>
        <description><![CDATA[ I reached out to the new San Francisco Mayor&#39;s Office to ask how I can monitor the progress of their initiatives. After all, “You can’t manage what you don’t measure.” I haven’t received a response yet, but if and when I do, I’ll share what I learn, as I definitely intend to keep following up. ]]></description>
        <link>https://danielraffel.me/2025/03/12/tracking-accountability-in-the-new-sf-mayors-office/</link>
        <guid isPermaLink="false">67d1df6f34c727034ccb55d1</guid>
        <category><![CDATA[ 🌁 San Francisco ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 12 Mar 2025 12:39:57 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/03/DALL-E-2025-03-12-12.36.25---A-Paul-Rand-inspired-minimalist-illustration-representing-civic-engagement-and-government-transparency.-The-image-features-a-stylized-depiction-of-San.png" medium="image"/>
        <content:encoded><![CDATA[ <p>On February 20, 2025, I reached out to the new San Francisco Mayor's Office to ask how I can monitor the progress of their initiatives. I recognize that it's an early-stage administration, and I certainly don’t expect major results right away, but establishing measurable goals and maintaining transparency with constituents seems like the best way to stay focused and accountable. After all, “You can’t manage what you don’t measure.” I haven’t heard back yet, but I plan to follow up and will share any updates if and when I receive a response.</p><blockquote>Dear Mayor Lurie,<br><br>As a long-time resident of San Francisco, I have been closely observing the ongoing changes across our city. I voted for you, and while I appreciate the initiatives your administration has begun to implement to enhance safety, security, and overall quality of life, I find myself wondering how best to measure these improvements.<br><br>I frequently travel through areas such as 6th Street, Market Street, and the Tenderloin, and I have noticed that many of these neighborhoods still appear quite challenging. Obviously, I don’t expect change overnight. To better understand the tangible impact of your administration’s efforts, could you please advise if there is a dedicated dashboard or a set of statistics that residents can regularly monitor? Ideally, I am looking for a data-driven resource that ties the various initiatives and improvements directly to measurable outcomes across city services.<br><br>Your guidance would not only help me gauge the progress being made but also bolster community confidence in the positive changes underway. 
Thank you for your commitment to our city and for considering this request.</blockquote><p><strong>Update 1:</strong> After several weeks without a response, I followed up again on March 10, 2025.</p><blockquote>I was hoping someone from your office could direct me to a resource where I can track the performance of your initiatives. Could you let me know if there’s anything public-facing, if something is in the works, or if no such resource exists? I’d really appreciate a clear answer to this request. Thanks!</blockquote><p>This time, I received an immediate automated reply to my email.</p><blockquote>﻿Thank you for reaching out to me. I greatly value hearing from members of our community and appreciate you taking the time to connect.<br><br>Due to the large number of inquiries my office receives, it may take several days for us to respond. My team and I are committed to reviewing every message and will be in touch as soon as we are able.<br><br>If you need more immediate assistance, you may want to direct your inquiry to the appropriate person on my team:<br><br><strong>City Services:</strong>&nbsp;Please call 311 or visit&nbsp;<a href="https://www.sf.gov/topics/311-online-services?ref=danielraffel.me">https://www.sf.gov/topics/311-online-services</a><br><br><strong>Press Inquiries:</strong>&nbsp;For media and press-related questions, please email&nbsp;<a href="mailto:mayorspressoffice@sfgov.org">mayorspressoffice@sfgov.org</a><br><br><strong>Proclamation and Commendation Requests:</strong>&nbsp;If you are requesting a proclamation or commendation, please contact&nbsp;<a href="mailto:commendations@sfgov.org">commendations@sfgov.org</a><br><br><strong>Scheduling Requests:</strong>&nbsp;To invite me to an event or&nbsp;request&nbsp;a meeting, please reach out to&nbsp;<a href="mailto:scheduling@sfgov.org">scheduling@sfgov.org</a><br><br><strong>Appointments and Board Opportunities:</strong>&nbsp;For information on appointments or serving on boards, please 
email&nbsp;<a href="mailto:mayor.appointments@sfgov.org">mayor.appointments@sfgov.org</a><br><br>Thank you for your patience and understanding as my team works to respond as quickly as possible. I am committed to a responsive city government and to serving you and our community.<br><br>Sincerely,<br><br>Daniel Lurie<br>Mayor of San Francisco</blockquote><p><strong>Update 2:</strong>&nbsp;Still trying to get a response to my request. I reached out again on March 21, 2025.</p><blockquote>I wanted to follow up on the note I sent a few weeks ago on March 10th—you mentioned it might take a few days to get back to me but it’s now been a month since my initial outreach on Feb 20th.<strong>&nbsp;I am hoping someone from your office could direct me to a resource where I can track the performance of your initiatives.</strong><br><br>If there’s anything public-facing, in development, or if nothing like that currently exists, I’d really appreciate a clear answer either way. And if there’s someone else I should be speaking with about this, I’d be grateful for a quick redirect.<br><br>Thank you!</blockquote> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Fixing Safari’s “This Webpage Reloaded” Error on macOS with AdGuard Home DNS ]]></title>
        <description><![CDATA[ Recently, I made a small change in AdGuard Home’s DNS settings that might have resolved an issue impacting Safari on macOS. I switched from making single DNS requests to parallel requests. Since then, I haven’t encountered this message: “This webpage reloaded because a problem occurred.” ]]></description>
        <link>https://danielraffel.me/til/2025/03/11/fixing-safaris-this-webpage-reloaded-error-on-macos-with-adguard-home-dns/</link>
        <guid isPermaLink="false">67d0797134c727034ccb5568</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 11 Mar 2025 11:16:13 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/03/DALL-E-2025-03-11-11.15.05---A-minimalist--abstract-illustration-inspired-by-Paul-Rand.-The-design-features-a-stylized-computer-screen-with-the-Safari-browser-showing-an-error-mes.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I’ve been using AdGuard Home as a DNS relay with Google Nest WiFi (2nd Gen), and over the past year, I noticed that Safari on macOS would occasionally stop loading pages, displaying a vague error message:&nbsp;<em>“This webpage reloaded because a problem occurred.”</em>&nbsp;After that, all future requests would lead to a blank page. It did not help to flush the DNS cache and restart Safari with: <code>sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder</code>.</p><p>Interestingly, Chrome remained unaffected and continued to function normally, even when Safari refused to load pages. The only solution I found to get Safari to load pages once it entered this state was to restart my machine.</p><p>Recently, I made a small change in AdGuard Home’s DNS settings that might have resolved the issue. Under&nbsp;<em>Upstream DNS servers</em>, I switched from making single DNS requests to parallel requests. Since then, I haven’t encountered the Safari problem. If the issue returns, I’ll update this post, but for now, this adjustment appears to have fixed my issue. 🤞</p><p>Hopefully, this helps others with a similar setup who are experiencing the same Safari issue!</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/03/image.png" class="kg-image" alt="" loading="lazy" width="2000" height="1133" srcset="https://danielraffel.me/content/images/size/w600/2025/03/image.png 600w, https://danielraffel.me/content/images/size/w1000/2025/03/image.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/03/image.png 1600w, https://danielraffel.me/content/images/2025/03/image.png 2330w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Building a Custom GPT for Developer-Ready Specs ]]></title>
        <description><![CDATA[ After reading Harper’s post on leveraging LLMs for brainstorming and generating actionable plans and specs, I decided to build a custom GPT using the prompts he outlined in step one. ]]></description>
        <link>https://danielraffel.me/2025/02/21/building-a-custom-gpt-for-developer-ready-specs/</link>
        <guid isPermaLink="false">67b7c4a191cc38034cfaeb20</guid>
        <category><![CDATA[ 👀 Discovering ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 20 Feb 2025 17:03:28 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/02/DALL-E-2025-02-20-17.01.13---A-Paul-Rand-inspired-illustration-of-a-creative-workspace-with-abstract-shapes-and-bold-typography.-The-image-features-a-modern-desk-with-a-laptop-dis.png" medium="image"/>
        <content:encoded><![CDATA[ <p>After reading <a href="https://harper.blog/2025/02/16/my-llm-codegen-workflow-atm/?ref=danielraffel.me" rel="noreferrer">Harper’s post</a> on leveraging LLMs for brainstorming and generating actionable plans and specs, I decided to build a <a href="https://chatgpt.com/g/g-67b61eb87b908191b95fcc0845d4c30c-brainstorming-ideas?ref=danielraffel.me" rel="noreferrer">custom GPT</a> using the prompts outlined in step one.</p><p>In the event the GPT ever disappears, here's the prompt I used:</p><blockquote>This GPT asks me one question at a time so we can develop a thorough, step-by-step spec for a new idea I am exploring. Each question should build on my previous answers, and our end goal is to have a detailed specification I can hand off to a developer. Let’s do this iteratively and dig into every relevant detail. Remember, only one question at a time.</blockquote><blockquote>Our brainstorm should end naturally. Once we’ve wrapped up the brainstorming process, or I ask, please compile our findings into a comprehensive, developer-ready specification in a file called <code>spec.md</code>. Include all relevant requirements, architecture choices, data handling details, error handling strategies, and a testing plan so a developer can immediately begin implementation.</blockquote><blockquote>Start by asking me about my idea.</blockquote><p>In step two, he suggested feeding the output into a reasoning model with additional prompts. It’s a great idea. For those specific commands, check out his post, and if you plan to use this workflow regularly, consider setting up <a href="https://danielraffel.me/til/2024/12/31/how-to-easily-summarize-an-existing-chat-with-claude-before-starting-a-new-one/" rel="noreferrer">keyboard shortcuts</a> based on the prompts he outlined.</p><p>Nice workflow, <a href="https://bsky.app/profile/harper.lol/post/3lidixzdr5j2e?ref=danielraffel.me" rel="noreferrer">Harper</a>! 
👏</p><hr><p><em><strong>Note:</strong> I have absolutely no idea why a decent percentage of the time the GPT says it cannot be found. 😬 Refreshing seems to help, as does using the GPT in a browser.</em></p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/02/Screenshot-2025-02-20-at-5.11.43-PM.png" class="kg-image" alt="" loading="lazy" width="756" height="138" srcset="https://danielraffel.me/content/images/size/w600/2025/02/Screenshot-2025-02-20-at-5.11.43-PM.png 600w, https://danielraffel.me/content/images/2025/02/Screenshot-2025-02-20-at-5.11.43-PM.png 756w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Why Gemini Getting its Own iOS App is a Sign of Product Maturity at Google ]]></title>
        <description><![CDATA[ I recently received an email from Google announcing that Gemini now has its own dedicated iOS app—no longer bundled within the Google Search app. I see this app separation as a sign that Google is evolving and applying lessons from past missteps—a positive indicator for the future! ]]></description>
        <link>https://danielraffel.me/2025/02/20/why-gemini-getting-its-own-ios-app-is-a-sign-of-product-maturity-at-google/</link>
        <guid isPermaLink="false">67b664295947a10348f159b7</guid>
        <category><![CDATA[ 🤔 Thinking about ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 19 Feb 2025 16:30:03 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/02/DALL-E-2025-02-19-16.25.55---An-abstract--modernist-illustration-in-the-style-of-Paul-Rand--symbolizing-the-breakup-of-Google-s-Gemini-and-Search-apps.-The-design-incorporates-bol.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I recently received an email from Google announcing that Gemini now has its own dedicated iOS app—no longer bundled within the Google Search app.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/02/image.png" class="kg-image" alt="" loading="lazy" width="1324" height="1382" srcset="https://danielraffel.me/content/images/size/w600/2025/02/image.png 600w, https://danielraffel.me/content/images/size/w1000/2025/02/image.png 1000w, https://danielraffel.me/content/images/2025/02/image.png 1324w" sizes="(min-width: 720px) 720px"></figure><p>Having managed many iOS apps at Google and worked closely with representatives at Apple on App Store releases, I’ve seen firsthand the benefits of a diverse app portfolio. Standalone apps allow Google to:</p><ul><li>Deliver tailored user experiences for specific audiences that rival top-tier competitors.</li><li>Target specific App Store keywords and rank highly in critical product categories.</li><li>Simultaneously secure promotional App Store opportunities for multiple Google products.</li><li>Gain insights into industry trends, revenue, and download metrics by analyzing the performance of multiple Google apps across a broad spectrum of top-ranking apps, including direct competitors and emerging players.</li></ul><p>As an outsider, it can be difficult to understand why a company like Google would occasionally adopt such a fragmented product strategy. For example, why did they “<a href="https://donorem.medium.com/shipping-the-org-chart-3319181be9bd?ref=danielraffel.me" rel="noreferrer">ship their org chart</a>” by creating multiple entry points for Google Assistant—offering it as a standalone iOS app, within the Google Home app, and as a settings option in the Google app—despite the customer confusion, internal competition, and operational inefficiencies this created? 
Often, the pressure to launch quickly, combined with internal politics, forces teams to make short-term compromises with the expectation of streamlining later. In the long run, maintaining a clear and focused product experience benefits both the people who use it and the people who build it. I see this app separation as a sign that Google is evolving and applying lessons from past missteps—a positive indicator for the future!</p><p>Removing Gemini from the Google Search app is a smart move. It not only differentiates the two experiences but also gives the Gemini team room to innovate and develop its own revenue stream. Bundling Gemini within the Google Search app was clearly the fastest path to market; now, this separation paves the way for more strategic development. Not to mention, the ChatGPT app on iOS and macOS is outstanding and constantly getting better. If Gemini wants to compete, it will need to step up massively.</p><p>Kudos to everyone at Google who contributed to this thoughtful—<strong>if obvious</strong>—internal decision to break up the iOS Google Search and Gemini apps sooner rather than later.</p><p>Personally, I hope that as Gemini and Search continue to experiment together, they address the quality of their integrations. The current AI-generated overviews continue to fall short of the high standards I associate with Google Search, and improving these will be key to maintaining trust and credibility.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/02/image-1.png" class="kg-image" alt="" loading="lazy" width="1374" height="1164" srcset="https://danielraffel.me/content/images/size/w600/2025/02/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2025/02/image-1.png 1000w, https://danielraffel.me/content/images/2025/02/image-1.png 1374w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Got Developer Mode to Show Up on watchOS—and Fixed Xcode watchOS Discovery Issues ]]></title>
        <description><![CDATA[ If Developer Mode doesn’t appear on your Apple Watch, try physically connecting your iPhone to your Mac and opening Xcode. And if your Mac and Xcode can’t find your Watch, a full reset and reinstall of Xcode may be necessary. ]]></description>
        <link>https://danielraffel.me/til/2025/02/01/how-i-got-developer-mode-to-appear-on-watchos-when-it-was-missing/</link>
        <guid isPermaLink="false">679e86da5248ce034b8985df</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sat, 01 Feb 2025 12:55:48 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/02/DALL-E-2025-02-01-12.51.14---A-minimalist-and-bold-illustration-in-the-style-of-Paul-Rand.-The-image-should-feature-a-stylized-Apple-Watch-with-a-missing--Developer-Mode--toggle---fe8450e2d8b9bd6a.png" medium="image"/>
        <content:encoded><![CDATA[ <p><strong>TL;DR</strong>: I’ve run into several issues developing with my Apple Watch in Xcode, and here are a few fixes that helped. If&nbsp;<strong>Developer Mode</strong>&nbsp;doesn’t appear on your Apple Watch, try physically connecting your iPhone to your Mac and opening Xcode. Afterwards, if Xcode still can’t detect your Watch, you may need to&nbsp;<strong>reset and reinstall Xcode</strong>.&nbsp;On yet another Mac setup, I was only able to get my&nbsp;<strong>Apple Watch to appear in Xcode by enabling a personal hotspot</strong>—see Fix #3.</p><blockquote>9/25 Update: I strongly recommend Fix #3—it’s consistently worked across multiple Apple Watches.</blockquote><hr><h4 id="%F0%9F%94%A7-fix-1-how-to-enable-developer-mode-on-watchos-when-it-doesn%E2%80%99t-appear">🔧&nbsp;<strong>Fix #1: </strong>How to Enable Developer Mode on watchOS When It Doesn’t Appear</h4><p>I was working on a simple app for my Apple Watch and needed to enable Developer Mode to build and run it on my device. However, despite repeatedly checking&nbsp;<strong>Settings &gt; Privacy &amp; Security</strong>, the&nbsp;<em>Developer Mode</em>&nbsp;option was nowhere to be found. 
After experimenting with a few solutions, I finally discovered a fix:</p><ol><li><strong>Connect the iPhone to a Mac with a physical USB cable.</strong></li><li><strong>Open Xcode and go to&nbsp;<em>Manage Run Destinations</em>.</strong></li><li>The watchOS device should appear—follow the prompts on the Watch to trust the computer.</li><li>Now go to&nbsp;<strong>Privacy &amp; Security</strong>&nbsp;on your Apple Watch—<em>Developer Mode</em>&nbsp;should appear.</li><li>Enable it, then&nbsp;<strong>restart your watch</strong>!</li></ol><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/02/incoming-36131EA7-1B72-4527-B929-1B48D56591F9-c4acc4d3ed56fa24.png" class="kg-image" alt="" loading="lazy" width="368" height="448"><figcaption><span style="white-space: pre-wrap;">Bit annoying to make this appear when it's missing!</span></figcaption></figure><hr><h4 id="%F0%9F%A7%BC-fix-2-reset-xcode-when-it-can%E2%80%99t-detect-your-apple-watch">🧼 Fix #2: Reset Xcode When It Can’t Detect Your Apple Watch</h4><p>After wiping and reinstalling macOS Sequoia on a Mac where I had previously used Xcode, I ran into a new issue—Xcode could no longer detect my Apple Watch, even when it was connected via a USB cable. This fix required fully resetting Xcode-related files and reinstalling from scratch. </p><p>⚠️&nbsp;<strong>Heads up!</strong></p><p>This process is&nbsp;<em>destructive</em>. It wipes Xcode-related caches, simulators, settings, logs, and preferences. While it has helped resolve stubborn issues (e.g.
simulator/device pairing), it’s time-consuming and doesn’t always guarantee a fix.&nbsp;<strong>Only do this if other troubleshooting hasn’t worked.</strong></p><p>🔐&nbsp;<strong>Before You Start:</strong></p><ul><li>Back up any custom device profiles, provisioning profiles, and simulator data you want to preserve.</li><li>Save/export settings and any unsaved changes in your Xcode projects.</li></ul><p><strong>💡 Should you decide to proceed, here are the steps I followed:</strong></p><h4 id="1-reset-simulators-and-derived-data">1. Reset simulators and derived data</h4><pre><code class="language-bash"># Erase all simulators
xcrun simctl erase all

# Remove derived data
rm -rf ~/Library/Developer/Xcode/DerivedData</code></pre><h4 id="2-delete-xcode-and-support-files">2. Delete Xcode and support files</h4><pre><code class="language-bash"># Remove Xcode
sudo rm -rf /Applications/Xcode.app

# Remove all Xcode developer settings and caches
rm -rf ~/Library/Developer
rm -rf ~/Library/Caches/com.apple.dt.Xcode
rm -rf ~/Library/Application\ Support/Xcode
rm -rf ~/Library/Preferences/com.apple.dt.Xcode.plist
rm -rf ~/Library/Preferences/com.apple.dt.xcodebuild.plist
rm -rf ~/Library/Logs/DiagnosticReports/Xcode_*

# Delete command line tools and reset developer directory
sudo rm -rf /Library/Developer/CommandLineTools
sudo xcode-select --reset</code></pre><blockquote>⚠️&nbsp;<strong>Note:</strong>&nbsp;This clears all Xcode preferences, logs, and caches. Your actual project files are&nbsp;<strong>not</strong>&nbsp;affected but you'll need to re-download Xcode and sign back in to your Apple Developer account. Ugh.</blockquote><h4 id="3-reinstall-xcode-and-command-line-tools"><strong>3. Reinstall Xcode and Command Line Tools</strong></h4><ul><li><strong>Download Xcode</strong>&nbsp;from the&nbsp;<strong>Mac App Store</strong>, or use the&nbsp;<a href="https://developer.apple.com/download/all/?ref=danielraffel.me">Apple Developer site</a>&nbsp;if you need a specific version.</li></ul><p>After installation, re-install Command Line Tools in the terminal:</p><pre><code class="language-bash">xcode-select --install</code></pre><h4 id="4-open-xcode-and-re-pair-devices"><strong>4. Open Xcode and Re-Pair Devices</strong></h4><p>After Xcode is installed and launched:</p><ul><li>Open&nbsp;<strong>Window → Devices and Simulators</strong></li><li>Plug in your&nbsp;<strong>iPhone and Apple Watch</strong>&nbsp;via USB</li><li>Your Watch should <strong><em>hopefully</em></strong> now appear and pair successfully</li></ul><hr><h4 id="%F0%9F%8C%90-fix-3-when-xcode-can%E2%80%99t-find-your-apple-watch%E2%80%94try-a-hotspot"><strong>🌐 Fix #3: When Xcode Can’t Find Your Apple Watch—Try a Hotspot</strong></h4><p>On one macOS setup,&nbsp;<strong>none of the previous fixes worked</strong>—Xcode still wouldn’t detect my Apple Watch. 
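</p><p>Before trying the hotspot, one quick check is to confirm from the terminal which devices Xcode’s tooling can currently see (this assumes the Xcode command-line tools are installed):</p><pre><code class="language-bash"># List every device and simulator visible to Xcode's instrumentation tools.
# If your Apple Watch isn't in this list, Xcode won't offer it as a run destination.
if command -v xcrun >/dev/null 2>&1; then
  xcrun xctrace list devices
else
  echo "xcrun not found - install the Xcode command-line tools first"
fi</code></pre><p>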
Here’s what finally did the trick:</p><ol><li><strong>Enable Personal Hotspot</strong>&nbsp;on your iPhone.</li><li><strong>Connect your Mac and Watch to the hotspot</strong>&nbsp;via Wi-Fi.</li><li><strong>Physically connect both the iPhone and Apple Watch</strong>&nbsp;to your Mac using USB cables.</li><li>Open&nbsp;<strong>Xcode &gt; Devices and Simulators</strong>—and if you’re lucky, your Apple Watch will finally appear for setup.</li></ol><blockquote><strong>Note:</strong>&nbsp;I'm speculating, but this may be helpful if you’re on a <strong>mesh network</strong>&nbsp;(e.g., Eero, Nest, Orbi) with lots of devices, where&nbsp;<strong>Bonjour and device discovery can be flaky</strong>. The hotspot setup helps by isolating traffic to just the essential devices, eliminating potential network noise or routing issues that can interfere with device pairing in Xcode.</blockquote> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ GGB Weather App for iOS ]]></title>
        <description><![CDATA[ I built a simple iOS app that pulls a live webcam image of the Golden Gate Bridge, provides real-time weather conditions, and suggests the best times to cross based on the day&#39;s weather. While I haven&#39;t published it to the App Store, the code is open-source for anyone who wants to compile it. ]]></description>
        <link>https://danielraffel.me/2025/02/01/ggb-weather-app-for-ios/</link>
        <guid isPermaLink="false">679d661b5248ce034b898563</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 31 Jan 2025 16:32:23 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/02/DALL-E-2025-01-31-16.25.43---A-Paul-Rand-inspired-illustration-of-a-cyclist-riding-across-the-Golden-Gate-Bridge.-The-artwork-features-bold-geometric-shapes--clean-lines--and-a-mi-6d452e626ffed559.png" medium="image"/>
        <content:encoded><![CDATA[ <p>In a <a href="https://danielraffel.me/2024/10/22/plan-your-golden-gate-bridge-crossing-ideal-for-cyclists-runners-and-walkers/" rel="noreferrer">previous post</a>, I mentioned that I frequently bike across the Golden Gate Bridge. The ride from my home to the bridge takes about 35 minutes, and after completing a loop on the other side, I typically cross back about 40 minutes later. The weather in my neighborhood often differs significantly from the bridge, and I've learned the hard way that conditions can change dramatically while I’m on the other side. After a few rough rides due to being underdressed for my second crossing, I got tired of checking multiple data sources to gauge real-time conditions and decide what to wear.</p><p>This frustration led me to build a <a href="http://www.generouscorp.com/ggb/?ref=danielraffel.me" rel="noreferrer">simple website</a> that pulls a live webcam image of the bridge and allows me to configure my estimated arrival times for both crossings to get weather forecasts that allow me to dress accordingly. After creating the site, I thought it would be even nicer to have a small widget on my phone, so I <a href="https://github.com/danielraffel/ggb-weather-swift?ref=danielraffel.me" rel="noreferrer">decided to port it to Swift</a> as a learning exercise. Given the niche audience for an app like this, I’m unsure whether I'll ever publish it on the App Store, but I’ve shared the code so others can compile it if they’d like.</p><p>For now, I have a handy widget on my phone that lets me quickly check a recent snapshot of the Golden Gate Bridge, along with current weather conditions and some inspiration for the two best times to cross today. 
I also got to learn a bit about Swift.</p><figure class="kg-card kg-video-card kg-width-regular" data-kg-thumbnail="https://danielraffel.me/content/media/2025/02/Simulator-Screen-Recording---iPhone-16---2025-01-31-at-16.21.53-bf9b8d90b084f831_thumb-c76c9af46eea728b.jpg" data-kg-custom-thumbnail="">
            <div class="kg-video-container">
                <video src="https://danielraffel.me/content/media/2025/02/Simulator-Screen-Recording---iPhone-16---2025-01-31-at-16.21.53-bf9b8d90b084f831.mp4" poster="https://img.spacergif.org/v1/1178x2556/0a/spacer.png" width="1178" height="2556" loop="" autoplay="" muted="" playsinline="" preload="metadata" style="background: transparent url('https://danielraffel.me/content/media/2025/02/Simulator-Screen-Recording---iPhone-16---2025-01-31-at-16.21.53-bf9b8d90b084f831_thumb-c76c9af46eea728b.jpg') 50% 50% / cover no-repeat;"></video>
                <div class="kg-video-overlay">
                    <button class="kg-video-large-play-icon" aria-label="Play video">
                        <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">
                            <path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"></path>
                        </svg>
                    </button>
                </div>
                <div class="kg-video-player-container kg-video-hide">
                    <div class="kg-video-player">
                        <button class="kg-video-play-icon" aria-label="Play video">
                            <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">
                                <path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"></path>
                            </svg>
                        </button>
                        <button class="kg-video-pause-icon kg-video-hide" aria-label="Pause video">
                            <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">
                                <rect x="3" y="1" width="7" height="22" rx="1.5" ry="1.5"></rect>
                                <rect x="14" y="1" width="7" height="22" rx="1.5" ry="1.5"></rect>
                            </svg>
                        </button>
                        <span class="kg-video-current-time">0:00</span>
                        <div class="kg-video-time">
                            /<span class="kg-video-duration">0:12</span>
                        </div>
                        <input type="range" class="kg-video-seek-slider" max="100" value="0">
                        <button class="kg-video-playback-rate" aria-label="Adjust playback speed">1×</button>
                        <button class="kg-video-unmute-icon" aria-label="Unmute">
                            <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">
                                <path d="M15.189 2.021a9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h1.794a.249.249 0 0 1 .221.133 9.73 9.73 0 0 0 7.924 4.85h.06a1 1 0 0 0 1-1V3.02a1 1 0 0 0-1.06-.998Z"></path>
                            </svg>
                        </button>
                        <button class="kg-video-mute-icon kg-video-hide" aria-label="Mute">
                            <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">
                                <path d="M16.177 4.3a.248.248 0 0 0 .073-.176v-1.1a1 1 0 0 0-1.061-1 9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h.114a.251.251 0 0 0 .177-.073ZM23.707 1.706A1 1 0 0 0 22.293.292l-22 22a1 1 0 0 0 0 1.414l.009.009a1 1 0 0 0 1.405-.009l6.63-6.631A.251.251 0 0 1 8.515 17a.245.245 0 0 1 .177.075 10.081 10.081 0 0 0 6.5 2.92 1 1 0 0 0 1.061-1V9.266a.247.247 0 0 1 .073-.176Z"></path>
                            </svg>
                        </button>
                        <input type="range" class="kg-video-volume-slider" max="100" value="100">
                    </div>
                </div>
            </div>
            
        </figure><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/danielraffel/ggb-weather-swift?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - danielraffel/ggb-weather-swift</div><div class="kg-bookmark-description">Contribute to danielraffel/ggb-weather-swift development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/pinned-octocat-093da3e6fa40-3.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">danielraffel</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/ggb-weather-swift" alt="" onerror="this.style.display = 'none'"></div></a></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How Disabling Parts of a Neural Network May Reveal the Secrets Behind Biased Outputs ]]></title>
        <description><![CDATA[ A friend recently became intrigued by claims that some impressive new AI models have been sabotaged by government propaganda, so he began exploring ways to reverse that influence. In his search, he encountered the concepts of model ablation and mechanistic interpretability. ]]></description>
        <link>https://danielraffel.me/til/2025/01/31/how-disabling-parts-of-a-neural-network-may-reveal-the-secrets-behind-biased-outputs/</link>
        <guid isPermaLink="false">679d305a4d82730349724edf</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 31 Jan 2025 12:47:40 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/01/DALL-E-2025-01-31-12.45.33---A-minimalist--bold-graphic-inspired-by-Paul-Rand--using-geometric-shapes-and-a-clean--structured-composition.-The-design-employs-a-limited-color-palet.png" medium="image"/>
        <content:encoded><![CDATA[ <p><strong>Update 5/27/25:</strong> <a href="https://transformer-circuits.pub/2025/attribution-graphs/methods.html?ref=danielraffel.me" rel="noreferrer">Anthropic</a> has developed methods to better understand and visualize the internal workings of large language models, akin to how biologists use microscopes to study cellular structures. They've created tools that map out the 'circuitry' inside models, revealing how specific computational features interact to generate outputs. This breakthrough enhances our ability to assess and improve the models' reliability and transparency in practical applications.</p><hr><p>A friend recently became intrigued by claims that some impressive new AI models have been sabotaged by government propaganda, so he began exploring ways to reverse that influence. In his search, he encountered the concepts of <a href="https://en.wikipedia.org/wiki/Ablation_(artificial_intelligence)?ref=danielraffel.me" rel="noreferrer">model ablation</a> and <a href="https://en.wikipedia.org/wiki/Explainable_artificial_intelligence?ref=danielraffel.me#Interpretability" rel="noreferrer">mechanistic interpretability</a>—techniques used to understand and analyze the inner workings of machine learning models, particularly complex ones like deep neural networks. Since I wasn't familiar with these concepts and don't specialize in this field, I looked them up. 
Here's what I learned:</p><hr><h3 id="model-ablation">Model Ablation</h3><p>Model ablation is the process of systematically removing or disabling parts of a model (such as <a href="https://en.wikipedia.org/wiki/Neural_network_(machine_learning)?ref=danielraffel.me" rel="noreferrer">neurons</a>, <a href="https://en.wikipedia.org/wiki/Layer_(deep_learning)?ref=danielraffel.me" rel="noreferrer">layers</a>, <a href="https://machine-learning.paperspace.com/wiki/weights-and-biases?ref=danielraffel.me" rel="noreferrer">weights</a>, or even entire components) to observe how these changes affect the model’s performance or behavior. The goal is to identify which parts are critical for certain tasks or functionalities.</p><p><strong>Key Points:</strong></p><ul><li><strong>Understanding Component Importance:</strong><br>By "ablating" (i.e., turning off or removing) specific components, researchers can infer the role and importance of those components in the overall model performance. For example, if removing a particular layer significantly degrades performance on a task, that layer is likely crucial.</li><li><strong>Identifying Redundancies:</strong><br>Ablation studies can reveal if some parts of the model are redundant or if there are alternative pathways in the network that can compensate for the loss of certain components.</li><li><strong>Practical Applications:</strong><ul><li><strong>Model Pruning:</strong> Ablation methods can lead to more efficient models by identifying and removing unnecessary components, which is especially useful for deploying models on resource-constrained devices.</li><li><strong>Debugging and Optimization:</strong> Helps in diagnosing why a model might be failing or underperforming on certain tasks by pinpointing which parts of the network contribute most to errors.</li></ul></li><li><strong>Methodology:</strong><br>Researchers typically conduct ablation studies by modifying the model’s architecture or parameters in controlled experiments and 
then measuring changes in performance, activation patterns, or other metrics.</li></ul><p><strong>Hypothetical Application for Removing Propaganda:</strong><br>Imagine an open source AI model that, allegedly, has been influenced to output propaganda. A researcher might:</p><ul><li><strong>Systematically Disable Components:</strong><br>Run controlled ablation experiments where specific layers or neurons are temporarily disabled.</li><li><strong>Evaluate Output Changes:</strong><br>Examine whether the removal of certain components reduces or eliminates outputs that resemble propagandistic language or biases.</li><li><strong>Isolate Critical Propaganda Circuits:</strong><br>Identify parts of the network that, when disabled, lead to a significant drop in propaganda-like responses. This would provide clues about which parts of the model are responsible for incorporating such bias.</li></ul><p>This process is highly iterative and requires careful testing. It might reveal that a particular layer or set of neurons is over-representing certain political narratives. Once these are identified, one might consider permanently modifying or “pruning” these components to mitigate the undesired influence.</p><hr><h3 id="mechanistic-interpretability">Mechanistic Interpretability</h3><p>Mechanistic interpretability is an approach aimed at understanding the internal mechanisms and computations of a machine learning model at a granular, algorithmic level. Instead of treating the model as a "black box," this approach seeks to reverse-engineer the network’s inner workings to explain how it processes information and makes decisions.</p><p><strong>Key Points:</strong></p><ul><li><strong>Dissecting the "Algorithm":</strong><br>The goal is to identify and describe the specific computations, circuits, or pathways within the network that lead to particular outputs. 
This might involve mapping out how information flows through the layers, how neurons interact, and what kind of operations they perform.</li><li><strong>Human-Understandable Explanations:</strong><br>Unlike high-level statistical analyses, mechanistic interpretability strives for explanations that are understandable in human terms. For example, researchers might explain that a certain network circuit acts similarly to a logical "if-then" statement or performs a specific type of pattern matching.</li><li><strong>Research Directions and Examples:</strong><ul><li><strong>Circuit Analysis:</strong> Efforts in mechanistic interpretability often involve identifying “circuits” within neural networks that are responsible for certain behaviors, such as recognizing objects in images or processing language.</li><li><strong>Layer and Neuron Analysis:</strong> Detailed studies might focus on the role of individual neurons or groups of neurons, tracking how changes in their activation relate to the model's overall output.</li></ul></li><li><strong>Benefits:</strong><ul><li><strong>Trust and Reliability:</strong> A clearer understanding of how models work can help build trust in their decisions, which is especially important in high-stakes applications like healthcare or finance.</li><li><strong>Safety and Robustness:</strong> By knowing the internal mechanisms, researchers can better anticipate and mitigate failure modes or vulnerabilities (such as adversarial attacks).</li></ul></li></ul><p><strong>Hypothetical Application for Removing Propaganda:</strong><br>Using mechanistic interpretability, a researcher could:</p><ul><li><strong>Map the Information Flow:</strong><br>Dive into the model’s internal circuits to identify which parts of the network are responsible for processing politically charged or propagandistic content.</li><li><strong>Trace Specific Computations:</strong><br>Analyze how specific inputs trigger outputs that seem to carry propaganda. 
For instance, if certain word patterns or phrases consistently lead to biased responses, the researcher might trace these back to particular neurons or pathways.</li><li><strong>Develop a Modification Strategy:</strong><br>With a detailed map of the computations, the researcher can propose targeted modifications. These might include:<ul><li><strong>Re-calibrating the Weights:</strong> Adjusting the influence of neurons that contribute to propagandistic outcomes.</li><li><strong>Altering Activation Functions:</strong> Tweaking how these neurons activate in response to certain inputs.</li><li><strong>Implementing Filters or Safeguards:</strong> Adding components that specifically detect and neutralize propagandistic signals before they affect the final output.</li></ul></li></ul><p>This approach, while theoretically promising, is <strong>exceptionally challenging</strong>. It requires a deep understanding of the model’s architecture and behavior, as well as a rigorous validation process to ensure that modifications do not inadvertently impair the model’s overall performance or introduce new biases.</p><hr><h3 id="how-they-relate-and-a-hypothetical-workflow">How They Relate and a Hypothetical Workflow</h3><p>While model ablation is often used as an experimental tool to understand which parts of a model are necessary for its performance, mechanistic interpretability is a broader effort to map out and explain the inner workings of the model in detail. 
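</p><p>To make the ablation side concrete, here is a minimal, hypothetical sketch (a toy two-layer NumPy network, not any real model) that disables one hidden unit at a time and records how far the output moves from the baseline:</p><pre><code class="language-python">import numpy as np

def ablation_study(W1, W2, x):
    """Zero out one hidden unit at a time and measure how much the
    network's output shifts -- a toy version of model ablation."""
    def forward(W1, W2, x):
        h = np.maximum(0, W1 @ x)  # ReLU hidden layer
        return W2 @ h
    baseline = forward(W1, W2, x)
    impacts = []
    for i in range(W1.shape[0]):
        W1_ablated = W1.copy()
        W1_ablated[i, :] = 0.0     # "ablate" hidden unit i
        delta = np.abs(forward(W1_ablated, W2, x) - baseline).sum()
        impacts.append(float(delta))
    return impacts  # larger delta => unit i matters more for this input

# Tiny hand-built network: unit 0 carries all the signal, unit 1 is dead weight.
W1 = np.array([[1.0, 1.0], [0.0, 0.0]])
W2 = np.array([[1.0, 1.0]])
x = np.array([1.0, 2.0])
print(ablation_study(W1, W2, x))  # [3.0, 0.0]</code></pre><p>A real study would run the same loop over many inputs and score a task metric (for example, how often outputs resemble the unwanted content) rather than a single forward pass.</p><p>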
Together, these approaches could form a hypothetical workflow for removing unwanted propaganda influences:</p><ol><li><strong>Initial Diagnosis with Ablation:</strong><ul><li>Systematically disable various components of the AI model.</li><li>Monitor changes in outputs, especially those suspected of carrying propaganda.</li><li>Identify which parts, when removed, result in a significant reduction of bias or propagandistic content.</li></ul></li><li><strong>Deep Dive with Mechanistic Interpretability:</strong><ul><li>Analyze the identified components to understand their specific functions and interactions.</li><li>Map out the neural circuits and computations that contribute to the propagation of biased outputs.</li><li>Develop targeted strategies (e.g., weight adjustments, architectural changes) to mitigate or remove the undesired influence.</li></ul></li><li><strong>Validation and Iteration:</strong><ul><li>Validate the modified model against a wide range of inputs to ensure that the removal of propaganda does not compromise the model’s ability to perform its intended tasks.</li><li>Iterate on the modifications, continuously refining the approach based on performance metrics and further interpretability studies.</li></ul></li></ol><hr><h3 id="final-thoughts">Final Thoughts</h3><p>It’s important to note that this entire process is hypothetical and fraught with challenges. Removing ideological bias or propaganda from an AI model is not as straightforward as simply “turning off” a few neurons. The interdependencies within deep neural networks mean that changes in one part of the model can have unforeseen effects on others. 
Moreover, the definitions of “propaganda” or “bias” are subjective, and any modifications must be carefully balanced against the risk of degrading the overall performance or introducing new biases.</p><p>While model ablation and mechanistic interpretability offer promising avenues for understanding and potentially modifying AI models, applying these techniques to remove something as complex and nuanced as propaganda requires not only technical expertise but also considerable care to ensure the integrity and reliability of the model remain intact.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Why Some Installers Say No to Solar—And Why They Might Be Wrong ]]></title>
        <description><![CDATA[ Some homes labeled &quot;unsuitable&quot; for solar could still benefit from it—often with more affordable equipment and creative system design. Consider getting a quote from an installer who can better accommodate your home&#39;s unique conditions. ]]></description>
        <link>https://danielraffel.me/2025/01/31/why-some-installers-say-no-to-solar-and-why-they-might-be-wrong/</link>
        <guid isPermaLink="false">679d15354d82730349724e1e</guid>
        <category><![CDATA[ 🤔 Thinking about ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 31 Jan 2025 11:54:48 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/01/DALL-E-2025-01-31-11.53.38---A-bold--minimalist-illustration-in-the-style-of-Paul-Rand--featuring-a-single-solid-yellow-stylized-sun-with-geometric-rays-shining-over-a-house-with-.png" medium="image"/>
        <content:encoded><![CDATA[ <p><strong>TL;DR:</strong> Some homes labeled "unsuitable" for solar could still benefit from it—often with more affordable equipment and creative system design. Consider getting a quote from an installer who can better accommodate your home's unique conditions.</p><hr><p>It took me over five years to finally commit to installing solar on my house. Throughout my investigations, I had some unusual experiences. One company, in particular, visited multiple times over several years. Initially, I reached out because I was interested in a new type of solar panel they had developed. They came early in the product’s lifecycle to assess whether my home was a good candidate, but then I didn’t hear from them for nearly a year. After I contacted them regularly, they finally got back to me and said my home wasn’t suitable for their technology.</p><p>A few years later, I contacted them again to explore their more traditional solar panels and received a quote—but the cost of their battery and PV system was extremely high, and the projected payback period was too long to justify the investment. A few more years passed, and I received another quote from a <a href="https://potreroenergy.com/?ref=danielraffel.me" rel="noreferrer">different local installer</a>. This time, the payback period was a much more reasonable 3-4 years. I decided to move forward, especially since federal incentives and California’s <a href="https://www.cpuc.ca.gov/industries-and-topics/electrical-energy/demand-side-management/self-generation-incentive-program/participating-in-self-generation-incentive-program-sgip?ref=danielraffel.me" rel="noreferrer">SGIP battery rebate</a> made the financials more attractive. The SGIP rebate has since been capped but is still valuable.</p><p>The reason I wanted to write this post is that I’ve heard from several people exploring solar who were told by installers that their home wasn’t a good fit. 
However, many of them live in sunny parts of the Bay Area, where shading and overcast conditions likely aren’t significant issues given modern PV efficiency gains. Even with partial shading, a <a href="https://www.youtube.com/watch?v=JjJ3v_HPPiY&ref=danielraffel.me" rel="noreferrer">PV system configured in parallel</a> using <a href="https://www.youtube.com/watch?v=Uj0GodUF2Ys&ref=danielraffel.me" rel="noreferrer">bifacial</a>, <a href="https://www.youtube.com/watch?v=otovXbs_qZs&ref=danielraffel.me" rel="noreferrer">half-cell design panels</a> can reduce power loss. Adding microinverters or power optimizers can further enhance efficiency.</p><p>On overcast days, my system still generates a useful amount of power, though output can fall below 20% of peak capacity. Fortunately, those days aren’t frequent, and even on overcast days it's usually enough to cover most of my daytime energy needs. More importantly, with a large enough battery backup, I can bank excess energy on sunnier days to offset the dips during cloudier periods. Since overcast conditions where I live are sporadic and tree shading is often manageable, I suspect some installers may be overestimating their impact.</p><p>This raises a key question: Why do some installers discourage solar installations? Don’t they want to make money? </p><p>I suspect the issue comes down to maximizing profit and setting a minimum project cost to justify their time. For example, if an installer’s minimum viable project is $45,000 to $50,000 in an affluent area, they may simply walk away from anything smaller. There are costs to continuing to engage with a potential customer who might realize the deal isn’t worthwhile, so it’s easier for them to cut things off early. If the estimated payback period exceeds 10+ years, it becomes a tough sell. 
Most solar quotes outline costs and payback periods, and rather than presenting an unappealing financial case, my assumption is that some installers might simply say, “Your home isn’t a strong candidate,” citing tree cover or another factor as an easy explanation. Ideally, those who do that would be upfront and admit, “Our pricing is on the higher end; you might find a better deal elsewhere,” but that could come across as acknowledging they’re overcharging. While some installers may prioritize their margins or simply have higher operating costs, many others are truly committed to helping people adopt solar and are driven to find cost-effective solutions that maximize system performance. These are the providers worth seeking out.</p><p>If an installer has told you that your home isn’t a good fit for solar, it’s worth considering another opinion from one that uses more cost-effective technology, such as inverters and batteries from younger companies like <a href="https://eg4electronics.com/?ref=danielraffel.me" rel="noreferrer">EG4</a>, which offers high-performance hardware at a fraction of the cost of more established, premium brands. The market is full of affordable PV panel manufacturers. A more cost-effective installer may be able to cut the cost in half without sacrificing power generation or storage capacity. </p><p>Smaller solar installers aren’t cutting prices out of desperation—they can simply afford to offer lower costs. Advances in solar and battery technology have made systems far more efficient and affordable, allowing smaller businesses to pass those savings on to customers. In California, although some homes have major challenges that make solar impractical (such as heavy shading or unsuitable roof angles), they are the exception rather than the norm. The real question is cost: How long will it take for the system to pay for itself? 
Higher-cost systems naturally result in longer payback periods, making the investment harder to justify—especially if they don’t generate enough solar energy to store in batteries.</p><p>With energy costs in California expected to keep rising, now may be the best time to explore a whole-home backup system—one that can power your entire house during an outage, increase energy independence, and reduce long-term utility costs. The grid is becoming more strained and less reliable, making a well-designed solar-plus-battery system a smart investment for both savings and resilience. Plus, federal and state rebates won’t be around forever, so taking advantage of incentives now could significantly improve your financial return.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Fire Roasted Salsa ]]></title>
        <description><![CDATA[ There’s a Mexican restaurant in San Francisco that I really love, and they serve a fire roasted salsa that I find absolutely delicious. Without mentioning any names—since this version is surprisingly faithful to the original—I thought I’d share a recipe for a variation I make at home. ]]></description>
        <link>https://danielraffel.me/2025/01/26/fire-roasted-salsa/</link>
        <guid isPermaLink="false">6796c096c1e4ee0347a14e16</guid>
        <category><![CDATA[ 👨‍🍳 Cooking ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sun, 26 Jan 2025 15:26:45 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/01/35F51815-7A22-4F64-A94C-2AB8EA6B6864.png" medium="image"/>
        <content:encoded><![CDATA[ <p>There’s a Mexican restaurant in San Francisco that I really love, and they serve a fire roasted salsa that I find absolutely delicious. Without mentioning any names—since this version is surprisingly faithful to the original—I thought I’d share a recipe for a variation I make at home.</p><p><strong>Ingredients</strong></p><ul><li>2x 28oz cans fire roasted tomatoes</li><li>2 bunches green onions (~1c chopped)</li><li>2-3 heads of garlic</li><li>1 dried pasilla chili (deseeded and roughly chopped)</li><li>15-20 medium dried chiles de árbol (deseeded and roughly chopped)</li><li>1 guajillo chili (deseeded and roughly chopped)</li><li>2 bunches cilantro (~1c packed, removed from stems)</li><li>1/8c canola oil</li><li>1 tbsp kosher salt</li></ul><p><strong>Steps</strong></p><ol><li>Coarsely chop the onions into big chunks. Leave the garlic cloves unpeeled. </li><li>Pan roast the onions and garlic with no oil for 7-10 minutes until they are slightly blackened.</li><li>When done, transfer the onions and garlic to a bowl or plate. I find it’s easier to separate them. Remove the skin and cut the stem from the garlic cloves. Set both aside.</li><li>In a large saucepan, add the tomatoes and deseeded chilis. 
Simmer for 10 minutes.</li><li>Add the chopped cilantro, garlic and onions and simmer for 1 minute.</li><li>Remove from heat and blend in a Vitamix or other high powered mixer until mostly smooth.</li><li>Drizzle oil in to emulsify.</li><li>Add salt to taste.</li><li>Chill and consume the next day.</li></ol><p>Note: Although it might be tempting to add a little water while sautéing or blending, I’ve found it’s usually unnecessary and can easily make the salsa too thin.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/IMG_5561.jpeg" class="kg-image" alt="" loading="lazy" width="2000" height="2667" srcset="https://danielraffel.me/content/images/size/w600/2025/01/IMG_5561.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2025/01/IMG_5561.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2025/01/IMG_5561.jpeg 1600w, https://danielraffel.me/content/images/size/w2400/2025/01/IMG_5561.jpeg 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Blackened scallions</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/IMG_5562.jpeg" class="kg-image" alt="" loading="lazy" width="2000" height="2667" srcset="https://danielraffel.me/content/images/size/w600/2025/01/IMG_5562.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2025/01/IMG_5562.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2025/01/IMG_5562.jpeg 1600w, https://danielraffel.me/content/images/size/w2400/2025/01/IMG_5562.jpeg 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Garlic with skins removed</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/IMG_5563.jpeg" class="kg-image" alt="" loading="lazy" width="2000" height="2667" 
srcset="https://danielraffel.me/content/images/size/w600/2025/01/IMG_5563.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2025/01/IMG_5563.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2025/01/IMG_5563.jpeg 1600w, https://danielraffel.me/content/images/size/w2400/2025/01/IMG_5563.jpeg 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Cilantro with minimal stems (else gets a stringy texture and bitter taste)</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/IMG_5564.jpeg" class="kg-image" alt="" loading="lazy" width="2000" height="2667" srcset="https://danielraffel.me/content/images/size/w600/2025/01/IMG_5564.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2025/01/IMG_5564.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2025/01/IMG_5564.jpeg 1600w, https://danielraffel.me/content/images/size/w2400/2025/01/IMG_5564.jpeg 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Simmering tomato and chilis</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/IMG_5565.jpeg" class="kg-image" alt="" loading="lazy" width="2000" height="2667" srcset="https://danielraffel.me/content/images/size/w600/2025/01/IMG_5565.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2025/01/IMG_5565.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2025/01/IMG_5565.jpeg 1600w, https://danielraffel.me/content/images/size/w2400/2025/01/IMG_5565.jpeg 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Simmering tomato, chili, onion, garlic and cilantro</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/IMG_5566.jpeg" class="kg-image" alt="" 
loading="lazy" width="2000" height="1500" srcset="https://danielraffel.me/content/images/size/w600/2025/01/IMG_5566.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2025/01/IMG_5566.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2025/01/IMG_5566.jpeg 1600w, https://danielraffel.me/content/images/size/w2400/2025/01/IMG_5566.jpeg 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Blended and emulsified salsa</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/IMG_5571.jpeg" class="kg-image" alt="" loading="lazy" width="2000" height="2667" srcset="https://danielraffel.me/content/images/size/w600/2025/01/IMG_5571.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2025/01/IMG_5571.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2025/01/IMG_5571.jpeg 1600w, https://danielraffel.me/content/images/size/w2400/2025/01/IMG_5571.jpeg 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Fire roasted salsa in jar</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Thoughts on Seed Oils ]]></title>
        <description><![CDATA[ A friend recently asked for my take on seed oils. To be honest, I’m not a subject area expert and haven’t followed the topic too closely, but based on the research I’ve read they don’t seem harmful. In fact, they might even support heart health when used instead of saturated fats. ]]></description>
        <link>https://danielraffel.me/2025/01/22/thoughts-on-seed-oils/</link>
        <guid isPermaLink="false">679162b1042808034c798aee</guid>
        <category><![CDATA[ 👨‍🍳 Cooking ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 22 Jan 2025 13:51:58 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/01/DALL-E-2025-01-22-13.49.25---An-abstract--minimalist-graphic-design-in-the-style-of-Paul-Rand--featuring-geometric-shapes-and-bold-colors-to-represent-the-concept-of-seed-oils-and.png" medium="image"/>
        <content:encoded><![CDATA[ <p>A friend recently asked for my take on seed oils. To be honest, I’m not a subject area expert and haven’t followed the topic too closely, but based on the research I’ve read—like articles from the <a href="https://www.heart.org/en/news/2024/08/20/theres-no-reason-to-avoid-seed-oils-and-plenty-of-reasons-to-eat-them?ref=danielraffel.me" rel="noreferrer">American Heart Association</a> and <a href="https://www.theveganrd.com/2025/01/seed-oils-in-plant-based-diets/?ref=danielraffel.me" rel="noreferrer">insights from a nutrition professional</a>—they don’t seem harmful. In fact, they might even support heart health when used instead of saturated fats.</p><p>At this point, my perspective is that the concerns surrounding seed oils are mostly exaggerated and fueled by misinformation rather than solid science. I tend to agree with my friend’s take: <em>“Diet is hard to science. Probably diversity of natural substances is the best?”</em></p><p>Avoiding seed oils entirely would be challenging—and honestly, it feels like an unnecessary restriction that narrows my vegan and vegetarian choices. While they are commonly found in highly processed foods, I see that as a separate concern. The evidence on the health risks of <a href="https://www.amazon.com/Ultra-Processed-People-Science-Behind-Food/dp/1324036729/ref=tmm_hrd_swatch_0?_encoding=UTF8&dib_tag=se&dib=eyJ2IjoiMSJ9.C0kTcPaXf83MQfJnj04nv1purzJQOIZhdBcKS2IFv-CeoKAq6nArpFGFl7sR2c7Ulru4MpWZTusBL9j600LT-xk4I4PPn_cXtUemmT6i48HUX9xtj3D4quvCWGjbhb480nDPwH5PPSosQqXAAor5CtkuLQSlNPLFcV06pZVyFI9cuzGLZLNJEE1-y67c3mUb2lft4CkQ-juEE24ONsQseK-MF2hbXplVwAlgxJs8a3g.Szvr5wMwzI4U4L9vtXpYkNndsSQeBdPazWPRHZmOEpw&qid=1737582649&sr=1-1&ref=danielraffel.me" rel="noreferrer">highly processed foods</a> seems solid to me, and I do make an effort to limit them. 
When salad chains like <a href="https://www.theguardian.com/food/2025/jan/21/sweetgreen-seed-oils?ref=danielraffel.me" rel="noreferrer">Sweetgreen switch to a fully “seed oil-free” menu</a>, I get the business rationale given current trends, but it also seems like they’re catering to misconceptions rather than science.</p><p>If I’ve overlooked something significant or new research emerges, I’m open to reconsidering my stance. But for now, avoiding seed oils altogether seems unnecessary.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Migrating Proxmox Containers to a New Machine ]]></title>
        <description><![CDATA[ I recently picked up a Lenovo ThinkCentre M series and installed Proxmox on it. Since I was already running Proxmox on a MacBook Air, I wanted to migrate all my existing containers to the new machine. Here&#39;s how I did it. ]]></description>
        <link>https://danielraffel.me/til/2025/01/21/migrating-proxmox-containers-to-a-new-machine/</link>
        <guid isPermaLink="false">6785d106a3c85f034dfabf29</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 21 Jan 2025 11:52:44 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/01/DALL-E-2025-01-21-11.52.08---Create-an-image-in-the-style-of-Paul-Rand-that-visually-represents-the-process-of-migrating-Proxmox-containers-to-a-new-machine.-The-image-should-incl.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I recently picked up a <a href="https://www.lenovo.com/us/en/d/thinkcentre-m-series-tiny/?orgRef=https%253A%252F%252Fwww.google.com%252F&srsltid=AfmBOoohkYK0_CUTjT8n8fc9gll1oBRuDkGSFBwFjQ_7XUu7rKL9mlq7&ref=danielraffel.me" rel="noreferrer">Lenovo ThinkCentre M series</a> and installed <a href="https://www.proxmox.com/?ref=danielraffel.me" rel="noreferrer">Proxmox</a> on it. Since I was already running Proxmox on a MacBook Air, I wanted to migrate all my existing containers to the new machine. The process turned out to be surprisingly simple, and here’s a step-by-step guide based on how I did it.</p><hr><h4 id="step-1-back-up-containers-on-the-source-machine"><strong>Step 1: Back Up Containers on the Source Machine</strong></h4><p>The first step was to create backups of each container on the source Proxmox machine (my MacBook Air). Proxmox automatically stores these backups in the following directory:</p><pre><code class="language-bash">/var/lib/vz/dump/
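# Backups can also be created from the CLI with vzdump instead of the web UI
# (a sketch; the container ID 101 below is just an example):
#   vzdump 101 --compress zstd --dumpdir /var/lib/vz/dump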
</code></pre><p>You can trigger the backups via the Proxmox web interface by selecting each container and choosing the "Backup" option.</p><p><strong>Tip:</strong> It’s a good idea to regularly back up your containers and store the backups on a remote drive or another machine. While you can set up a dedicated Proxmox Backup Server, I find that using simple image backups works just as well, as long as you stick to a consistent backup routine.</p><p><strong>Note:</strong> If you're looking to also back up and restore your Proxmox host, <a href="https://forum.proxmox.com/threads/official-way-to-backup-proxmox-ve-itself.126469/post-552384?ref=danielraffel.me" rel="noreferrer">this thread</a> could be helpful. There may be custom configurations that can streamline the restoration process.</p><hr><h4 id="step-2-transfer-the-backup-files-to-the-new-proxmox-machine"><strong>Step 2: Transfer the Backup Files to the New Proxmox Machine</strong></h4><p>After backing up the containers, I transferred the backup files to the new Proxmox machine. While SSH'd into the source machine, I used <code>scp</code> to copy the files over:</p><pre><code class="language-bash">scp -r /var/lib/vz/dump/* root@[targetIP]:/var/lib/vz/dump/
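# rsync also works and can resume interrupted transfers:
# rsync -avP /var/lib/vz/dump/ root@[targetIP]:/var/lib/vz/dump/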
</code></pre><p>Make sure to replace <code>[targetIP]</code> with the IP address of your new Proxmox machine.</p><p>This command recursively copies all backup files from the source machine to the new one. Using <code>scp</code> worked great for me, but you can also use other tools like <code>rsync</code> if you prefer incremental transfers.</p><hr><h4 id="step-3-restore-containers-on-the-new-machine"><strong>Step 3: Restore Containers on the New Machine</strong></h4><p>Once the backup files were on the new machine, I opened the Proxmox web interface, navigated to the "Backup" section, and selected "Restore" for each container image. The restore process went smoothly, and within minutes, all my containers were up and running on the new system.</p><hr><h3 id="final-thoughts">Final Thoughts</h3><p>Migrating Proxmox containers to a new machine was much easier than I expected. By simply backing up, transferring, and restoring container images, I was able to replicate my entire setup without any headaches.</p><p>While Proxmox Backup Server is a great option for centralized and automated backups, this simple image-based method works perfectly for me. The key is to back up regularly and store those images on another machine or an external drive to safeguard against hardware failures.</p><p>If you’re thinking about upgrading your Proxmox hardware or just want to be prepared for emergencies, this approach is a straightforward and reliable solution.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Connecting a Pioneer VSX-LX505 to Apple HomeKit Using Homebridge ]]></title>
        <description><![CDATA[ I recently purchased a Pioneer VSX-LX505 AV receiver and wanted to integrate it with Apple HomeKit using Homebridge. Since I couldn’t find clear information online confirming whether this setup was possible, I decided to document the steps that worked for me. ]]></description>
        <link>https://danielraffel.me/til/2025/01/21/connecting-a-pioneer-vsx-lx505-to-apple-homekit-using-homebridge/</link>
        <guid isPermaLink="false">678ebbde42e3a2034d780043</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 21 Jan 2025 11:30:52 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/01/DALL-E-2025-01-20-13.34.12---An-illustration-in-the-style-of-Paul-Rand-depicting-a-modern-Pioneer-VSX-LX505-AV-receiver-connected-to-Apple-HomeKit-through-Homebridge.-The-design-s.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I recently purchased a Pioneer VSX-LX505 AV receiver and wanted to integrate it with Apple HomeKit using Homebridge. My goal was to easily control the amp—powering it on/off, switching inputs, adjusting the volume, and managing playback—all through the Apple Home app. Since I couldn’t find clear information online confirming whether this setup was possible, I decided to document the steps that worked for me. </p><p>The following assumes you are already familiar with Homebridge and have it set up with your Apple Home app. If you're new to Homebridge, you may want to check out Homebridge's official documentation to get started.</p><h2 id="setting-up-the-homebridge-plugin">Setting Up the Homebridge Plugin</h2><p>To accomplish this, I used the <a href="https://github.com/nitaybz/homebridge-onkyo-pioneer?ref=danielraffel.me" rel="noreferrer"><code>homebridge-onkyo-pioneer</code></a> plugin, which provides support for Pioneer and Onkyo receivers. Setting it up was straightforward, with only a few key configurations required:</p><ol><li><strong>Finding the Receiver's IP Address:</strong><br>Since my amp was connected via Ethernet and I hadn’t manually configured the network settings yet, I used a network scanner to discover its IP address. 
It's worth making this IP address static and reserving it on your router to ensure it doesn't change, preventing potential connection issues with Homebridge in the future.</li><li><strong>Configuring the Plugin:</strong><br>After installing the plugin, the main settings required were:<ul><li><strong>IP Address:</strong> Adding the amp’s IP.</li><li><strong>Device Name:</strong> Choosing a friendly name.</li><li><strong>Volume Control Type:</strong> Selecting <code>light bulb</code> as the accessory type for volume control (other options include <code>fan</code>).</li></ul></li></ol><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/01/image-1.png" class="kg-image" alt="" loading="lazy" width="1522" height="678" srcset="https://danielraffel.me/content/images/size/w600/2025/01/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2025/01/image-1.png 1000w, https://danielraffel.me/content/images/2025/01/image-1.png 1522w" sizes="(min-width: 720px) 720px"></figure><h3 id="my-homebridge-configuration">My Homebridge Configuration</h3><p>Here’s the final configuration I used in the plug-in's <code>config.json</code> file:</p><pre><code class="language-json">{
    "statePollingInterval": 30,
    "debug": false,
    "receivers": [
        {
            "ip": "192.168.86.58",
            "name": "VXS-LX505",
            "volumeAccessory": "bulb",
            "maxVolume": 200
        }
    ],
    "platform": "OnkyoPioneer",
    "_bridge": {
        "username": "XX:XX:XX:XX:XX:XX",
        "port": 30209,
        "name": "homebridge-onkyo-pioneer EBCD"
    }
}
</code></pre><p>The key change, which is described below under resolving volume control issues, involved updating the <code>maxVolume</code> setting to 200.</p><hr><h3 id="pairing-with-homekit"><strong>Pairing with HomeKit</strong></h3><p>To optimize performance and ensure a stable connection, I set up the Pioneer receiver as a <strong>child bridge</strong> in Homebridge. In fairness, I have no idea if adding a child bridge for this device actually improves or enables anything. Regardless, the process I took involved two main steps:</p><h4 id="phase-1-adding-the-child-bridge"><strong>Phase 1: Adding the Child Bridge</strong></h4><ol><li>Opened the Apple Home app and tapped <strong>"Add Accessory."</strong></li><li>Used the camera to scan the QR code displayed in the Homebridge plugin interface.</li><li>The child bridge was successfully added to HomeKit, but the receiver itself didn’t appear as a controllable device.</li></ol><h4 id="phase-2-manually-adding-accessories"><strong>Phase 2: Manually Adding Accessories</strong></h4><p>Since the receiver didn’t automatically import any devices, I had to manually add it by following these steps:</p><ol><li>Opened the Apple Home app and tapped <strong>"Add Accessory."</strong></li><li>Selected <strong>"More Options."</strong></li><li>Found the Pioneer receiver listed as two separate accessories—one for <strong>inputs</strong> and another for <strong>power/volume.</strong></li><li>Added both to my HomeKit setup. 
<ol><li>Note: If/when asked, add the following HomeKit code: 031-45-154</li></ol></li></ol><p>Once both accessories were added, the receiver finally appeared in the Home app, allowing me to control it directly from my iPhone.</p><hr><h2 id="resolving-volume-control-issues">Resolving Volume Control Issues</h2><p>After getting everything set up, I noticed an issue with volume control—HomeKit's slider maxed out at <strong>-32 dB</strong>, far below the receiver's true maximum.</p><h3 id="the-fix">The Fix:</h3><p>To resolve this, as described above in the config, I updated the <code>maxVolume</code> setting in my Homebridge configuration from <strong>100 to 200</strong>, then restarted Homebridge. This allowed the Apple Home app to control the full volume range from <strong>-80 dB to +17 dB</strong>, aligning perfectly with the amp’s physical volume dial.</p><hr><h2 id="final-thoughts">Final Thoughts</h2><p>With this setup, I can now:</p><ul><li>Turn the receiver on/off with Siri or automations.</li><li>Adjust volume using HomeKit sliders.</li><li>Change inputs from the Home app.</li><li>Create HomeKit automations to control the receiver alongside other smart devices.</li></ul><p>If you're looking to integrate your Pioneer AV receiver with Apple HomeKit, Homebridge makes it relatively simple with the right configuration. 
Just be sure to set the correct <code>maxVolume</code> and explore HomeKit's automation capabilities to make the most of your smart home setup!</p><hr><h3 id="pics-or-it-didnt-happen">Pics or It Didn't Happen </h3><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/Screenshot-2025-01-20-at-8.27.13-PM.jpeg" class="kg-image" alt="" loading="lazy" width="2000" height="1397" srcset="https://danielraffel.me/content/images/size/w600/2025/01/Screenshot-2025-01-20-at-8.27.13-PM.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2025/01/Screenshot-2025-01-20-at-8.27.13-PM.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2025/01/Screenshot-2025-01-20-at-8.27.13-PM.jpeg 1600w, https://danielraffel.me/content/images/2025/01/Screenshot-2025-01-20-at-8.27.13-PM.jpeg 2388w" sizes="(min-width: 720px) 720px"><figcaption><a href="https://support.apple.com/guide/iphone/control-accessories-iph0a717a8fd/ios?ref=danielraffel.me"><span style="white-space: pre-wrap;">Apple Home app</span></a><span style="white-space: pre-wrap;"> showing the media controller playing music on Spotify</span></figcaption></figure><p></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/Screenshot-2025-01-20-at-8.29.05-PM.jpeg" class="kg-image" alt="" loading="lazy" width="398" height="834"><figcaption><a href="https://support.apple.com/en-us/108330?ref=danielraffel.me"><span style="white-space: pre-wrap;">Control Center</span></a><span style="white-space: pre-wrap;"> media controls on iOS/iPadOS</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/Screenshot-2025-01-20-at-8.29.30-PM.jpeg" class="kg-image" alt="" loading="lazy" width="916" height="791" srcset="https://danielraffel.me/content/images/size/w600/2025/01/Screenshot-2025-01-20-at-8.29.30-PM.jpeg 600w, 
https://danielraffel.me/content/images/2025/01/Screenshot-2025-01-20-at-8.29.30-PM.jpeg 916w" sizes="(min-width: 720px) 720px"><figcaption><a href="https://support.apple.com/en-au/guide/iphone/iphcd5c65ccf/ios?ref=danielraffel.me"><span style="white-space: pre-wrap;">Now Playing</span></a><span style="white-space: pre-wrap;"> on iOS/iPadOS Lock Screen</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Setting Up Nginx with SSL on Proxmox for Tailscale Subdomains ]]></title>
        <description><![CDATA[ Running a home lab on Proxmox is a great way to self-host services, but keeping track of multiple IPs and ports can get complicated. To simplify access to services, I set up custom subdomains hosted on Cloudflare that are easy to remember and restricted to my private Tailscale network. ]]></description>
        <link>https://danielraffel.me/til/2025/01/18/setting-up-nginx-with-ssl-on-proxmox-for-tailscale-subdomains/</link>
        <guid isPermaLink="false">678af77f42e3a2034d77ffe2</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 17 Jan 2025 17:06:46 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/01/DALL-E-2025-01-17-17.04.07---A-minimalist-and-modern-illustration-inspired-by-Paul-Rand-s-design-style--featuring-abstract-geometric-shapes-to-symbolize-a-Proxmox-server--a-privat.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Running a home lab on Proxmox is a great way to self-host services, but keeping track of multiple IPs and ports can get complicated. To simplify access to services, I set up custom subdomains hosted on <a href="http://cloudflare.com/?ref=danielraffel.me" rel="noreferrer">Cloudflare</a> that are easy to remember and restricted to my private Tailscale network.</p><p>With <a href="https://nginx.org/?ref=danielraffel.me" rel="noreferrer">Nginx</a>, Cloudflare, and <a href="http://tailscale.com/?ref=danielraffel.me" rel="noreferrer">Tailscale</a>, each service in my home lab now has its own memorable, SSL-secured, private URL. Here’s how I configured my setup.</p><hr><h3 id="step-1-install-necessary-software-on-proxmox-pve-host"><strong>Step 1: Install Necessary Software on Proxmox PVE Host</strong></h3><p><strong>Update the System:</strong></p><pre><code class="language-bash">sudo apt update &amp;&amp; sudo apt upgrade -y
</code></pre><p><strong>Install Certbot and Cloudflare DNS Plugin:</strong></p><pre><code class="language-bash">sudo apt install certbot python3-certbot-dns-cloudflare -y
</code></pre><p><strong>Install Nginx:</strong></p><pre><code class="language-bash">sudo apt install nginx -y
</code></pre><hr><h3 id="step-2-configure-cloudflare-for-dns-verification"><strong>Step 2: Configure Cloudflare for DNS Verification</strong></h3><ol><li>Log in to your Cloudflare dashboard.</li><li><strong>Create an API Token</strong>:<ul><li>Navigate to <strong>My Profile &gt; API Tokens</strong>.</li><li>Create a new token with the following settings:<ul><li><strong>Zone</strong>: DNS - Edit</li><li><strong>Zone Resources</strong>: Include specific zones (your domains).</li></ul></li><li>Copy the generated token.</li></ul></li></ol><p><strong>Store the API Token Securely</strong>:</p><pre><code class="language-bash">sudo mkdir -p /etc/letsencrypt
sudo nano /etc/letsencrypt/cloudflare.ini
</code></pre><p>Add the following to the file:</p><pre><code class="language-ini">dns_cloudflare_api_token = YOUR_API_TOKEN
</code></pre><p>Secure the file:</p><pre><code class="language-bash">sudo chmod 600 /etc/letsencrypt/cloudflare.ini
</code></pre><hr><h3 id="step-3-create-a-dns-a-record-in-cloudflare"><strong>Step 3: Create a DNS A Record in Cloudflare</strong></h3><p>For each new subdomain:</p><ol><li><strong>Create an A Record</strong>:<ul><li><strong>Name</strong>: <code>&lt;subdomain&gt;.yourdomain.com</code> (e.g., <code>service.yourdomain.com</code>).</li><li><strong>Content</strong>: Use any <strong>public IP</strong>—this doesn't need to point to your actual server, as we'll update it later. It can even be an IP registered to someone else, since this is a temporary placeholder.</li><li><strong>Proxy Status</strong>: Enable Cloudflare proxy (orange cloud) if desired.</li></ul></li></ol><p><strong>Verify the DNS Record</strong>:</p><pre><code class="language-bash">ping &lt;subdomain&gt;.yourdomain.com
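# ICMP may be blocked on some networks; querying DNS directly also confirms
# the record (dig ships with the dnsutils / bind-utils package)
dig +short &lt;subdomain&gt;.yourdomain.com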
</code></pre><p>Ensure the subdomain resolves to the placeholder public IP.</p><hr><h3 id="step-4-obtain-ssl-certificates"><strong>Step 4: Obtain SSL Certificates</strong></h3><p>Run Certbot to generate an SSL certificate for the subdomain:</p><pre><code class="language-bash">sudo certbot certonly --dns-cloudflare --dns-cloudflare-credentials /etc/letsencrypt/cloudflare.ini -d &lt;subdomain&gt;.yourdomain.com
</code></pre><p>Certificates will be stored at:</p><pre><code class="language-bash">/etc/letsencrypt/live/&lt;subdomain&gt;.yourdomain.com/
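# Nginx will reference fullchain.pem and privkey.pem from this directory; to confirm:
# sudo ls -l /etc/letsencrypt/live/&lt;subdomain&gt;.yourdomain.com/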
</code></pre><hr><h3 id="step-5-update-nginx-configuration"><strong>Step 5: Update Nginx Configuration</strong></h3><ol><li><strong>Create or Edit Your Configuration File</strong>:<br>Open your Nginx configuration file or create a new one specifically for your setup:</li></ol><pre><code class="language-bash">sudo nano /etc/nginx/sites-available/proxmox
</code></pre><ol start="2"><li><strong>Add Server Blocks for Your Subdomains</strong>:<br>Include a server block for each subdomain. Replace <code>&lt;subdomain&gt;</code>, <code>&lt;tailscale-IP&gt;</code>, and <code>&lt;port&gt;</code> with the appropriate values:</li></ol><pre><code class="language-nginx"># Redirect HTTP to HTTPS for &lt;subdomain&gt;.yourdomain.com
server {
    listen 80;
    server_name &lt;subdomain&gt;.yourdomain.com;

    return 301 https://$host$request_uri;
}

# HTTPS server block for &lt;subdomain&gt;.yourdomain.com
server {
    listen 443 ssl;
    server_name &lt;subdomain&gt;.yourdomain.com;

    ssl_certificate     /etc/letsencrypt/live/&lt;subdomain&gt;.yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/&lt;subdomain&gt;.yourdomain.com/privkey.pem;

    location / {
        proxy_pass http://&lt;tailscale-IP&gt;:&lt;port&gt;;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

        # Optional: Increase timeouts for long connections
        proxy_read_timeout 300s;
        proxy_send_timeout 300s;

        # Disable buffering for real-time data
        proxy_buffering off;
    }
}
</code></pre><ol start="3"><li><strong>Set the New Configuration as Default</strong>:</li></ol><p>Disable the default configuration that comes with Nginx:</p><pre><code class="language-bash">sudo rm /etc/nginx/sites-enabled/default
</code></pre><p>Enable your new configuration:</p><pre><code class="language-bash">sudo ln -s /etc/nginx/sites-available/proxmox /etc/nginx/sites-enabled/
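# Confirm the symlink is in place and the default site is gone
ls -l /etc/nginx/sites-enabled/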
</code></pre><hr><h3 id="step-6-test-and-reload-nginx"><strong>Step 6: Test and Reload Nginx</strong></h3><p>Test the configuration:</p><pre><code class="language-bash">sudo nginx -t
</code></pre><p>If the test is successful, reload Nginx:</p><pre><code class="language-bash">sudo systemctl reload nginx
</code></pre><hr><h3 id="step-7-update-dns-to-tailscale-ip"><strong>Step 7: Update DNS to Tailscale IP</strong></h3><ol><li>Return to the Cloudflare dashboard.</li><li>Update the A record for the subdomain:<ul><li>Change the <strong>Content</strong> value from the public IP to the <strong>Tailscale private IP</strong> (e.g., <code>100.X.X.X</code>).</li></ul></li></ol><p>Verify the new DNS record:</p><pre><code class="language-bash">ping &lt;subdomain&gt;.yourdomain.com
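# The address should match this host's Tailscale IP (requires the tailscale CLI)
tailscale ip -4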
</code></pre><p>Ensure it resolves to your Tailscale IP.</p><hr><h3 id="step-8-verify-the-setup"><strong>Step 8: Verify the Setup</strong></h3><ol><li>Open the subdomain in a browser to verify functionality.</li></ol><p>Test HTTPS access for the subdomain:</p><pre><code class="language-bash">curl -k https://&lt;subdomain&gt;.yourdomain.com
</code></pre><hr><h3 id="step-9-set-up-automatic-ssl-renewal"><strong>Step 9: Set Up Automatic SSL Renewal</strong></h3><p>Automate SSL certificate renewal with a cron job:</p><pre><code class="language-bash">sudo crontab -e
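# Before adding the job, a dry run confirms DNS-01 renewal works end to end
# without touching the live certificates:
sudo certbot renew --dry-run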
</code></pre><p>Add the following line:</p><pre><code class="language-bash">0 3 * * * certbot renew --quiet --post-hook "systemctl reload nginx"
</code></pre><hr><h3 id="notes-on-ssl-and-tailscale-ips"><strong>Notes on SSL and Tailscale IPs</strong></h3><p>While Tailscale provides a highly secure private network, adding SSL to subdomains ensures that browser connections are treated as secure, eliminating those annoying warnings about "insecure connections." This makes accessing services much more seamless, especially when using modern browsers that enforce HTTPS for many functionalities.</p><p>It’s also worth noting that while Tailscale IPs are stable, they’re not static. In rare cases, a Tailscale IP could change, which might require updating the DNS record for the affected subdomain to avoid disruptions. </p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Solving a Proxmox Networking Mystery: USB Ethernet Dongle Filtering Traffic ]]></title>
        <description><![CDATA[ I recently encountered a strange networking issue while running Proxmox on a MacBook Air with a USB-to-Ethernet dongle. Disabling RX/TX checksumming on the dongle forced the kernel to handle checksumming in software, resolving the issue. ]]></description>
        <link>https://danielraffel.me/til/2025/01/07/how-i-fixed-proxmox-network-issues-caused-by-settings-on-a-usb-ethernet-dongle/</link>
        <guid isPermaLink="false">677d73b290a694034c0f7d3c</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 07 Jan 2025 11:35:38 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/01/DALL-E-2025-01-07-11.28.01---An-illustration-in-the-style-of-Paul-Rand--depicting-a-modern-tech-networking-issue-with-abstract-geometric-shapes.-The-design-should-represent-a-MacB.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I recently encountered a strange networking issue while running <a href="https://www.proxmox.com/en/?ref=danielraffel.me" rel="noreferrer">Proxmox</a> on a MacBook Air using a USB-to-Ethernet dongle. Although Proxmox could successfully ping devices on my local network, attempts to connect using certain other protocols (e.g., HTTP, netcat) would hang indefinitely.</p><p>To troubleshoot, I compared the behavior of my macOS laptop (connected via Wi-Fi) with Proxmox running on the same MacBook Air (connected via Ethernet through a USB dongle). Despite being on the same subnet, Proxmox devices could ping a specific device (<a href="https://www.eyezon.com/duo.php?ref=danielraffel.me" rel="noreferrer">EnvisaLink DUO</a>) but failed to connect using <code>curl</code> or <code>netcat</code>.</p><h3 id="initial-observations">Initial Observations</h3><h4 id="on-macos-connected-via-wi-fi">On macOS (Connected via Wi-Fi)</h4><ul><li><strong>Ping</strong>: <code>ping 192.168.86.150</code> → Success</li><li><strong>HTTP Request</strong>: <code>curl http://192.168.86.150</code> → Success (retrieved the expected webpage)</li><li><strong>Netcat</strong>: <code>nc -zv 192.168.86.150 4025</code> → Success</li><li><strong>MTU Size</strong>: <code>ifconfig</code> returned an MTU size of 1500</li><li><strong>Port Scan</strong>: <code>nmap -p 80 192.168.86.150</code> showed port 80 as open.</li></ul><h4 id="on-proxmox-host-and-vms-connected-via-usb-dongle">On Proxmox (Host and VMs Connected via USB Dongle)</h4><ul><li><strong>Ping</strong>: <code>ping 192.168.86.150</code> → Success</li><li><strong>HTTP Request</strong>: <code>curl http://192.168.86.150</code> → Hangs with no response</li><li><strong>Netcat</strong>: <code>nc -zv 192.168.86.150 4025</code> → Hangs</li><li><strong>MTU Size</strong>: <code>ip link show</code> returned an MTU size of 1500</li><li><strong>Port Scan</strong>: <code>nmap -p 80 192.168.86.150</code> showed port 80 as 
filtered.</li></ul><p>This was puzzling: even though Proxmox was on the same subnet and could ping the target device, it couldn’t establish connections using other protocols.</p><hr><h3 id="troubleshooting-steps">Troubleshooting Steps</h3><h4 id="1-verified-network-configuration">1. Verified Network Configuration</h4><p>I confirmed that both the Proxmox host and its VMs had static IP addresses in the correct subnet.</p><h4 id="2-checked-arp-resolution">2. Checked ARP Resolution</h4><p>The ARP table showed that the target device was reachable. I flushed the ARP table to rule out any stale entries, but the issue persisted.</p><h4 id="3-ran-tcpdump">3. Ran <code>tcpdump</code></h4><p>Capturing outgoing packets with <code>tcpdump</code> revealed that SYN packets were being sent to the target device, but no responses were received.</p><h4 id="4-tested-dns-resolution">4. Tested DNS Resolution</h4><p>I added the target device’s IP to <code>/etc/hosts</code> to ensure hostname resolution wasn’t an issue. However, this didn’t solve the problem.</p><h4 id="5-checked-mtu-size">5. Checked MTU Size</h4><p>Both macOS and Proxmox reported an MTU size of 1500, ruling out MTU mismatches as a potential cause.</p><h4 id="6-compared-nmap-results">6. Compared <code>nmap</code> Results</h4><p>This was the key clue:</p><ul><li>On macOS, <code>nmap</code> reported port 80 as open.</li><li>On Proxmox, <code>nmap</code> reported port 80 as filtered.</li></ul><p>Since no firewall was running on Proxmox, I suspected that the USB Ethernet dongle might be interfering with traffic.</p><hr><h3 id="honing-in-on-the-usb-ethernet-dongle">Honing in on the USB Ethernet Dongle</h3><p>Given that the issue only occurred when using the USB-to-Ethernet dongle, I started to suspect that the dongle might be filtering or altering traffic. To investigate further, I checked the dongle’s offloading features using <code>ethtool</code>:</p><pre><code class="language-bash">ethtool -k enx803f5d096607
</code></pre><p>The output showed that RX and TX checksumming were enabled:</p><pre><code class="language-plaintext">rx-checksumming: on
tx-checksumming: on
  tx-checksum-ipv4: on
  tx-checksum-ip-generic: off [fixed]
  tx-checksum-ipv6: on
  tx-checksum-fcoe-crc: off [fixed]
  tx-checksum-sctp: off [fixed]
</code></pre><p>Offloading generally improves performance by allowing the network interface to handle certain tasks. However, in some USB Ethernet dongles, enabling offloading can cause issues with packet integrity or filtering.</p><hr><h3 id="the-fix">The Fix</h3><p>To test whether disabling RX/TX checksumming would resolve the issue, I ran:</p><pre><code class="language-bash">sudo ethtool -K enx803f5d096607 rx off tx off
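# Confirm the offload settings actually changed
ethtool -k enx803f5d096607 | grep -i checksumming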
</code></pre><p>I then re-ran the <code>nmap</code> scan:</p><pre><code class="language-bash">nmap -p 80 192.168.86.150
</code></pre><p>This time, the port was reported as open:</p><pre><code class="language-plaintext">PORT   STATE SERVICE
80/tcp open  http
</code></pre><p>A quick test with <code>curl</code> also worked:</p><pre><code class="language-bash">curl http://192.168.86.150
</code></pre><p>Result:</p><pre><code class="language-plaintext">&lt;HTML&gt;&lt;BODY&gt;&lt;H1&gt;Server Requires Authentication&lt;/H1&gt;&lt;/BODY&gt;&lt;/HTML&gt;
</code></pre><p>Finally, the issue was resolved.</p><hr><h3 id="automatically-disabling-rxtx-checksumming-at-boot"><strong>Automatically Disabling RX/TX Checksumming at Boot</strong></h3><p>The command <code>sudo ethtool -K enx803f5d096607 rx off tx off</code> is temporary and will be reset on reboot. To make it persistent, I created a systemd service:</p><p><strong>Create a new systemd service file:</strong></p><pre><code class="language-bash">sudo nano /etc/systemd/system/disable-offload.service
</code></pre><p><strong>Add the following content:</strong></p><pre><code class="language-plaintext">[Unit]
Description=Disable RX/TX checksumming on the USB Ethernet dongle
After=network.target

[Service]
Type=oneshot
ExecStart=/sbin/ethtool -K enx803f5d096607 rx off tx off
RemainAfterExit=true

[Install]
WantedBy=multi-user.target
</code></pre><p><strong>Enable the service:</strong></p><pre><code class="language-bash">sudo systemctl enable disable-offload.service
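# Start it once now so the change applies without a reboot
sudo systemctl start disable-offload.service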
</code></pre><p>This ensures that the <code>ethtool</code> command runs automatically at boot.</p><hr><h3 id="why-disabling-checksumming-helped">Why Disabling Checksumming Helped</h3><p>USB Ethernet dongles can have limited hardware capabilities. Enabling checksum offloading can lead to packet corruption or filtering by the dongle itself. Disabling RX/TX checksumming forces the kernel to handle checksumming in software, which resolved the issue in this case.</p><hr><h3 id="final-thoughts-and-lessons-learned">Final Thoughts and Lessons Learned</h3><p>If you’re running Proxmox on a machine with a USB-to-Ethernet dongle and encounter strange network behavior where ping works but other protocols hang, try disabling RX/TX checksumming using <code>ethtool</code>. This simple fix saved me hours of frustration and could help others avoid similar issues.</p><p>Sometimes, the problem isn’t with your network configuration or the target device but with the hardware in between.</p><hr><h3 id="commands-recap">Commands Recap</h3><p><strong>Check offloading status:</strong></p><pre><code class="language-bash">ethtool -k &lt;interface&gt;
</code></pre><p><strong>Disable RX/TX checksumming:</strong></p><pre><code class="language-bash">sudo ethtool -K &lt;interface&gt; rx off tx off
</code></pre><p><strong>Re-enable checksumming (if needed):</strong></p><pre><code class="language-bash">sudo ethtool -K &lt;interface&gt; rx on tx on
</code></pre> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Automated EV Charging Using EVCC, an EG4 Solar Inverter, and a Tesla Mobile Connector ]]></title>
        <description><![CDATA[ I recently came across a project called EVCC—a great solution for anyone looking to optimize EV charging and comfortable working with a terminal shell. It connects with your vehicle, inverter, or home storage to make smart charging decisions. ]]></description>
        <link>https://danielraffel.me/2025/01/03/how-i-automated-ev-charging-using-evcc-an-eg4-solar-inverter-and-a-tesla-mobile-connector/</link>
        <guid isPermaLink="false">67782afa63a310034f294ad7</guid>
        <category><![CDATA[ 🏠 Home Automation ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 03 Jan 2025 12:55:24 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2025/01/DALL-E-2025-01-03-12.47.45---An-abstract--minimalistic-image-in-the-style-of-Paul-Rand--depicting-a-DIY-EV-solar-charging-setup.-The-composition-includes-symbolic-representations-.png" medium="image"/>
        <content:encoded><![CDATA[ <p><strong>TL;DR: Want to charge your Tesla with the Universal Mobile Connector using EVCC? Have a Lux Power or EG4 inverter and battery and want to integrate them with EVCC? Or are you interested in using EVCC to calculate real-time energy import/export pricing with PG&amp;E? If any of these scenarios sound relevant, read on!</strong></p><p>I recently shared my experience cobbling together a <a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/" rel="noreferrer">DIY</a> EV solar charging solution—something I don’t recommend! Thankfully, I discovered a much more mature project called <a href="https://evcc.io/en/?ref=danielraffel.me" rel="noreferrer">EVCC</a>, which I wish I had known about earlier. It’s a fantastic option for anyone looking to optimize their EV charging who is comfortable using the terminal. Once installed, it’s user-friendly enough to share with other household members who may need to quickly fast-charge an EV.</p><p>Since EVCC is open source, it didn’t matter that it lacked native support for my <a href="https://luxpowertek.com/?ref=danielraffel.me" rel="noreferrer">Lux Power</a>/<a href="https://eg4electronics.com/?ref=danielraffel.me" rel="noreferrer">EG4</a> inverter. A <a href="https://skrul.com/?ref=danielraffel.me" rel="noreferrer">buddy</a> and I purchased a <a href="https://monitormy.solar/detail/13?ref=danielraffel.me">third-party dongle</a> designed to extract data from the otherwise closed inverter, and we figured out how to integrate it with EVCC. 
This allowed us to manage charging for our similar setups, which each include an <a href="https://eg4electronics.com/categories/inverters/eg4-18kpv-12lv-all-in-one-hybrid-inverter/?ref=danielraffel.me" rel="noreferrer">EG4 18kPV inverter</a>, a Tesla Model 3, and a <a href="https://shop.tesla.com/product/mobile-connector?ref=danielraffel.me">Tesla Universal Mobile Connector</a> (UMC). The dongle streams data via MQTT, enabling EVCC to track critical metrics such as solar generation, grid usage, and battery status, while also supporting dynamic control of battery discharge during fast charging.</p><p>In addition to integrating the inverter, I set up <a href="https://github.com/wimaha/TeslaBleHttpProxy?ref=danielraffel.me">TeslaBLEHttpProxy</a> to let EVCC smartly control the (dumb) Tesla UMC. This works by enabling EVCC to communicate with the Tesla vehicle through the proxy locally over Bluetooth, allowing near real-time charging adjustments based on solar and grid conditions. With this setup, I can dynamically adjust charging parameters such as amperage, enable fast charging, limit charging to solar excess, and more.</p><p>To better manage energy costs, I configured EVCC to account for PG&amp;E’s seasonal tariffs under my <a href="https://www.pge.com/en/account/rate-plans/find-your-best-rate-plan/electric-home.html?ref=danielraffel.me" rel="noreferrer">Electric Home Rate Plan</a><strong> </strong>(E-ELEC). Using a custom formula, the tariffs automatically adjust between summer and winter rates, with different time-of-use (TOU) pricing tiers. This ensures that I can charge at the lowest possible cost by prioritizing solar energy, battery reserves, or grid power depending on availability and cost.</p><p>Overall, this custom setup offers a fully integrated solution that balances solar self-consumption, grid tariffs, and vehicle charging. 
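</p><p>The end-to-end configuration flow boils down to a few commands. This is a rough sketch based on the skrul/evcc-config repo layout; your env values will differ:</p><pre><code class="language-bash"># Clone the example configuration
git clone https://github.com/skrul/evcc-config.git
cd evcc-config

# Copy the sample env file, then edit it with your inverter, vehicle, and tariff values
cp evcc.env.sample evcc.env

# Generate evcc.yaml from your settings
./gen.sh
</code></pre><p>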
It’s ideal for anyone comfortable with configuring these components and looking to maximize their EV charging efficiency.</p><p>If you’re interested in setting up something similar, check out <a href="https://github.com/skrul/evcc-config?ref=danielraffel.me" rel="noreferrer">this repository</a>. Clone it, copy&nbsp;<code>evcc.env.sample</code>&nbsp;to&nbsp;<code>evcc.env</code>, and edit the values to match your setup. If you don’t plan to use InfluxDB, be sure to uncomment&nbsp;<code>DISABLE_INFLUX</code>&nbsp;in&nbsp;<code>evcc.env</code>. Once your values are set, run&nbsp;<code>gen.sh</code>&nbsp;to generate&nbsp;<code>evcc.yaml</code>&nbsp;with your customized configuration. The repo provides pre-configured PG&amp;E seasonal tariffs for California residential <a href="https://pge.com/electrichome?ref=danielraffel.me" rel="noreferrer">E-ELEC households</a> and annual feed-in export energy pricing for 2025, both of which can be easily removed if they don’t match your setup. This allows for real-time energy cost calculations.</p><p>While your setup is likely to differ, this repository serves as a useful example of how to use EVCC to:</p><p>• Integrate a <a href="https://luxpowertek.com/?ref=danielraffel.me" rel="noreferrer">Lux Power</a>/<a href="https://eg4electronics.com/?ref=danielraffel.me" rel="noreferrer">EG4</a> inverter using a third-party dongle.</p><p>• Enable smart control of a Tesla with a UMC charger via Bluetooth using <a href="https://github.com/wimaha/TeslaBleHttpProxy?ref=danielraffel.me" rel="noreferrer">TeslaBLEHttpProxy</a>.</p><p>• Configure seasonal summer/winter tariffs for importing energy <a href="https://www.pge.com/assets/pge/docs/account/rate-plans/residential-electric-rate-plan-pricing.pdf?ref=danielraffel.me" rel="noreferrer">via PG&amp;E</a>.</p><p>• Configure annual feed-in export energy pricing <a href="http://pge.com/eecvalues?ref=danielraffel.me" rel="noreferrer">via PG&amp;E</a>.</p><p><strong>Requirements:</strong></p><p>• <a 
href="https://github.com/evcc-io/evcc?ref=danielraffel.me" rel="noreferrer">EVCC</a>&nbsp;– Open-source EV charging controller.</p><p>• <a href="https://monitormy.solar/detail/13?ref=danielraffel.me" rel="noreferrer">Monitor My Solar Dongle</a><strong> </strong>– If you’re integrating an EG4 or Lux inverter.</p><p>• <a href="https://github.com/wimaha/TeslaBleHttpProxy?ref=danielraffel.me" rel="noreferrer">TeslaBLEHttpProxy</a>&nbsp;– If you’re planning on using a Tesla UMC with EVCC.</p><p>For more details and to get started, check out the repository:</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/skrul/evcc-config?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - skrul/evcc-config</div><div class="kg-bookmark-description">Contribute to skrul/evcc-config development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/pinned-octocat-093da3e6fa40-2.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">skrul</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/evcc-config" alt="" onerror="this.style.display = 'none'"></div></a></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2025/01/image.png" class="kg-image" alt="" loading="lazy" width="1968" height="1752" srcset="https://danielraffel.me/content/images/size/w600/2025/01/image.png 600w, https://danielraffel.me/content/images/size/w1000/2025/01/image.png 1000w, https://danielraffel.me/content/images/size/w1600/2025/01/image.png 1600w, https://danielraffel.me/content/images/2025/01/image.png 1968w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Screenshot displaying options for charging modes, such as Solar and Fast, along with 
real-time energy consumption and cost tracking.</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Easily Summarize an Existing Chat with Claude Before Starting a New One ]]></title>
        <description><![CDATA[ Sometimes my chats with Claude Pro get pretty long, and I start running into usage limits. When that happens, I need to start a new thread—but I don’t want to waste time going back to summarize everything. That’s where I’ve found text replacements on iOS/macOS to be incredibly helpful. ]]></description>
        <link>https://danielraffel.me/til/2024/12/31/how-to-easily-summarize-an-existing-chat-with-claude-before-starting-a-new-one/</link>
        <guid isPermaLink="false">67737c3763a310034f2949f9</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 30 Dec 2024 21:23:42 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/12/ffs.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Sometimes my chats with Claude Pro get pretty long, and I start running into <a href="https://support.anthropic.com/en/articles/8324991-about-claude-pro-usage?ref=danielraffel.me" rel="noreferrer">usage limits</a>. When that happens, I need to start a new thread—but I don’t want to waste time going back to summarize everything. That’s where I’ve found text replacements on iOS/macOS to be incredibly helpful.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-31.png" class="kg-image" alt="" loading="lazy" width="704" height="224" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-31.png 600w, https://danielraffel.me/content/images/2024/12/image-31.png 704w"></figure><h3 id="how-i-use-text-replacements">How I Use Text Replacements</h3><p>When I’m running out of messages and need to start a fresh chat, I simply message Claude:</p><blockquote><em>I need to summarize our original goals, the approaches we've tried, our current progress, any relevant details like the technologies or software packages involved, and what remains to be done. Additionally, I need to suggest clear next steps to prepare for a new discussion.</em></blockquote><p>Claude returns a concise summary that I can easily paste into a new chat to keep making progress without hitting my quota. 
To save time, I’ve created a short text replacement that quickly expands into that message.</p><h3 id="setting-up-text-replacement">Setting Up Text Replacement</h3><p>Here’s how to <a href="https://support.apple.com/guide/iphone/use-text-replacements-iph6d01d862/ios?ref=danielraffel.me" rel="noreferrer">set up text replacements</a> on iOS or macOS:</p><ol><li>Go to&nbsp;<strong>Settings &gt; General &gt; Keyboard &gt; Text Replacement</strong>.</li><li>Tap the&nbsp;<strong>+</strong>&nbsp;button.</li><li>Enter your saved summary in the "Phrase" field.</li><li>Choose a shortcut, like&nbsp;<strong>ffs</strong>, in the "Shortcut" field.</li><li>Save it.</li></ol><p>Now, anytime you type your shortcut, your saved summary will appear.</p><h3 id="why-this-works-well-with-claude">Why This Works Well with Claude</h3><p>Claude Pro has limits based on message count and conversation length. Longer chats use up your quota faster because Claude re-reads the entire thread with every message. Starting a new chat with a clear, concise summary avoids this problem and keeps things efficient.</p><p>By using text replacements, I can quickly transition to a new thread without losing momentum. Hopefully, future improvements to Claude will eliminate the need for this workaround.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ 2024 Year-End Playlist Roundup: Discovering the Best Music of the Year ]]></title>
        <description><![CDATA[ It’s that time of year when everyone shares their best-of lists. Here are some of the playlists I’ve been enjoying listening to. ]]></description>
        <link>https://danielraffel.me/2024/12/23/2024-year-end-playlist-roundup-discovering-the-best-music-of-the-year/</link>
        <guid isPermaLink="false">67588d3778585b034ca37995</guid>
        <category><![CDATA[ 🎶 Listening to music ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sun, 22 Dec 2024 23:25:24 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/12/2024_Year_End.png" medium="image"/>
        <content:encoded><![CDATA[ <p>It’s that time of year when everyone shares their best-of lists. I’ve been exploring a lot of music I hadn’t discovered before, and it’s been incredibly inspiring. I usually look for or create end-of-year playlists on Spotify, and this year, I <a href="https://danielraffel.me/til/2024/12/22/how-i-turned-end-of-year-album-lists-into-spotify-playlists/" rel="noreferrer">made a tool</a> to streamline the playlist creation process. Here are some of the playlists I’ve been listening to.</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Bleep 100 Tracks 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/3jkreFgCPN7ULb2NEDKix5?si=77639efa6c7a4177&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://bleep.com/release/482981-various-artists-bleep-100-tracks-2024?ref=danielraffel.me"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Bleep 2024 Top Ten Albums" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/3AuzK4Mndl63Go6fRnjQph?si=42b095add1984c41&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://bleep.com/top-10-albums-of-the-year-2024?lang=en_GB&srsltid=AfmBOoq9fckUY49f3p6rU0bt0XodzHt1W-_9Ecr6NA5jo59BsZe7v8Qy&ref=danielraffel.me"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe 
style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The  Wire Magazine - Best of 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/4Dc0EzxXC23meCD4jL0ymd?si=d938c5ab1e224d70&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: WIRE Archive Releases 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/42ACG2qkBSTwGY8MdAmbHP?si=ad638e4b57ca46f6&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The Wire 2024 Avant-Rock  by Antonio Poscic" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/6PM4RBQ118Ii366RafahMB?si=353d7f8facfb4607&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 
12px" width="100%" height="352" title="Spotify Embed: The Wire 2024 Critical Beats by Misha Farrant" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/5dtmBzTxn17tUHeG8cjLiv?si=3c7b2ecd18f74152&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The Wire 2024 Dub &amp; Reggae by Steve Barker" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/46vq3Ace3lr9QFBIjldtLW?si=341cd82e3f8f48b5&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The Wire 2024 Electronics by Leah Kardos" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/7iLHwXayhiJbcqoCwKmxr2?si=9a1f2aab32d748bb&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe 
style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The Wire 2024 Global by Francis Gooding" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/1o8KvKi34TlGNYkrh95DzD?si=32adf9508c274eb4&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The Wire 2024 HipHop &amp; R&amp;B by Tim Fish" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/2vnuibFfdXVqoEnIKzDMo8?si=7a5b58037d5b462d&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The Wire 2024 Jazz &amp; Improv by Phil Freeman" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/5ReReaX14VweU47CGqjGzb?si=f00e0ce5d661405a&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card 
kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The Wire 2024 Modern Composition by Julian Cowley" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/76pZaS9J3ZiHMkkCQqgoc6?si=5c27844a9f1b4058&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.thewire.co.uk/issues/491/492?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Electronic Sound Best Albums 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/4NUlGiMjnTj5Si1ikcLiKY?si=a1cc2b599e254461&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://electronicsound.squarespace.com/shop/p/issue-120?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The Quietus Albums Of The Year 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/3A1ExbcwvPataIG8jk896T?si=8c68d15ae73846b1&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://thequietus.com/tq-charts/albums-of-the-year/albums-of-the-year-best-albums-2024-norman-records/?ref=danielraffel.me" rel="noreferrer"><span style="white-space: 
pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Ted Gioia's 100 best recordings of 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/13GfXt7O3R4fXkBvCC8uje?si=2eaf356aae7c40d2&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.honest-broker.com/p/the-100-best-recordings-of-2024-part?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">part 1 via</span></a><span style="white-space: pre-wrap;"> and </span><a href="https://www.honest-broker.com/p/the-100-best-recordings-of-2024-part-c25?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">part 2 via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Boomkat Albums Of The Year 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/6rCf9BD5SxhajMo4mDExKz?si=8fb9e662ed6a41a6&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2689?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Boomkat Reissues 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" 
src="https://open.spotify.com/embed/playlist/7IITkKUeBiDNNicj42oRFL?si=88b494238e8440ff&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2688?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Boomkat Discoveries 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/10FUBCRY0PwpSEGxWtUVl7?si=2eb8f5de019b498f&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2690?ref=danielraffel.me"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Actress 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/32l3210AZvvXGdda4KCBSg?si=3169f48a54e84086&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2579?ref=danielraffel.me"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Akira Rabelais 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" 
src="https://open.spotify.com/embed/playlist/1zBGxuO09bg4jjaMhpvcnS?si=b849bbf64e134985&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2705?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Errorsmith 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/7vf5yGMxsOcCpEcVemjQIV?si=daad6bdc30a14e48&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2602?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Jim O'Rourke 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/4VWUz3Pu7KOox73ReKppka?si=963c1c630a674536&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2584?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Mark Fell 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" 
src="https://open.spotify.com/embed/playlist/7t56Ep43caWZxVlbLyBx9O?si=3c857448290d4186&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2648?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Oren Ambarchi 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/34jsR3Zcc64382NxWXoYVu?si=dbefaea9e99f47d4&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2716?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Rian Treanor 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/6v2meicNqZFMyeMdK88QoB?si=c278543b7ccf4068&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2619?ref=danielraffel.me"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Turk Dietrich 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" 
src="https://open.spotify.com/embed/playlist/6jw8cgVLUy30KtfktMTQRr?si=155d850876574b69&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://boomkat.com/charts/boomkat-end-of-year-charts-2024/2662?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: The 50 Best Songs Of 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/2n7UC7QkQq3x0JinsM1NKP?si=e20a29fe123d4133&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.stereogum.com/2288976/best-songs-2024/lists/year-in-review/?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: NME best songs of 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/3fRfLCgLgyY9UQ5pfGQYXy?si=6fb316b96293458b&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.nme.com/lists/end-of-year/best-songs-2024-3817596?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Pitchfork's The 100 Best Songs of 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; 
picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/0ABF7hk0SojqkRBUos6xuJ?si=768c1027f0294162&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://pitchfork.com/features/lists-and-guides/best-songs-2024/?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Pitchfork’s 50 Best Albums of 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/5s2G5HKqZoX3nMESobSX6c?si=17b133d68a6c45d2&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://pitchfork.com/features/lists-and-guides/best-albums-2024/?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Metacritic Best of 2024 (So Far) - Top Tracks" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/2EJt7U0gJecqNo6dglla2D?si=6e11b312888c455b&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.metacritic.com/browse/albums/score/metascore/year/filtered?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: 24for24 by Darren McClure" frameborder="0" allowfullscreen="" 
allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/1NC45gkXNPd4yRou7dT2JT?si=6a7cdf5706d44c6e&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List via 12k discord</span></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Said the Gramophone Favorite Albums 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/66oXlLKPmDyCy2pTU0Mo1T?go=1&amp;sp_cid=dd6e052ec3315148aa34dd59ad839949&amp;utm_source=oembed&amp;utm_medium=desktop"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.saidthegramophone.com/archives/best_songs_of_2024.php?ref=danielraffel.me"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Said the Gramophone Best Songs  2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/7Mdkf8WtwZqtNqU61hstbW?si=de458749c30d4142&amp;utm_source=oembed"></iframe><figcaption><p><span style="white-space: pre-wrap;">List </span><a href="https://www.saidthegramophone.com/archives/best_songs_of_2024.php?ref=danielraffel.me"><span style="white-space: pre-wrap;">via</span></a></p></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Create Spotify Playlists from End-of-Year Album Lists ]]></title>
        <description><![CDATA[ While there are plenty of tools for creating playlists with individual songs, I’ve found few that make it simple to create playlists with all the tracks from specific albums. This is especially helpful at the end of the year, when “Best of the Year” album lists appear. ]]></description>
        <link>https://danielraffel.me/til/2024/12/22/how-i-turned-end-of-year-album-lists-into-spotify-playlists/</link>
        <guid isPermaLink="false">6767708ef2d979034e900fa6</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sat, 21 Dec 2024 18:25:19 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/12/DALL-E-2024-12-21-18.21.04---A-minimalist-and-modern-graphic-in-the-style-of-Paul-Rand--illustrating-the-concept-of-creating-Spotify-playlists-from-album-artist-lists.-The-design-.png" medium="image"/>
<content:encoded><![CDATA[ <p>While there are plenty of tools for creating playlists with individual songs, I’ve found few that make it simple to create playlists with all the tracks from specific albums. This is especially helpful at the end of the year, when I’m exploring “Best of the Year” album lists and want to dive into entire albums. By combining a <a href="https://www.generouscorp.com/Spotlistr-album-creation/?ref=danielraffel.me" rel="noreferrer">tool I built</a> with <a href="http://spotlistr.com/?ref=danielraffel.me" rel="noreferrer">Spotlistr</a>, you can transform album-artist lists into Spotify playlists that include every track from each album.</p><h3 id="the-problem">The Problem</h3><p>At the end of every year, I come across best-of lists featuring albums I missed out on. Manually searching for each album on Spotify and adding its tracks to a playlist is tedious. I wanted a faster way to generate full album playlists.</p><h3 id="the-solution">The Solution</h3><p>Here’s how I solved the problem:</p><h4 id="1-copy-list-from-the-web">1. Copy the List from the Web</h4><p>I use a <a href="https://www.icloud.com/shortcuts/bf2cd0a48cf04af9baa14949f4975f2f?ref=danielraffel.me" rel="noreferrer">shortcut</a> on my iPhone that can scrape web content directly from Safari. When browsing an end-of-year album list on my phone, I share the page with this shortcut, which copies the entire text into my clipboard. <em>Note: I'm often on mobile, but you can also just copy/paste lists on the desktop web.</em></p><h4 id="2-extract-album-and-artist-names">2.
Extract Album and Artist Names</h4><p>Once I have the list, I often need to clean it up, so I paste it into ChatGPT and ask it to:</p><blockquote>“Extract only the album titles and artist names, formatted as ‘Artist - Album’ pairs, with each pair on a new line and no numbers, bullets, or extra text.”</blockquote><p>This step generates a clean list of album-artist pairs to use (e.g., Charli xcx - BRAT).</p><h4 id="3-create-a-spotify-web-api-app">3. Create a Spotify Web API App</h4><p>To use the <a href="https://www.generouscorp.com/Spotlistr-album-creation/?ref=danielraffel.me" rel="noreferrer">tool</a>, you’ll first need to set up a Spotify Web API app. It’s simple and takes about 60 seconds:</p><ul><li>Log in to <a href="https://developer.spotify.com/?ref=danielraffel.me">Spotify for Developers</a>.</li><li>Click <strong>Create an App</strong>, give it a name, sign up to use the Web API, and set <code>localhost</code> as the redirect URI.</li><li>Copy the client ID and secret provided.</li></ul><h4 id="4-setup-the-tool-with-your-credentials">4. Set Up the Tool with Your Credentials</h4><p>Paste the client ID and secret into step 1 of the <a href="https://www.generouscorp.com/Spotlistr-album-creation/?ref=danielraffel.me" rel="noreferrer">tool</a> I built and press <strong>Authenticate</strong>. <em>Note: I don't store or log anything, but you are welcome to download the HTML file if you want to run it locally.</em></p><h4 id="5-convert-album-artist-pairs-to-spotify-urls">5. Convert Album-Artist Pairs to Spotify URLs</h4><p>Copy and paste your cleaned-up list of album-artist pairs into step 2 of the <a href="https://www.generouscorp.com/Spotlistr-album-creation/?ref=danielraffel.me" rel="noreferrer">tool</a>. It will automatically generate Spotify album URIs, saving you from manually searching for each album on Spotify. Once created, copy the generated Spotify URIs.</p><h4 id="6-generate-a-playlist-with-tracks-from-each-album-using-spotlistr">6.
Generate a Playlist with Tracks from Each Album Using Spotlistr</h4><p>Once you have the Spotify album URIs, it’s time to create the playlist:</p><ul><li>Open <a href="https://spotlistr.com/?ref=danielraffel.me">Spotlistr</a>.</li><li>Paste the list of Spotify album URIs into Spotlistr.</li><li>Spotlistr will automatically generate a playlist, pulling all the tracks from each album.</li></ul><p>The result is a Spotify playlist that includes <em>every track</em> from the albums on your list—not just a single song from each. <em>Note: You'll want to reference the original list and make sure you found tracks for the correct album, some are bound to be wrong but this should get you most of the way there.</em></p><h3 id="why-it%E2%80%99s-useful">Why It’s Useful</h3><p>This approach creates a playlist featuring every track from each artist’s album. If you frequently discover music through curated lists, this workflow transforms a tedious task into a more efficient process. <strong>If you dig what you discover hopefully you’ll consider supporting both the artist and the source of the list!</strong></p><hr><h3 id="workflow-overview">Workflow Overview</h3><p>The following screenshots illustrate the high-level workflow. Each step represents the process, showing how tasks flow from start to finish. 
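</p><p><em>Aside: under the hood, the album-to-URI conversion comes down to two Spotify Web API calls: exchanging the app credentials for a token, then searching for each album. The following is a rough sketch of that idea, not the tool's actual code; it assumes the client ID and secret from your developer app, and uses <code>sed</code> instead of <code>jq</code> so it runs on a stock Mac:</em></p><pre><code class="language-bash">#!/bin/bash

# Placeholder credentials from your Spotify developer app
CLIENT_ID="your_client_id"
CLIENT_SECRET="your_client_secret"

# Exchange the app credentials for a short-lived access token
TOKEN=$(curl -s -X POST "https://accounts.spotify.com/api/token" \
  -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=client_credentials" \
  | sed -n 's/.*"access_token" *: *"\([^"]*\)".*/\1/p')

# Search for one "Artist - Album" pair and print the first album URI found
curl -s -G "https://api.spotify.com/v1/search" \
  --data-urlencode "q=artist:Charli xcx album:BRAT" \
  --data-urlencode "type=album" --data-urlencode "limit=1" \
  -H "Authorization: Bearer $TOKEN" \
  | sed -n 's/.*"\(spotify:album:[^"]*\)".*/\1/p' | head -n 1
</code></pre><p><em>Loop that second call over each “Artist - Album” line and you end up with roughly the same URI list the tool hands to Spotlistr.</em></p><p>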
Let’s dive into each section to understand the connections and actions involved.</p><p><strong>Step 1: Find a Best-of Album List</strong></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-30.png" class="kg-image" alt="" loading="lazy" width="1352" height="778" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-30.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-30.png 1000w, https://danielraffel.me/content/images/2024/12/image-30.png 1352w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Find and copy a list of favorite albums </span><a href="https://www.saidthegramophone.com/archives/best_songs_of_2024.php?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">like this one</span></a></figcaption></figure><p><strong>Step 2: Convert the list to album-artist pairs</strong></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-29.png" class="kg-image" alt="" loading="lazy" width="1316" height="1398" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-29.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-29.png 1000w, https://danielraffel.me/content/images/2024/12/image-29.png 1316w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Paste the list into ChatGPT and ask it to clean up and format it. This list is a bit different than some of the others, so I gave it slightly different instructions with an example of what I wanted.</span></figcaption></figure><p><strong>Step 3: Convert the list to Spotify URIs</strong></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-27.png" class="kg-image" alt="" loading="lazy" width="1540" height="1700"
srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-27.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-27.png 1000w, https://danielraffel.me/content/images/2024/12/image-27.png 1540w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Then paste the list into step 2 of the tool, press Convert, and copy the Spotify album URIs it generates.</span></figcaption></figure><p><strong>Step 4: Convert the Spotify URIs to a Spotify Playlist</strong></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-28.png" class="kg-image" alt="" loading="lazy" width="1188" height="1600" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-28.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-28.png 1000w, https://danielraffel.me/content/images/2024/12/image-28.png 1188w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Finally, paste the Spotify album URIs into Spotlistr and create your playlist. This excellent tool will let you review/edit.</span></figcaption></figure><p><strong>Step 5: Enjoy Listening to Your Spotify Playlist</strong></p><figure class="kg-card kg-embed-card"><iframe style="border-radius: 12px" width="100%" height="352" title="Spotify Embed: Said the Gramophone Favorite Albums 2024" frameborder="0" allowfullscreen="" allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://open.spotify.com/embed/playlist/66oXlLKPmDyCy2pTU0Mo1T?si=41eac42211c9408e&amp;utm_source=oembed"></iframe></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How To Send Custom Push Notifications When Hazel Adds Files to Apple Books ]]></title>
        <description><![CDATA[ I use the macOS app Hazel to automate a bunch of tasks on a Mac mini. Using Hazel I automatically add PDFs I download to my iCloud folder to Apple Books. Today, I took it a step further by enabling push notifications whenever Hazel adds a file to Apple Books. ]]></description>
        <link>https://danielraffel.me/til/2024/12/12/how-to-send-custom-push-notifications-when-hazel-adds-files-to-apple-books/</link>
        <guid isPermaLink="false">6759eb0e6f8e0d03503e8f04</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 12 Dec 2024 13:54:15 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/12/pushover.png" medium="image"/>
<content:encoded><![CDATA[ <p>I use the macOS app <a href="https://www.noodlesoft.com/?ref=danielraffel.me" rel="noreferrer">Hazel</a> to automate a bunch of tasks on a Mac mini. Hazel makes it incredibly easy to manage files by applying rules to folders and performing actions like moving, renaming, or executing scripts. Most of my Hazel rules revolve around moving files synced via <a href="https://syncthing.net/?ref=danielraffel.me" rel="noreferrer">Syncthing</a> to my desired archive locations. Using Hazel, I automatically add PDFs downloaded to my iCloud folder to Apple Books.</p><p>Today, I took it a step further by enabling push notifications with Pushover whenever Hazel adds a file to Apple Books. This small tweak ensures I know exactly when a PDF is available to read. The setup was straightforward, and I’ve shared the steps below so others can replicate it.</p><p>I originally started exploring Pushover for sending notifications from the Apple Home app, but I found a way for that app to <a href="https://danielraffel.me/2024/12/06/how-to-set-up-your-honeywell-alarm-with-apple-home-for-push-notifications-and-geofence-based-arming-disarming/" rel="noreferrer">send push notifications even when a security alarm is adjusted via a geofence</a>.
However, I’m glad I found another use for Pushover!</p><hr><h3 id="how-to-set-up-hazel-and-pushover-notifications-for-apple-books">How to Set Up Hazel and Pushover Notifications for Apple Books</h3><h4 id="step-1-create-a-pushover-account-and-api-token"><strong>Step 1: Create a Pushover Account and API Token</strong></h4><ol><li>Go to <a href="https://pushover.net/?ref=danielraffel.me">Pushover.net</a> and create an account.</li><li>Create a new application to get your unique API token.<ul><li>You’ll need both the <strong>API token</strong> and your <strong>User key</strong> for the script.</li></ul></li></ol><h4 id="step-2-set-up-hazel-rule"><strong>Step 2: Set Up Hazel Rule</strong></h4><ol><li>Open <strong>Hazel</strong> and create or edit the rule that processes the PDFs.</li><li>Add conditions to ensure Hazel only runs this rule for PDFs in your watched folder. For example:<ul><li><em>If extension is <code>pdf</code></em></li></ul></li><li>Add an action to run a shell script at the end of the rule.</li></ol><h4 id="step-3-write-the-shell-script"><strong>Step 3: Write the Shell Script</strong></h4><p>Here’s the script I use:</p><pre><code class="language-bash">#!/bin/bash

# Pushover API credentials
API_TOKEN="your_pushover_api_token"
USER_KEY="your_pushover_user_key"

# Extract only the file name from Hazel
FILE_NAME=$(basename "$1")

# Send push notification
curl -s \
  --form-string "token=$API_TOKEN" \
  --form-string "user=$USER_KEY" \
  --form-string "message=Hazel added file: $FILE_NAME to Apple Books" \
  https://api.pushover.net/1/messages.json
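
# Optional (a sketch, not required): the Pushover message API also accepts
# extra form fields such as a title and priority, e.g.
#   --form-string "title=Apple Books" \
#   --form-string "priority=0" \
# Add lines like these to the curl call above to customize the notification.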
</code></pre><ul><li>Replace <code>your_pushover_api_token</code> and <code>your_pushover_user_key</code> with your Pushover credentials.</li><li>The script uses <code>basename</code> to extract just the filename (e.g., <code>File.pdf</code>) from the full file path Hazel passes to the script.</li></ul><h4 id="step-4-add-the-script-to-hazel"><strong>Step 4: Add the Script to Hazel</strong></h4><ol><li>In the Hazel rule editor, choose the "Run Shell Script" action.</li><li>Copy and paste the script into the editor.</li><li>Hazel automatically passes the file path of the processed file as <code>$1</code> to the script—no additional configuration needed.</li></ol><h4 id="step-5-test-the-rule"><strong>Step 5: Test the Rule</strong></h4><ol><li>Drop a PDF into the folder monitored by Hazel.</li><li>Hazel will process the file, add it to Apple Books, and trigger the script.</li></ol><p>You should receive a Pushover notification like this:</p><blockquote>Hazel added file: File.pdf to Apple Books</blockquote><hr><h3 id="additional-notes">Additional Notes</h3><ul><li><strong>Why Pushover?</strong> I discovered Pushover recently, and its simple API makes it perfect for custom notifications like this. 
Plus, it works across all my devices.</li><li>If you want to batch notifications for multiple files or customize the message further, you can extend the script to handle more complex scenarios.</li></ul><hr><p>This small upgrade to an existing automation is a nice way to learn when new reading material is ready to consume in Apple Books.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-23.png" class="kg-image" alt="" loading="lazy" width="1179" height="317" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-23.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-23.png 1000w, https://danielraffel.me/content/images/2024/12/image-23.png 1179w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Resolve NeuralDSP Cortex Control Crashes on macOS Sequoia and Make It Launch ]]></title>
        <description><![CDATA[ As an owner of the NeuralDSP Quad Cortex, I&#39;ve come to rely on the Cortex Control Desktop software. But after upgrading to macOS Sequoia, I encountered persistent crashes with Cortex Control. I recently discovered a straightforward solution. ]]></description>
        <link>https://danielraffel.me/til/2024/12/12/how-i-got-neuraldsp-cortex-control-to-launch-on-macos-sequoia/</link>
        <guid isPermaLink="false">675b26946f8e0d03503e8f55</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 12 Dec 2024 10:33:14 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/12/minimalist.quad.png" medium="image"/>
        <content:encoded><![CDATA[ <p>As an owner of the <a href="https://neuraldsp.com/quad-cortex?ref=danielraffel.me" rel="noreferrer">NeuralDSP Quad Cortex</a>, I've come to rely on the <a href="https://neuraldsp.com/cortex-control?ref=danielraffel.me" rel="noreferrer">Cortex Control</a> Desktop software. But after upgrading to macOS Sequoia in September 2024, I encountered persistent crashes with Cortex Control v1.1.0. Even after NeuralDSP released v1.2.0 in late November 2024, which claimed to address connection-related crashes, I still couldn’t launch the application. Determined to fix this, I discovered a straightforward solution.</p><h3 id="the-problem"><strong>The </strong>Problem</h3><p>The v1.2.0 update promised improved connection handling to resolve random crashes during startup. Despite this fix, the application wouldn’t launch after installing v1.2.0 alongside CorOS 3.1.0. Something deeper was preventing Cortex Control from functioning properly on macOS.</p><h3 id="the-solution">The Solution</h3><p>1.	<strong>Investigating the Installer</strong><br>Using&nbsp;<a href="https://mothersruin.com/software/SuspiciousPackage/?ref=danielraffel.me">Suspicious Package</a>, I analyzed the Cortex Control installer to pinpoint what it installs on the system. This tool revealed several files and directories worth investigating.</p><p>2.	<strong>Removing Old Files</strong><br>I manually removed the following files and directories to clear remnants of the previous installation:</p><p>Application and support files:</p><pre><code class="language-bash">/Applications/Neural DSP/Cortex Control.app&nbsp;&nbsp;
/Library/Application Support/Neural DSP/Cortex Control</code></pre><p>Configuration and log files:</p><pre><code class="language-bash">~/Library/Preferences/com.NeuralDSP.CortexControl.plist
~/Library/Logs/ndsp_appnapdisable.log</code></pre><ol start="3"><li><strong>Updating macOS and Reinstalling Cortex Control:</strong></li></ol><p>Since it had just been released, I upgraded macOS to version 15.2, restarted my system, and reinstalled Cortex Control v1.2.0 using the official&nbsp;.pkg&nbsp;installer. This resolved the issue, and the application launched successfully!</p><h3 id="conclusion"><strong>Conclusion</strong></h3><p>If you’re experiencing issues with Cortex Control on macOS Sequoia, removing old files and updating to the latest macOS version might do the trick. NeuralDSP’s v1.2.0 improvements are great, but a clean slate ensures compatibility. I’m back to seamlessly controlling my Quad Cortex on the floor from my desktop!</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Set Up Your Honeywell Alarm with Apple Home for Push Notifications and Geofence-Based Arming/Disarming ]]></title>
        <description><![CDATA[ I’ve lived in my house for nearly 20 years and have automated many aspects of it. However, my Honeywell alarm system remained one of the only devices I hadn’t automated. Here’s how I integrated my Honeywell alarm with Apple Home to enable push notifications and geofence-based arming and disarming. ]]></description>
        <link>https://danielraffel.me/2024/12/06/how-to-set-up-your-honeywell-alarm-with-apple-home-for-push-notifications-and-geofence-based-arming-disarming/</link>
        <guid isPermaLink="false">67422d87e347f9035084cf21</guid>
        <category><![CDATA[ 🏠 Home Automation ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 06 Dec 2024 13:36:28 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/12/DALL-E-2024-12-06-13.33.17---A-minimalistic-illustration-in-the-style-of-Paul-Rand--showcasing-a-smart-home-concept.-The-image-features-a-stylized-modern-house-with-geometric-shap.png" medium="image"/>
<content:encoded><![CDATA[ <p>I’ve lived in my house for nearly 20 years and have automated many things. However, one of the last holdouts was my Honeywell Ademco Vista 20p alarm system. I was initially hesitant to modify such an outdated system, but I finally decided to take the plunge. My goal was to enable remote arming and disarming, receive push notifications, use the sensors in automations, and integrate the alarm with HomeKit to automate arming and disarming based on presence—for example, arming when the last person leaves and disarming when the first person arrives.</p><p>To achieve this, I purchased the <a href="https://www.eyezon.com/duo.html?ref=danielraffel.me" rel="noreferrer">Eyezon Duo</a>. After it arrived, I followed the <a href="https://www.eyezon.com/EyezonEnvisalinkHoneywellInstallationGuide2018.pdf?ref=danielraffel.me#page=2" rel="noreferrer">initial software activation instructions</a> to register the device to my account.</p><p>Next came the installation, which involved <a href="https://www.eyez-on.com/EZMAIN/DuoQuickStartEZ_2022_Web.pdf?ref=danielraffel.me" rel="noreferrer">wiring</a> the Duo to my alarm system and running an Ethernet cable through my attic, as the device doesn’t support Wi-Fi and requires a wired internet connection. </p><p>To create a custom-length cable, I bought a spool of Insignia Cat6e cable from Best Buy. They sell the entire spool and let you use what you need; when you return the remainder for a refund, they measure what you kept and charge approximately $0.30 per foot.</p><p>I was motivated to get the wiring done while I had some solar installers on-site so that I could borrow their Ethernet crimper, cable tester and ladder to access my attic. 
Thanks, <a href="https://yelp.to/XXnYGW7G4M?ref=danielraffel.me" rel="noreferrer">Jesse</a>!</p><hr><h3 id="retrieving-the-honeywell-installer-code">Retrieving the Honeywell Installer Code</h3><p>After installing the Eyezon Duo, I needed to add it as a virtual keypad/panel to my Honeywell alarm system. To do this, the first step was <a href="https://www.eyezon.com/EyezonEnvisalinkHoneywellInstallationGuide2018.pdf?ref=danielraffel.me#page=11" rel="noreferrer">retrieving the installer code</a> for my alarm. Once I had the installer code, I was able to move forward with the setup. <em>Make sure to save this code since you may need it again in the future. I attempted to get this code from the folks who manage my alarm, but it was against their policy since they don’t want customers messing up their systems.</em></p><pre><code class="language-txt">Don't Know Your Installer Code?

Step 1: Shut down the system.
1. Shut down transformer.
2. Shut down battery.
3. Leave system off for 60 seconds.

Step 2: Bring the system back up.
1. Repower the battery.
2. Repower the transformer.

Step 3: Force system to programming mode and retrieve existing installer code.
1. Within 30 seconds of rebooting system, push and hold the * and # keys together on a keypad.
2. Hold for 2 seconds and 20 should appear on the display, which indicates you are in programming mode. If 20 does not appear, the panel is locked and you must contact the installer for the code.
3. Enter #20 and the display will show the existing installer four-digit code, one digit at a time.</code></pre><hr><p><em>I ran into the following challenges with the Eyezon Duo instructions.</em></p><h3 id="finding-the-wire-to-connect-the-duo-to-the-alarm-panel">Finding the Wire to Connect the Duo to the Alarm Panel</h3><p>The first challenge was figuring out how to wire the Duo to my alarm system. The instructions mentioned that the necessary cable wasn’t included. However, when I thoroughly unpacked the box, I unexpectedly found a short cable tucked away in the packaging.</p><h3 id="logging-into-the-duo-device-on-the-local-network">Logging into the Duo Device on the Local Network</h3><p>The second challenge was figuring out how to connect to the Eyezon Duo device on my local network. I stumbled on some instructions online that mentioned the default username is “user,” and the password is the last six digits of the device’s MAC address (which can be found on a sticker on the bottom of the device). It’s a good idea to snap a photo and jot down these details for reference.</p><h3 id="adding-the-duo-as-a-keyboard-to-the-honeywell-panel">Adding the Duo as a Keypad to the Honeywell Panel</h3><p>Next, I encountered a challenge adding the Duo as a keypad, as the default address <code>18</code> was already in use by my Honeywell panel. 
To resolve this, I logged into the Eyezon Duo device portal and registered the Duo to a different keypad address—in my case, address <code>21</code> was available.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-14.png" class="kg-image" alt="" loading="lazy" width="1494" height="182" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-14.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-14.png 1000w, https://danielraffel.me/content/images/2024/12/image-14.png 1494w" sizes="(min-width: 720px) 720px"></figure><p>To complete adding the Duo as a keypad I needed to enter the following codes on my Honeywell panel:</p><pre><code class="language-bash">(Installer Code) 800 *194 10 *99</code></pre><hr><h3 id="programming-the-duo-to-work-with-the-honeywell-alarm">Programming the Duo to Work with the Honeywell Alarm</h3><p>Once the Duo was configured as a keypad, I <a href="https://www.eyezon.com/EyezonEnvisalinkHoneywellInstallationGuide2018.pdf?ref=danielraffel.me#page=9" rel="noreferrer">programmed the Honeywell Vista Panel following the 23-step</a> process outlined on pages 9 and 10 of the manual. Initially, I was unsure if every step was necessary—but <strong>yes, they all are</strong>.</p><pre><code class="language-txt">Honeywell Vista 10P, 15P, 20P &amp; 21iP

1. Enter (Installer Code) + 800 to access installer programming mode. “20” should appear on the display. If you
do not know the Installer Code, see Troubleshooting Tips on page 13.

2. Keypad programming: *190 to *196 (addresses 17-23) are the keypad programming sections. Enable the assigned Envisalink4 addresses as needed. If it is a single partition system using the default Envisalink4 address of 18, the programming section is *191. In section *191 enter 10. If the keypad was on Partition 2, you would enter 20.

3. *29 Enable IP/GSM (IP/GSM/LRR support required for your Envisalink4 to transmit alerts): The codes to enter in this section depend on the age of the panel. Start by entering 1. If you hear 3 beeps, proceed to step 4; if not, continue with *0**.

4. *48 Report Format: Enter 77 (this is pre-set and cannot be changed on ADT Panels).

5. *49 Split/Dual Reporting: Enter 5 and you will hear 3 beeps.

6. *50 Burglary Dialer Delay: Enter 0 and you will hear 3 beeps.

7. *54 Dynamic Signalling Delay: Set to 0 and you will hear 3 beeps.

8. *55 Dynamic Signalling Priority: Set to 1 and you will hear 3 beeps.

9. *59 Exit Error Alarm Report Code: Set to 0 and you will hear 3 beeps.

10. *60 Trouble Report Code: Enter 10 and you will hear 3 beeps.

11. *62 AC Loss Report Code: Enter 10 and you will hear 3 beeps.

12. *63 Low Battery Report Code: Enter 10 and you will hear 3 beeps.

13. *64 Test Report Code: Enter 10 and you will hear 3 beeps.

14. *65 Open Report Code: Enter 110 and you will hear 3 beeps.

15. *66 Arm Away/Stay Report Code: Enter 111100 and you will hear 3 beeps.

16. *67 RF Transmitter Low Battery Report Code: Enter 10 and you will hear 3 beeps.

17. *70 Alarm Restore Report Code: Enter 1 and you will hear 3 beeps.

18. *71 Trouble Resolve Report Code: Enter 10 and you will hear 3 beeps.

19. *73 AC Restore Report Code: Enter 10 and you will hear 3 beeps.

20. *74 Low Battery Restore Report Code: Enter 10 and you will hear 3 beeps.

21. *75 RF Low Battery Restore Report Code: Enter 10 and you will hear 3 beeps.

22. *84 Auto Stay Arm: Enter 0 and you will hear 3 beeps.

23. Enter *99 to exit programming.</code></pre><p>Once the panel programming was complete I connected to the Eyezon portal (and mobile app) and confirmed I was able to control the Honeywell alarm. The next step was figuring out how to connect the Duo to HomeKit.</p><hr><h3 id="connecting-the-duo-to-homebridge-to-expose-it-in-homekit">Connecting the Duo to Homebridge to Expose it in HomeKit</h3><p>To connect unsupported 3rd party devices to HomeKit I am using <a href="https://homebridge.io/?ref=danielraffel.me" rel="noreferrer">Homebridge</a>. I installed the <a href="https://github.com/haywirecoder/homebridge-envisalink-ademco?ref=danielraffel.me" rel="noreferrer">homebridge-envisalink-ademco</a> plugin and then configured the plugin with the IP address of the Duo on my network, the port, the MacID password, and my alarm code (which happens to be user code #2.) </p><p>Since the plugin was now referencing the Duo device by its IP address, I wanted to ensure the address didn’t change. To achieve this, I logged into the Duo device and configured it to use a static IP address under the network options. I then assigned this static IP address to the device’s MAC ID in my router’s settings.</p><p>In the Apple Home app, I was now able to see and adjust the alarm states. However, there was still more work to be done to see and control all the devices connected to the alarm.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-25.png" class="kg-image" alt="" loading="lazy" width="334" height="262"></figure><p>I wanted to have access to all the contact sensors, motion sensors, smoke alarms, and other devices in the Apple Home app. To achieve this, I needed to retrieve the corresponding zones and device names from the Honeywell panel.</p><hr><h3 id="retrieving-zones-from-the-honeywell-panel">Retrieving Zones from the Honeywell Panel</h3><p>Using ChatGPT, I got some helpful instructions to guide me through the process. 
Starting with zone 01, I worked through each zone incrementing by one until no more appeared, taking photos of the zones and their names for future reference.</p><pre><code class="language-txt">Step-by-Step to View Zone Names:

1. Enter Programming Mode:
* Enter your Installer Code (default is 4112), followed by 8 and 00. Example: 4112 + 8 + 00.
* The keypad will display Installer Code 20.

2. Enter Zone Descriptor Programming:
* Press *82.
* The keypad will display "Alpha?".
* Select Yes by pressing 1 (to confirm Alpha programming).

3. View the Zone Descriptors:
* The keypad will display "Enter Zone No." prompting you to input a zone number.
* Input a zone number to view its programmed descriptor (e.g., 01 for Zone 1).
   * The keypad will show the name (if programmed) for that zone.
   * To move to the next zone, simply enter the next zone number or use the arrow keys (* to proceed and # to go back).

4. Review Additional Zones:
* Repeat the process for each zone to view its name.
   * Example: Enter 02 for Zone 2, 03 for Zone 3, etc.

5. Exit Programming Mode:
* After reviewing all zones, exit programming mode by pressing *99.
* This ensures the system returns to normal operation.</code></pre><hr><h3 id="configuring-the-envisalink-ademco-plugin-with-honeywell-zone-names-and-adding-speedkey-switches">Configuring the Envisalink Ademco Plugin with Honeywell Zone Names and Adding SpeedKey Switches</h3><p>I began by reviewing and updating the&nbsp;Homebridge Envisalink Ademco<strong> </strong><a href="https://github.com/haywirecoder/homebridge-envisalink-ademco/blob/master/config.schema.json?ref=danielraffel.me" rel="noreferrer">config.schema</a>&nbsp;to integrate my alarm sensors into a custom configuration file. This ensures that all devices connected to my alarm system—such as contact sensors, motion sensors, door/window sensors, and smoke detectors—are visible to the plugin, appear in HomeKit, and can be included in automations.</p><p>Since my partner and I use Apple Home’s geofence settings for&nbsp;<em>first-person-arrived</em>&nbsp;and&nbsp;<em>last-person-left</em>&nbsp;automations, I wanted to automate arming and disarming the alarm based on presence. This eliminates the need to manually interact with the alarm panel every time we come and go. While Apple Home allows geofence-based triggers for sensitive devices like alarms, locks, and garage doors, it requires user confirmation when these actions are triggered. While this is a sensible security measure for most users, we prefer automatic arming and disarming with a push notification for validation.</p><p>Apple’s confirmation requirement can be bypassed by using&nbsp;switches, which allow direct control of sensitive devices in automations. The Homebridge Envisalink Ademco plugin supports <a href="https://github.com/haywirecoder/homebridge-envisalink-ademco/tree/master?tab=readme-ov-file&ref=danielraffel.me#configuration-options" rel="noreferrer">speedkeys</a>, which act as custom switches that send specific commands to the alarm system. 
Below is an example of how I configured the&nbsp;<strong>Zones</strong>&nbsp;and&nbsp;<strong>Speedkeys</strong>&nbsp;sections in the configuration file.</p><p><strong>Zones</strong><br>This section includes the alarm zones I retrieved from my panel. These zones ensure the supported sensors (e.g., door, window, smoke, motion) are detected and displayed in HomeKit.</p><pre><code class="language-bash">    "zones": [
        {
            "name": "Tamper",
            "partition": "1",
            "zoneNumber": "8",
            "sensorType": "door"
        },
        {
            "name": "XX Window",
            "partition": "1",
            "zoneNumber": "13",
            "sensorType": "window"
        },
        {
            "name": "XX Heat",
            "partition": "1",
            "zoneNumber": "14",
            "sensorType": "smoke"
        },
        {
            "name": "XX Motion",
            "partition": "1",
            "zoneNumber": "15",
            "sensorType": "motion"
        }
    ]</code></pre><p><strong>Speedkeys</strong><br>In this section, I created custom switches (speedkeys) for each alarm state, such as <strong>Disarm</strong>&nbsp;and&nbsp;<strong>Away</strong>. These switches send raw command strings in the format&nbsp;&lt;PIN&gt;&lt;Ademco Alarm Command&gt;&nbsp;directly to the <a href="https://forum.eyezon.com/viewtopic.php?t=301&ref=danielraffel.me" rel="noreferrer">EnvisaLink TPI</a>, part of the Duo hardware. The following snippet worked without any modifications:</p><pre><code class="language-bash">"speedkeys": [
    {
        "name": "Disarm",
        "speedcommand": "@pin1"
    },
    {
        "name": "Away",
        "speedcommand": "@pin2"
    }
]</code></pre><p><strong>Behavior in the Home App</strong><br>The speedkeys will appear as switches in the Home app. By accessing the settings for these switches, you have the option to show them as separate tiles, rename them and make other adjustments. Because the switches have identical names by default, you’ll need to test each one to determine which controls arming and which controls disarming. I recommend renaming them for clarity.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/Screenshot-2024-12-17-at-11.38.16-AM.png" class="kg-image" alt="" loading="lazy" width="471" height="1022"><figcaption><span style="white-space: pre-wrap;">Here’s how the speedkeys will appear in the Home app. By tapping on their settings, you can rename and configure them.</span></figcaption></figure><p>The Homebridge Envisalink Ademco plug-in continuously monitors the EnvisaLink TPI stream, ensuring the alarm state updates accurately in HomeKit. If you want to add additional alarm commands (e.g., <em>Bypass, Max, Night</em>, <em>Stay</em>, etc.), refer to the&nbsp;<a href="https://github.com/haywirecoder/homebridge-envisalink-ademco/blob/03dc71a797b76027274a2fb44429676c8bb956e3/tpi.js?ref=danielraffel.me#L243" rel="noreferrer">tpi.js</a>&nbsp;file under&nbsp;exports.alarmcommand.</p><p>With these speedkeys configured, I successfully created switches that automate arming and disarming the alarm when triggered by a geofence crossing. This approach not only avoids Apple’s confirmation prompts but also provides push notifications for validation. I'll show how I use these switches in the section below on configuring geofence alarm triggers in the Home App.</p><p><strong>Note</strong>: I’ve included&nbsp;<code># comments</code>&nbsp;for clarity in the example configuration file, but you may need to remove them to save the file correctly.</p><pre><code class="language-Js">{
    "name": "Envisalink-Ademco",
    "host": "192.168.XX.XX", # Enter the IP for your Duo
    "port": 4025,
    "deviceType": "20P",
    "password": "XXXXXX", # Enter last 6 digits of your Duo MacID
    "pin": "XXXX", # Enter your alarm pin
    "changePartition": false,
    "openZoneTimeout": 30,
    "heartbeatInterval": 30,
    "commandTimeOut": 10,
    "autoReconnect": true,
    "sessionWatcher": true,
    "chimeToggle": false,
    "envisalinkFailureSuppress": false,
    "ignoreFireTrouble": false,
    "ignoreSystemTrouble": false,
    "maintenanceMode": false,
    "policePanic": {
        "enabled": true,
        "name": "Police Panic"
    },
    "firePanic": {
        "enabled": true,
        "name": "Fire Panic"
    },
    "ambulancePanic": {
        "enabled": true,
        "name": "Ambulance Panic"
    },
    "partitions": [
        {
            "name": "House"
        }
    ],
    "zones": [
        {
            "name": "Tamper",
            "partition": "1",
            "zoneNumber": "8",
            "sensorType": "door"
        },
        {
            "name": "XX Door",
            "partition": "1",
            "zoneNumber": "9",
            "sensorType": "door"
        },
        {
            "name": "XX Door",
            "partition": "1",
            "zoneNumber": "10",
            "sensorType": "door"
        },
        {
            "name": "XX Door",
            "partition": "1",
            "zoneNumber": "11",
            "sensorType": "door"
        },
        {
            "name": "XX Window",
            "partition": "1",
            "zoneNumber": "12",
            "sensorType": "window"
        },
        {
            "name": "XX Window",
            "partition": "1",
            "zoneNumber": "13",
            "sensorType": "window"
        },
        {
            "name": "XX Heat",
            "partition": "1",
            "zoneNumber": "14",
            "sensorType": "smoke"
        },
        {
            "name": "XX Motion",
            "partition": "1",
            "zoneNumber": "15",
            "sensorType": "motion"
        }
    ],
    "bypass": [
        {
            "enabledbyPass": true
        }
      ],
    "speedKeys": [
        {
            "name": "Disarm",  
            "speedcommand": "@pin1"
        },
        {
            "name": "Away",
            "speedcommand": "@pin2"
        }
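        # Hypothetical example (commented out, my assumption -- verify against
        # tpi.js exports.alarmcommand before using): other alarm commands such
        # as Stay or Night can be exposed as extra switches the same way,
        # e.g., assuming 3 is the Ademco Stay command:
        # ,{
        #     "name": "Stay",
        #     "speedcommand": "@pin3"
        # }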
    ],
    "_bridge": {
        "username": "XX:XX:XX:XX:XX:XX", # Auto-added just redacted mine
        "port": 50577
    },
    "platform": "Envisalink-Ademco"
}</code></pre><h3 id="adding-the-json-configuration-file-to-the-plugin">Adding the JSON Configuration File to the Plugin</h3><p>I then pasted my updated configuration file into the Homebridge Envisalink Ademco plugin settings under the <code>JSON Config</code> section in Homebridge and restarted Homebridge.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-16.png" class="kg-image" alt="" loading="lazy" width="1432" height="826" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-16.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-16.png 1000w, https://danielraffel.me/content/images/2024/12/image-16.png 1432w" sizes="(min-width: 720px) 720px"></figure><h3 id="creating-a-child-bridge-and-pairing-with-the-home-app">Creating a Child Bridge and Pairing with the Home App</h3><p>After restarting, I returned to the plugin, selected&nbsp;<strong>Child Bridge Config</strong>, and paired it with the Home app. 
</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/Screenshot-2024-12-17-at-4.27.36-PM.png" class="kg-image" alt="" loading="lazy" width="1304" height="846" srcset="https://danielraffel.me/content/images/size/w600/2024/12/Screenshot-2024-12-17-at-4.27.36-PM.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/Screenshot-2024-12-17-at-4.27.36-PM.png 1000w, https://danielraffel.me/content/images/2024/12/Screenshot-2024-12-17-at-4.27.36-PM.png 1304w" sizes="(min-width: 720px) 720px"></figure><p>A QR code will be displayed, which you can scan with your iOS device to guide you through the pairing process.</p><p><strong>Once the plugin is successfully paired as a bridge, a confirmation message will appear on the screen.</strong></p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-24.png" class="kg-image" alt="" loading="lazy" width="1582" height="1224" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-24.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-24.png 1000w, https://danielraffel.me/content/images/2024/12/image-24.png 1582w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="adding-honeywell-zones-to-the-duo-portal">Adding Honeywell Zones to the Duo Portal</h3><p>Performing the next series of steps simply ensures I have pretty names for all of my devices if I opt to use the Eyezon services. I logged into the <a href="https://portal.eyezon.com/app/login.php?ref=danielraffel.me" rel="noreferrer">Eyezon web portal</a> and navigated to <strong>Systems &gt; Settings</strong>. 
</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-18.png" class="kg-image" alt="" loading="lazy" width="1420" height="544" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-18.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-18.png 1000w, https://danielraffel.me/content/images/2024/12/image-18.png 1420w" sizes="(min-width: 720px) 720px"></figure><p>I scrolled down to <strong>Zone Labels</strong> and selected <strong>Manage Zone Labels</strong>. </p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-19.png" class="kg-image" alt="" loading="lazy" width="1776" height="288" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-19.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-19.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/12/image-19.png 1600w, https://danielraffel.me/content/images/2024/12/image-19.png 1776w" sizes="(min-width: 720px) 720px"></figure><p>Next, I added each zone and label name from my Homebridge Envisalink Ademco configuration file to the Duo portal, ensuring I selected the appropriate partition before submitting each entry.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-20.png" class="kg-image" alt="" loading="lazy" width="1948" height="988" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-20.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-20.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/12/image-20.png 1600w, https://danielraffel.me/content/images/2024/12/image-20.png 1948w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="viewing-alarm-sensors-in-the-apple-home-app">Viewing Alarm Sensors in the Apple Home App</h3><p>With the Homebridge Envisalink Ademco plugin configured and Duo zones updated, I launched the 
Apple Home app. Sure enough, now all of my alarm’s contact sensors, motion sensors, and smoke sensors appeared, ready to be integrated into Apple Home automations. <em>In my case, some alarm entities were displayed in a room as tabs above physical devices.</em></p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-22.png" class="kg-image" alt="" loading="lazy" width="1933" height="175" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-22.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-22.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/12/image-22.png 1600w, https://danielraffel.me/content/images/2024/12/image-22.png 1933w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="configuring-geofence-alarm-triggers-in-the-home-app">Configuring Geofence Alarm Triggers in the Home App</h3><blockquote>Refer to the 9/25 update at the end, which notes a few minor adjustments for setting up speed keys in v2.1.0. <a href="https://github.com/haywirecoder/homebridge-envisalink-ademco/issues/66?ref=danielraffel.me" rel="noreferrer">FWIW, I self-resolved it in this GitHub issue</a>.</blockquote><p>Since my partner and I use Apple Home’s geofence settings for <em>first-person-arrived</em> and <em>last-person-left</em> automations, I wanted to automate arming and disarming the alarm based on presence, removing the need to manually access the panel each time we come and go.</p><p>To enable this feature, I added the arm and disarm switches I created using speedkeys to their respective automations. 
Note that before doing this, I renamed my speedkey switches to <strong>Disarmed Home</strong>&nbsp;and&nbsp;<strong>Armed Away</strong>&nbsp;for better clarity.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/renamed.png" class="kg-image" alt="" loading="lazy" width="471" height="1022"></figure><p>Using these switches, I configured my geofence automations so that the alarm automatically arms and disarms as people come and go.</p><h3 id="configuring-an-automation-to-arm-the-alarm-when-the-last-person-leaves">Configuring an Automation to Arm the Alarm When the Last Person Leaves</h3><p>In the “When the Last Person Leaves” automation, I set it to turn on “Armed Away,” which arms the alarm (e.g., sets it to Away mode).</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/leaves-1.png" class="kg-image" alt="" loading="lazy" width="471" height="1022"></figure><h3 id="configuring-an-automation-to-disarm-the-alarm-when-the-first-person-arrives">Configuring an Automation to Disarm the Alarm When the First Person Arrives</h3><p>In the “When the First Person Arrives” automation, I set it to turn on “Disarmed Home,” which disarms the alarm (e.g., sets it to off).</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/arrives.png" class="kg-image" alt="" loading="lazy" width="471" height="1022"></figure><p>Now, when the first person arrives or the last person leaves, the alarm in the home automatically adjusts, and my partner and I receive push notifications from the Home app about the alarm state changes.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/393369697-1cbda533-c8a6-4977-94b0-e627f38d36dd.png" class="kg-image" alt="" loading="lazy" width="752" height="170" 
srcset="https://danielraffel.me/content/images/size/w600/2024/12/393369697-1cbda533-c8a6-4977-94b0-e627f38d36dd.png 600w, https://danielraffel.me/content/images/2024/12/393369697-1cbda533-c8a6-4977-94b0-e627f38d36dd.png 752w" sizes="(min-width: 720px) 720px"></figure><p><em>If you rely on Apple Home Activity History to track who locked/unlocked or armed/disarmed, note that <strong>geofence-triggered arm/disarm events are not attributed to any specific user.</strong></em></p><hr><h3 id="enabling-critical-alerts-to-get-push-notifications-when-the-alarm-goes-off">Enabling Critical Alerts to Get Push Notifications When the Alarm Goes Off</h3><p>Apple Home supports sending&nbsp;<a href="https://support.apple.com/en-us/108781?ref=danielraffel.me" rel="nofollow">Critical Alerts</a>&nbsp;on iOS 15 and iPadOS 15 or later. To ensure these alerts are active, verify that they are enabled under&nbsp;<strong>Settings &gt; Notifications &gt; Home.</strong></p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/IMG_1848.png" class="kg-image" alt="" loading="lazy" width="1179" height="1528" srcset="https://danielraffel.me/content/images/size/w600/2024/12/IMG_1848.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/IMG_1848.png 1000w, https://danielraffel.me/content/images/2024/12/IMG_1848.png 1179w" sizes="(min-width: 720px) 720px"></figure><p>Once configured, if the security system triggers the alarm when in Away, Home, or Night (e.g., the alarm is activated), you’ll automatically receive the corresponding alert. 
<strong><em>While no additional automations are required to enable these critical alerts, it's worth intentionally setting off your alarm to confirm everything is set up and working correctly.</em></strong></p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/IMG_1849.jpeg" class="kg-image" alt="" loading="lazy" width="1179" height="258" srcset="https://danielraffel.me/content/images/size/w600/2024/12/IMG_1849.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2024/12/IMG_1849.jpeg 1000w, https://danielraffel.me/content/images/2024/12/IMG_1849.jpeg 1179w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="manually-adjusting-the-alarm-in-the-apple-home-app">Manually Adjusting the Alarm in the Apple Home App</h3><p>It is also possible to manually adjust the alarm modes directly in the Apple Home app.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-17.png" class="kg-image" alt="" loading="lazy" width="1179" height="2556" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-17.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-17.png 1000w, https://danielraffel.me/content/images/2024/12/image-17.png 1179w" sizes="(min-width: 720px) 720px"></figure><p><em>If you rely on Apple Home Activity History to monitor who armed or disarmed the system, note that <strong>manually triggered arm/disarm events are attributed to a specific user</strong>, while geofence-triggered events are not.</em></p><h3 id="when-push-notifications-are-sent">When <strong>Push Notifications Are Sent</strong></h3><p>Push notifications are sent in these scenarios:</p><ul><li>When someone else sets a mode.</li><li>When an automation sets a mode.</li><li>When you set a mode, but the Home app is closed or the device is locked (conditions set by Apple).</li></ul><p><em>Note: Each person in the household who wants to receive push notifications from the Home 
app should check <strong>Settings &gt; Notifications &gt; Home. </strong>For instance, I made sure critical alerts were enabled on my partner's phone.</em></p><h3 id="refining-alarm-notification-wording">Refining Alarm Notification Wording</h3><p>A nice implementation detail I discovered is that by assigning the alarm and its sensors to a room named after my home address, I can receive more descriptive notifications. For example, moving the alarm to a room named “123 Main” will generate a push notification that says, “123 Main Alarm was disarmed.” Since the push notification will use the room name where the alarm is installed in the Home app, it might read a bit goofy if you don’t adjust it.</p><hr><h3 id="using-alarm-contact-sensors-and-motion-sensors-in-apple-home-automations">Using Alarm Contact Sensors and Motion Sensors in Apple Home Automations </h3><p>With the Envisalink Ademco plugin configured to expose all entities connected to the alarm, it’s possible to create automations utilizing them. </p><p>For instance, I set up an automation to turn on the Den light at night whenever the Front Door is opened, leveraging the Front Door Alarm contact sensor.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/IMG_4720.jpeg" class="kg-image" alt="" loading="lazy" width="1179" height="2556" srcset="https://danielraffel.me/content/images/size/w600/2024/12/IMG_4720.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2024/12/IMG_4720.jpeg 1000w, https://danielraffel.me/content/images/2024/12/IMG_4720.jpeg 1179w" sizes="(min-width: 720px) 720px"></figure><hr><h3 id="final-thoughts">Final Thoughts</h3><p>In the end, integrating my aging Honeywell alarm system into a modern smart home ecosystem was a rewarding challenge. By using Eyezon Duo and Homebridge, and carefully configuring the Envisalink Ademco plugin, I was able to breathe new life into an older system while achieving seamless functionality with Apple Home. 
Automating my alarms with geofencing has significantly enhanced the convenience of securing my home. If you’re considering a similar upgrade, this demonstrates that even legacy systems can adapt to modern smart home needs with the right tools and effort.</p><p><strong>A big thank you to </strong><a href="https://github.com/haywirecoder?ref=danielraffel.me"><strong>haywirecoder</strong></a><strong> for developing the Homebridge plugin for Envisalink Ademco and for taking the time to explain to me how to use speedkeys for arming and disarming.</strong> <strong>And, for updating it to work with the new Homebridge v2 plugin spec. 🙇</strong></p><hr><p><strong>Update 9/25</strong></p><p>I recently upgraded to <a href="https://github.com/haywirecoder/homebridge-envisalink-ademco?ref=danielraffel.me">homebridge-envisalink-ademco version 2.1.0</a> and found that my config required a very minor tweak (in fact, I believe it may have already been broken sometime before the update). If you are upgrading, note that the speed key switches are removed from your existing first-person-arrives-home and last-person-leaves-home automations and need to be manually re-added. I adjusted my Speed Key settings to look like this in the Homebridge envisalink-ademco plug-in config:</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2025/09/image-1.png" class="kg-image" alt="" loading="lazy" width="1536" height="1480" srcset="https://danielraffel.me/content/images/size/w600/2025/09/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2025/09/image-1.png 1000w, https://danielraffel.me/content/images/2025/09/image-1.png 1536w" sizes="(min-width: 720px) 720px"></figure><p>And, my JSON config now looks like this:</p><pre><code>"speedKeys": [
    {
        "name": "Disarm",
        "speedcommand": "custom",
        "customcommand": "@pin1"
    },
    {
        "name": "Away",
        "speedcommand": "custom",
        "customcommand": "@pin2"
    }
]
</code></pre> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Set Up SSL on a Locally Hosted Home Assistant Instance for Push Notifications ]]></title>
        <description><![CDATA[ To enable push notifications in Home Assistant, you need to register SSL certificates for your domain. Fortunately, the Nginx Proxy Manager add-on makes this process straightforward. ]]></description>
        <link>https://danielraffel.me/til/2024/12/02/how-to-easily-setup-ssl-on-a-locally-hosted-home-assistant-instance-for-push-notifications/</link>
        <guid isPermaLink="false">674e074eabb4b1035236ab99</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 02 Dec 2024 12:30:57 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/12/DALL-E-2024-12-02-12.28.17---A-visually-striking-image-in-the-style-of-Paul-Rand--focusing-on-a-minimalist-design-to-convey-the-concept-of-setting-up-SSL-for-a-locally-hosted-Home.png" medium="image"/>
        <content:encoded><![CDATA[ <p>To enable push notifications in <a href="http://home-assistant.io/?ref=danielraffel.me" rel="noreferrer">Home Assistant</a>, you need to register SSL certificates for your domain. If your Home Assistant instance is running on a private network without exposed public ports (e.g., your home network), you’ll need to take a few extra steps to obtain an SSL certificate. Fortunately, the <a href="https://github.com/NginxProxyManager/nginx-proxy-manager?ref=danielraffel.me" rel="noreferrer">Nginx Proxy Manager</a> add-on makes this process straightforward. This guide covers configuring a domain hosted on Cloudflare but should be (mostly) applicable to other domain hosts as well.</p><h3 id="step-1-set-up-a-subdomain-on-your-dns-provider">Step 1: Set Up a Subdomain on Your DNS provider</h3><p>If you’re adding a subdomain to an existing domain, log in to your DNS provider (e.g., Cloudflare) and create a new A record. For this example, we'll set up <code>your.site.com</code> pointing to your Home Assistant's private IP address, such as <code>192.168.86.99</code>.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-3.png" class="kg-image" alt="" loading="lazy" width="2000" height="984" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-3.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-3.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/12/image-3.png 1600w, https://danielraffel.me/content/images/2024/12/image-3.png 2094w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Cloudflare DNS Management</span></figcaption></figure><h3 id="step-2-create-api-token-cloudflare-specific">Step 2: Create API Token (Cloudflare specific)</h3><p>This will differ depending on your DNS provider. On Cloudflare, you will need to create an API token by following these steps:</p><p>1. 
Register a new <a href="https://dash.cloudflare.com/profile/api-tokens?ref=danielraffel.me" rel="noreferrer">API Token</a> by tapping <strong>Create New Token</strong>.</p><p>2. <strong>Complete the following fields:</strong></p><ul><li><strong>Token Name:</strong> Name the token appropriately (e.g., "Home_Assistant") for clarity.</li><li><strong>Permissions:</strong><ul><li>Set <code>Zone &gt; Zone</code> to <code>Read</code>.</li><li>Set <code>Zone &gt; DNS</code> to <code>Edit</code>. This allows you to modify DNS records while ensuring other zone settings remain secure.</li></ul></li><li><strong>Zone Resources:</strong><ul><li>Restrict the token to a specific zone (<code>site.com</code>) to minimize exposure.</li></ul></li><li><strong>Client IP Address Filtering (Optional):</strong><ul><li>If you want to restrict API access to specific IPs (e.g., your Home Assistant instance), set the <code>Operator</code> to <code>Include</code> and input the trusted IP address or range.</li></ul></li></ul><p><strong>3. TTL (Optional):</strong></p><ul><li>If you prefer the token to have a limited lifespan, define the start and end date.</li></ul><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-12.png" class="kg-image" alt="" loading="lazy" width="1500" height="1534" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-12.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-12.png 1000w, https://danielraffel.me/content/images/2024/12/image-12.png 1500w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Cloudflare API Token Creation for </span><code spellcheck="false" style="white-space: pre-wrap;"><span>site.com</span></code></figcaption></figure><p>After proceeding, you’ll receive your unique API token. Be sure to save it in a secure location. 
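</p><p>For reference, when Nginx Proxy Manager later prompts for the <code>Credentials File Content</code> during the DNS challenge, it expects certbot's <code>dns-cloudflare</code> credentials format. A minimal sketch (the value is a placeholder for the token you just created):</p><pre><code class="language-bash"># Cloudflare DNS credentials (certbot dns-cloudflare format)
# Replace the placeholder with your actual API token
dns_cloudflare_api_token = YOUR_API_TOKEN_HERE
</code></pre><p>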
You’ll also need to use it in Step 4 when registering a new SSL certificate, where you’ll be prompted for the Cloudflare Credential file API token.</p><h3 id="step-3-install-nginx-proxy-manager">Step 3: Install Nginx Proxy Manager</h3><p>1. Install the <a href="https://github.com/NginxProxyManager/nginx-proxy-manager?ref=danielraffel.me" rel="noreferrer">Nginx Proxy Manager</a> add-on in Home Assistant.</p><p>2. Go to the <strong>Info</strong> tab and click <strong>Start</strong> to activate the add-on.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-6.png" class="kg-image" alt="" loading="lazy" width="2000" height="1066" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-6.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-6.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/12/image-6.png 1600w, https://danielraffel.me/content/images/2024/12/image-6.png 2128w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Info tab</span></figcaption></figure><p>3. Navigate to the <strong>Configuration</strong> tab, specify the port you’ll use for SSL (e.g., 443), and click <strong>Save</strong>.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-5.png" class="kg-image" alt="" loading="lazy" width="2000" height="1015" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-5.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-5.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/12/image-5.png 1600w, https://danielraffel.me/content/images/2024/12/image-5.png 2120w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Configuration tab</span></figcaption></figure><p>4. 
Return to the <strong>Info</strong> tab and click <strong>Open Web UI</strong> to access the Nginx Proxy Manager interface.</p><h3 id="step-4-configure-the-proxy-host">Step 4: Configure the Proxy Host</h3><p>1. In the Nginx Proxy Manager interface, go to <strong>Hosts &gt; Proxy Hosts</strong>.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-7.png" class="kg-image" alt="" loading="lazy" width="2000" height="507" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-7.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-7.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/12/image-7.png 1600w, https://danielraffel.me/content/images/size/w2400/2024/12/image-7.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Nginx Proxy Manager interface</span></figcaption></figure><p>2. On the new page, tap <strong>Add Proxy Host</strong>. </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-8.png" class="kg-image" alt="" loading="lazy" width="238" height="98"><figcaption><span style="white-space: pre-wrap;">After navigating to proxy hosts, tap </span><code spellcheck="false" style="white-space: pre-wrap;"><span>add proxy host</span></code></figcaption></figure><p>3. 
When the proxy host window pops up, enter your domain name (e.g., <code>your.site.com</code>), configure the forwarding hostname/IP (e.g., <code>192.168.86.99</code>), set the forward port (e.g., 8123), and enable <strong>Websockets Support</strong>.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image-1.png" class="kg-image" alt="" loading="lazy" width="980" height="1068" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-1.png 600w, https://danielraffel.me/content/images/2024/12/image-1.png 980w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">New proxy</span></figcaption></figure><p>4. Switch to the <strong>SSL</strong> tab and select:</p><ul><li><strong>Request a new SSL certificate</strong>. </li><li>Enable options: <strong>Force SSL</strong>, <strong>HTTP/2 Support</strong>, <strong>HSTS Enabled</strong>, <strong>HSTS Subdomains</strong>, and <strong>Use a DNS Challenge</strong>. </li><li>Select your DNS provider (e.g., Cloudflare). In the <code>Credentials File Content</code> section, make sure to update it with the API token from Step 2. </li><li>Attempt to save.</li></ul><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-2.png" class="kg-image" alt="" loading="lazy" width="970" height="1550" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-2.png 600w, https://danielraffel.me/content/images/2024/12/image-2.png 970w" sizes="(min-width: 720px) 720px"></figure><h3 id="might-need-to-troubleshoot-networking-issues">Might Need to Troubleshoot Networking Issues</h3><p>If you encounter issues while saving, check the Home Assistant logs to identify the problem:</p><p>1. Go to <strong>Settings &gt; System &gt; Logs</strong> in Home Assistant.</p><p>2. 
Tap the overflow menu and select <strong>Show Raw Logs</strong> to review detailed error messages.</p><p>For example, I encountered the following error:</p><pre><code class="language-bash">Logger: homeassistant.components.http.forwarded
Source: components/http/forwarded.py:90
Integration: HTTP
A request from a reverse proxy was received from 172.30.33.2, but your HTTP settings do not allow it.
</code></pre><p>This error indicates that the reverse proxy's IP (<code>172.30.33.2</code>) was being denied. To resolve this, add the following lines to your <code>configuration.yaml</code> file:</p><pre><code class="language-yaml">http:
  use_x_forwarded_for: true
  trusted_proxies:
    - 172.30.33.0/24
</code></pre><p>Restart Home Assistant for the changes to take effect.</p><h3 id="step-5-register-the-domain-again">Step 5: Register the Domain Again</h3><p>After addressing the issue, return to the Nginx Proxy Manager web UI and attempt to register the SSL certificate again. This time, the domain should hopefully register successfully.</p><h3 id="step-6-add-the-html-add-on">Step 6: Add the HTML5 Add-on</h3><p>I can’t remember if it was strictly necessary or exactly when I did it in this process, but I definitely installed the <a href="https://www.home-assistant.io/integrations/html5/?ref=danielraffel.me" rel="noreferrer">HTML5 Push Notifications</a> add-on and recommend doing the same.</p><h3 id="step-7-test-sending-a-notification-to-the-home-assistant-mobile-app">Step 7: Test Sending a Notification to the Home Assistant Mobile App</h3><p>1. In Home Assistant, go to <strong>Developer Tools &gt; Actions</strong>.</p><p>2. Assuming you've already installed the Home Assistant mobile app and successfully logged in, select the action <code>Notifications: Send a notification via mobile_app</code>, enter a message and title, and tap <code>perform action</code>.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-9.png" class="kg-image" alt="" loading="lazy" width="2000" height="1207" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-9.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-9.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/12/image-9.png 1600w, https://danielraffel.me/content/images/size/w2400/2024/12/image-9.png 2400w" sizes="(min-width: 720px) 720px"></figure><p>3. 
Assuming all is working, you'll hopefully see the notification on your mobile device.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-10.png" class="kg-image" alt="" loading="lazy" width="946" height="308" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-10.png 600w, https://danielraffel.me/content/images/2024/12/image-10.png 946w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Optimized Charging My EV Using Excess Solar Power with Home Assistant ]]></title>
        <description><![CDATA[ As an EV owner with home solar, I wanted to optimize my charging process to make the most out of excess solar energy. This motivated me to explore ways to automate my EV charging, ultimately leading to a complex setup with two automations specifically designed to meet my goals. ]]></description>
        <link>https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/</link>
        <guid isPermaLink="false">674a044447db19034c3ea6e7</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 29 Nov 2024 21:37:58 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/11/DALL-E-2024-11-29-21.33.30---A-Paul-Rand-inspired-geometric-illustration-featuring-a-Tesla-car-charging-from-a-solar-powered-system.-The-design-uses-bold--clean-lines-with-abstrac.png" medium="image"/>
        <content:encoded><![CDATA[ <p><strong>TL;DR: How I transformed my basic EV charger into a smart one for just $6, enabling it to charge from excess solar and battery storage—skipping the need for a $600-$1200 aftermarket charger with fewer features and no battery support. </strong></p><p><strong>Update: </strong><a href="https://danielraffel.me/2025/01/03/how-i-automated-ev-charging-using-evcc-an-eg4-solar-inverter-and-a-tesla-mobile-connector/" rel="noreferrer"><strong>Definitely consider EVCC i</strong></a><a href="https://danielraffel.me/2025/01/03/how-i-automated-ev-charging-using-evcc-an-eg4-solar-inverter-and-a-tesla-mobile-connector/" rel="noreferrer"><strong>nstead</strong></a><strong>.</strong></p><h3 id="overview">Overview</h3><p>As a Tesla owner with an <a href="https://danielraffel.me/2024/10/28/how-i-built-a-real-time-solar-dashboard-for-my-eg4-18kpv-inverter-using-home-assistant/" rel="noreferrer">EG4 home solar setup</a>, I wanted to optimize my charging process to make the most out of excess solar energy. I charge my vehicle with the <a href="https://www.tesla.com/support/charging/mobile-connector?ref=danielraffel.me" rel="noreferrer">mobile connector</a> that came with my car. It lacks internet connectivity, cannot be programmed to start or stop charging remotely, and cannot dynamically adjust amperage based on the excess energy currently being harvested.</p><p>This inspired me to explore automation options for my EV charging, ultimately leading to the development of two automations in Home Assistant tailored to achieve my goals.</p><p>While I thoroughly enjoyed going down this rabbit hole, I’d not recommend it to others unless you really enjoy <a href="https://dictionary.cambridge.org/us/dictionary/english/geek-out?ref=danielraffel.me" rel="noreferrer">geeking out</a>. 
Premium vertical solar integrations from <a href="https://www.tesla.com/support/tesla-app/charge-on-solar?ref=danielraffel.me" rel="noreferrer">Tesla</a>, <a href="https://www.span.io/drive?ref=danielraffel.me" rel="noreferrer">Span</a>, <a href="https://support.enphase.com/s/article/how-do-i-charge-my-ev-using-only-solar-energy?ref=danielraffel.me" rel="noreferrer">Enphase</a>, and aftermarket EV smart chargers like <a href="https://www.emporiaenergy.com/emporia-ev-charger-with-load-management/?ref=danielraffel.me" rel="noreferrer">Emporia</a> and <a href="https://wallbox.com/en_us/pulsar-plus-ev-charger?ref=danielraffel.me" rel="noreferrer">Wallbox</a> offer reliable and robust smart EV charging solutions compared to the custom setup I’m about to describe in detail. They're worth considering! While they do have some limitations, they provide a hassle-free experience. That said, I wanted to challenge myself to build a custom setup on the cheap and add some functionality they don’t support.</p><hr><h2 id="table-of-contents">Table of Contents</h2><ul><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#background" rel="noreferrer">Background</a></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#use-case-overview" rel="noreferrer">Use Case Overview</a><ul><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#energy-flow-and-efficiency-challenges" rel="noreferrer">Energy Flow and Efficiency Challenges</a></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#about-the-automations" rel="noreferrer">About the Automations</a></li></ul></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#prerequisites" 
rel="noreferrer">Prerequisites</a><ul><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#mqtt-broker-configuration" rel="noreferrer">MQTT Broker Configuration</a></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#home-assistant-configuration" rel="noreferrer">Home Assistant Configuration</a><ul><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#creating-input-numbers-for-thresholds" rel="noreferrer">Creating Input Numbers for Thresholds</a></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#input-datetime-for-tracking" rel="noreferrer">Input Datetime for Tracking</a></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#binary-sensors" rel="noreferrer">Binary Sensors</a></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#input-boolean-for-manual-charging" rel="noreferrer">Input Boolean for Manual Charging</a></li></ul></li></ul></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#the-automation-workflows" rel="noreferrer">The Automation Workflows</a><ul><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#charge-tesla-without-draining-home-battery" rel="noreferrer">Charge Tesla Without Draining Home Battery</a></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#charge-tesla-from-pv-excess" rel="noreferrer">Charge Tesla from PV Excess</a></li></ul></li><li><a 
href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#custom-dashboard-setup" rel="noreferrer">Custom Dashboard Setup</a></li><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/#conclusion" rel="noreferrer">Conclusion</a></li></ul><hr><h2 id="background">Background</h2><p>When I installed solar panels and batteries, I initially explored upgrading to a smart EV charger, such as the <a href="https://shop.emporiaenergy.com/products/emporia-level-2-ev-charger-with-load-management?srsltid=AfmBOooK0Bn2vXoSulum2yw_S84oWIeLRlwlFrwjtkWv4I1jNCy_5grl&ref=danielraffel.me" rel="noreferrer">Emporia Level 2 EV Charger with PowerSmart Load Management</a> or the <a href="https://wallbox.com/en_us/pulsar-plus-ev-charger?ref=danielraffel.me" rel="noreferrer">Wallbox Pulsar Plus</a> paired with the <a href="https://wallbox.com/en_ca/power-meter-for-energy-management-solutions?ref=danielraffel.me" rel="noreferrer">required Power Meter</a>. </p><p>Both of these solutions can adjust the amperage they supply to an EV based on excess solar production. By dynamically reducing or increasing the charging rate, they ensure efficient use of available solar energy, minimize reliance on the grid, and help avoid overloading the system during periods of lower solar output.</p><p>However, their costs range from $600 to $1,200. Moreover, the Emporia charger depends on a cloud provider that could potentially shut down, leaving the unit unusable. Neither option offered integrations that consider home batteries, limiting sourcing options to grid-only, grid and photovoltaic (PV), or PV alone. </p><p>For me, it was important to account for both charging and discharging my home battery when charging my electric vehicle (EV). 
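</p><p>In Home Assistant terms, that preference reduces to a user-adjustable threshold that automations can compare against before starting a charge. A minimal sketch of such a helper (the name and values here are my own, not from the final setup):</p><pre><code class="language-yaml"># configuration.yaml (sketch): minimum home battery level to preserve
# while EV charging; automations check against this before drawing power
input_number:
  home_battery_reserve:
    name: Home Battery Reserve
    min: 10
    max: 100
    step: 5
    unit_of_measurement: "%"
</code></pre><p>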
While I’m comfortable using some of my home battery’s stored energy to charge my EV, it's not efficient, and I want to ensure there’s always enough reserve for essential household needs and emergencies.</p><p>While researching how to integrate my EV charger into my solar setup, I discovered that my inverter features a <a href="https://eg4electronics.com/wp-content/uploads/2024/09/EG4-18kPV-Smart-Loads-Wiring-Diagram.pdf?ref=danielraffel.me" rel="noreferrer">Smart Loads</a> plug. </p><blockquote>Smart Loads are loads that the user wants to intelligently, strategically, and automatically enable and disable for the purposes of <a href="https://www.techtarget.com/searchdatacenter/definition/load-shedding?ref=danielraffel.me#:~:text=Load%20shedding%20(loadshedding)%20is%20a,primary%20power%20source%20can%20supply." rel="noreferrer">Load Shedding</a> and Power Shedding in order to maximize Time of Use savings, Off-Grid operation, or maximize or minimize power sold back to the grid.</blockquote><p>Initially, it seemed like a promising solution, offering configurable thresholds and failover scenarios to manage how the grid, batteries, and PV supply power to the plug.</p><p>However, further research revealed Smart Loads to be buggy, with unclear logic and poor documentation. My conclusion was that the inverter’s hardware was originally designed as an input plug for generators, and later adapted via software for dual-purpose use as a power output—without fully addressing the hardware requirements for the additional functions that were added in software. I decided I wasn't comfortable relying on a hacky plug for my EV charging.</p><p>To address these challenges, I aimed to purpose-build a solution using a combination of software tools and low-cost hardware components. 
The setup described below dynamically adjusts the Tesla’s charging amperage based on real-time solar production, home battery status, and my specific charging requirements.</p><p>I chose to use the <a href="https://github.com/yoziru/esphome-tesla-ble?ref=danielraffel.me" rel="noreferrer">ESPHome Tesla BLE</a> integration, which communicates with the car via Bluetooth to adjust charging settings,&nbsp; instead of the <a href="https://developer.tesla.com/docs/fleet-api?ref=danielraffel.me" rel="noreferrer">Tesla Fleet API</a> for several reasons. By avoiding the Fleet API, I sidestepped potential rate limits and dependency on a hosted cloud service for real-time adjustments. I also anticipated Tesla introducing fees that would make the API prohibitively expensive, a prediction that has since <a href="https://electrek.co/2024/11/28/tesla-releases-api-pricing-dev-says-would-cost-60-million-per-year-to-run-his-3rd-party-app/?ref=danielraffel.me" rel="noreferrer">proven accurate</a>.</p><p>In a previous post, I explained <a href="https://danielraffel.me/til/2024/11/27/how-i-set-up-an-esp32-ble-key-with-my-tesla-and-home-assistant-for-solar-charging-optimization/" rel="noreferrer">how I set up ESPHome on an ESP32 chip and integrated it with my Tesla and Home Assistant</a> to optimize solar charging.</p><p>In this post, I’ll guide you through automating the system using <a href="https://www.home-assistant.io/?ref=danielraffel.me" rel="noreferrer">Home Assistant</a>, <a href="https://esphome.io/?ref=danielraffel.me" rel="noreferrer">ESPHome</a>, <a href="http://teslamate.org/?ref=danielraffel.me" rel="noreferrer">Teslamate</a>, <a href="https://mosquitto.org/?ref=danielraffel.me" rel="noreferrer">Eclipse MQTT</a>, and a <a href="https://eg4electronics.com/categories/inverters/eg4-18kpv-12lv-all-in-one-hybrid-inverter/?ref=danielraffel.me" rel="noreferrer">EG4 solar inverter</a>. I’ll also show you how to use a dashboard to monitor the automations and adjust parameters. 
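</p><p>To give a flavor of the glue involved, here is a minimal sketch of a template sensor that estimates the excess PV power available for charging. The entity names are assumptions; substitute whatever sensors your inverter integration actually exposes:</p><pre><code class="language-yaml"># configuration.yaml (sketch): excess PV = production minus house load, floored at zero
# sensor.pv_power and sensor.load_power are placeholder entity names
template:
  - sensor:
      - name: "PV Excess Power"
        unique_id: pv_excess_power
        unit_of_measurement: "W"
        state: >
          {{ [states('sensor.pv_power') | float(0)
              - states('sensor.load_power') | float(0), 0] | max }}
</code></pre><p>The automations described below can then read a sensor like this to decide when to raise or lower the charging amperage.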
</p><p><strong>A quick note: if you decide to use this, proceed at your own risk—I make no guarantees about the reliability of my Rube Goldberg-style charging system.</strong></p><hr><h2 id="use-case-overview">Use Case Overview</h2><p>The goal is to dynamically adjust the Tesla's charging amperage based on:</p><ul><li><strong>Excess Solar Production</strong>: Utilize surplus solar energy that would otherwise be stored in home batteries or exported to the grid.</li><li><strong>Home Battery Status</strong>: Prioritize charging the home battery based on preferred battery reserves.</li><li><strong>Car's Rated Battery Range</strong>: Adjust charging priority dynamically for daily use—lowering priority when the range exceeds 120 miles and raising it when the range drops below 99 miles.</li><li><strong>Support for Manual Override:</strong>&nbsp;For long trips requiring a full charge, I need the option to activate a highly reliable “one-click” mode to charge the car at maximum amperage using the grid while setting a custom battery discharge threshold to prevent the home battery from depleting to its minimum allowed levels. Although this scenario is infrequent, it is crucial that, once activated, it runs uninterrupted by any other automations during the charging period.</li></ul><h3 id="energy-flow-and-efficiency-challenges">Energy Flow and Efficiency Challenges</h3><p>I quickly discovered that optimizing energy usage requires a basic understanding of the physics of electrical flow, particularly the interplay between DC (direct current) and AC (alternating current). Solar panels and home batteries typically operate on DC power, while most household appliances use AC. 
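</p><p>Every hop between DC and AC costs energy. To put rough numbers on it, here's a small sketch; the ~3% DC-to-DC and ~10% DC-to-AC loss figures are approximations I also cite later in this post, not measured values for any particular hardware.</p><pre><code class="language-python"># Rough numbers, not measurements: assume about 3% loss per DC-to-DC
# conversion and about 10% loss for DC-to-AC inversion.
DC_TO_DC_EFF = 0.97  # e.g., PV into the home battery
DC_TO_AC_EFF = 0.90  # inverter output to the 240 V circuit

def usable_ac_watts(pv_dc_watts, via_battery=True):
    """Estimate AC watts reaching the EV charger from DC solar input."""
    dc = pv_dc_watts * DC_TO_DC_EFF if via_battery else pv_dc_watts
    return dc * DC_TO_AC_EFF

print(round(usable_ac_watts(5000, via_battery=False)))  # 4500 W straight from PV
print(round(usable_ac_watts(5000, via_battery=True)))   # 4365 W with an extra DC-to-DC hop
</code></pre><p>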
Converting between these forms introduces efficiency losses, and while my solar system can invert in one direction and convert in the other, it cannot perform both functions simultaneously.</p><p>In an ideal setup, I would have the flexibility to power my house with excess PV energy during the day while simultaneously charging my home battery and, when needed, discharging the battery to supply power to both my car and home concurrently. Unfortunately, I do not have that option. </p><p>To maximize energy efficiency and operate within physical constraints, I discovered that I need to carefully manage power flows, balancing the charging of the home and car batteries during the day with excess solar energy while reserving the battery to power the house at night.</p><p>A smart energy management system is essential to address these challenges. It optimizes energy efficiency, fulfills typical daily EV driving requirements, and ensures the home battery remains a dependable power source, reducing reliance on the grid. To achieve this, I developed a few automations.</p><h3 id="about-the-automations">About the Automations</h3><ol><li><strong>Charge Tesla from PV Excess</strong>&nbsp;(Automatic):</li></ol><ul><li><strong>Purpose</strong>: Automatically harvest excess solar energy to charge the Tesla, suitable for daily driving needs.</li><li><strong>Operation:</strong>&nbsp;This automation runs during daylight hours when solar production is sufficient, dynamically adjusting the EV charging rate based on real-time solar data, the car’s battery range, and home battery thresholds.</li><li><strong>Priority</strong>: Lower priority; designed not to interfere with manual charging needs.</li></ul><ol start="2"><li><strong>Charge Tesla Without Draining Home Battery</strong>&nbsp;(Manual):</li></ol><ul><li><strong>Purpose</strong>: Manually triggered to charge the Tesla at maximum power without fully depleting the home battery. 
Designed for infrequent usage when the car needs to be fully charged for a long road trip.</li><li><strong>Operation</strong>: Temporarily overrides the PV excess automation to prioritize rapid charging, with an adjustable threshold for the battery’s discharge rate. This threshold ensures the home battery contributes without being fully drained and resets to its original setting afterward.</li><li><strong>Priority</strong>: Higher priority; it can override the PV excess automation but not vice versa, preventing potential disruptions to important travel plans.</li></ul><p>By implementing both automations, it's possible to:</p><ul><li><strong>Optimize Daily Charging</strong>: Make the most of solar energy for everyday driving.</li><li><strong>Ensure Readiness for Trips</strong>: Quickly charge the car when needed without worrying about home battery levels.</li></ul><hr><h2 id="prerequisites">Prerequisites</h2><p><a href="https://www.youtube.com/watch?v=T5ut41xU2TI&ref=danielraffel.me" rel="noreferrer">Brace yourself, Jason</a>. 
To set up this system, you'll need the following:</p><ul><li><a href="http://home-assistant.io/?ref=danielraffel.me" rel="noreferrer">Home Assistant</a> instance: To host the automations, integrate all the data sources, and expose a real-time dashboard.</li><li><a href="https://github.com/zakery292/monitormysolar/tree/main?ref=danielraffel.me">Monitor My Solar HACS Integration</a> with their <a href="https://monitormy.solar/detail/13?ref=danielraffel.me">Dongle</a>: For reading and writing data to an <a href="https://eg4electronics.com/categories/inverters/eg4-12kpv-all-in-one-hybrid-inverter/?ref=danielraffel.me" rel="noreferrer">EG4 18kpv solar inverter</a> connected to the grid, batteries, PV panels, and an EV charger.</li><li><a href="https://teslamate.org/?ref=danielraffel.me">Teslamate</a> with the <a href="https://docs.teslamate.org/docs/integrations/home_assistant?ref=danielraffel.me">Teslamate HA MQTT Integration</a>: For reading the vehicle's rated battery range in miles.</li><li><a href="https://mosquitto.org/?ref=danielraffel.me" rel="noreferrer">Eclipse MQTT</a> (in bridge mode): To allow Home Assistant to import MQTT data from both the solar inverter Dongle and vehicle stats from Teslamate.</li><li><a href="https://github.com/yoziru/esphome-tesla-ble?ref=danielraffel.me">ESPHome Tesla BLE</a> with an <a href="https://danielraffel.me/til/2024/11/27/how-i-set-up-an-esp32-ble-key-with-my-tesla-and-home-assistant-for-solar-charging-optimization/">ESP32 Chip</a>: To communicate with the Tesla over Bluetooth to know when the charge flap is open, adjust amperage rates, and manage charging.</li></ul><h2 id="mqtt-broker-configuration">MQTT Broker Configuration</h2><p>Since Home Assistant only allows one MQTT integration, I needed to bridge data from both the solar inverter dongle and Teslamate. 
I used <a href="https://mosquitto.org/?ref=danielraffel.me">Eclipse Mosquitto</a> to handle this.</p><p><strong>Update your Mosquitto configuration file at <code>/usr/local/etc/mosquitto/mosquitto.conf</code>:</strong></p><pre><code class="language-ini"># Default listener for authenticated clients (dongle)
listener 1883 0.0.0.0
allow_anonymous false
password_file /etc/mosquitto/passwd

# Listener for TeslaMate (no authentication required)
listener 1884 0.0.0.0
allow_anonymous true

# Persistence settings
persistence true
persistence_location /usr/local/var/lib/mosquitto/

# Logging settings
log_dest file /usr/local/var/log/mosquitto/mosquitto.log
log_type error
log_type warning
log_type notice
log_type information

# Bridge configuration to ingest TeslaMate data
connection teslamate_bridge
address tesla.yourdomain.com:1883
topic teslamate/# in
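
# Optional sanity check (assumption: broker reachable on localhost with
# the credentials from password_file above). From another shell, confirm
# bridged TeslaMate topics are arriving:
#   mosquitto_sub -h localhost -p 1883 -u YOUR_USER -P YOUR_PASS -t 'teslamate/#' -v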
</code></pre><p><em><strong>Note:</strong> Replace <code>tesla.yourdomain.com</code>  with your actual Teslamate domain.</em></p><p><strong>Key Details:</strong></p><ul><li><strong>Multiple Listeners</strong>: Configured two listeners to handle authenticated (dongle) and unauthenticated (Teslamate) clients.</li><li><strong>Bridging</strong>: The <code>connection</code> and <code>topic</code> settings bridge data from Teslamate into the MQTT broker.</li><li><strong>Note:</strong> My Teslamate instance is running on my Tailscale network with no open ports, so I’m not using a password. However, you might want to consider setting one!</li></ul><hr><h2 id="home-assistant-configuration">Home Assistant Configuration</h2><p>All these YAML configuration files must be added to Home Assistant, and the system needs to be restarted for the changes to take effect.</p><h3 id="add-to-configurationyaml">Add to <code>configuration.yaml</code></h3><p>To keep things organized, I included separate YAML files for different configurations. These files will need to be created and live at the same level as your Home Assistant <code>configuration.yaml</code>.</p><pre><code class="language-yaml">input_number: !include tesla_charging.yaml
input_datetime: !include input_datetime.yaml
input_boolean: !include input_boolean.yaml
binary_sensor: !include binary_sensor.yaml
</code></pre><h3 id="creating-input-numbers-for-thresholds">Creating Input Numbers for Thresholds</h3><p><strong>Create <code>tesla_charging.yaml</code> with the following content:</strong></p><pre><code class="language-yaml">tesla_range_threshold:
  name: Tesla Range Threshold  
  min: 50                             
  max: 200                            
  step: 1                             
  unit_of_measurement: "mi"
  icon: mdi:car-battery        
  mode: box                           
  initial: 120                        

tesla_priority_range_threshold:
  name: Tesla Priority Range Threshold
  min: 50                          
  max: 200                         
  step: 1                          
  unit_of_measurement: "mi"
  icon: mdi:car-electric     
  mode: box                        
  initial: 99                      

tesla_high_battery_threshold:
  name: Home Battery High Threshold
  min: 0                          
  max: 100                        
  step: 1                         
  unit_of_measurement: "%"
  icon: mdi:battery-high    
  mode: box                       
  initial: 65                     

tesla_low_battery_threshold:
  name: Home Battery Low Threshold
  min: 0                  
  max: 100                
  step: 1                
  unit_of_measurement: "%"
  icon: mdi:battery-low   
  mode: box                    
  initial: 50                  

tesla_home_power_buffer:
  name: Home Power Buffer 
  min: 0                       
  max: 2000                       
  step: 100                       
  unit_of_measurement: "W"        
  icon: mdi:home-lightning-bolt
  mode: box                 
  initial: 500                    

tesla_validation_period:
  name: Validation Wait Time
  min: 1                
  max: 60                   
  step: 1                   
  unit_of_measurement: minutes
  icon: mdi:timer-outline
  mode: box    
  initial: 3  # Default wait time
</code></pre><p><strong>Key Details:</strong></p><ul><li><strong>Thresholds</strong>: Define customizable thresholds for car range, battery levels, and power buffer.</li><li><strong>Adjustable Parameters</strong>: These inputs allow you to tweak the system behavior directly from a friendly dashboard in Home Assistant so you don't need to dive into code to modify the <strong>Charge Tesla from PV Excess</strong> automation business logic.</li></ul><h3 id="input-datetime-for-tracking">Input Datetime for Tracking</h3><p>Used to store timestamps for when certain conditions are met.</p><p><strong>Create <code>input_datetime.yaml</code>:</strong></p><pre><code class="language-yaml">tesla_last_below_min_amps:
  name: Last Time Below Min Amps
  has_date: true
  has_time: true
</code></pre><h3 id="binary-sensors">Binary Sensors</h3><p>Define sensors to monitor the charging status of the car and home battery.</p><p><strong>Add to <code>binary_sensor.yaml</code>:</strong></p><pre><code class="language-yaml">- platform: template
  sensors:
    tesla_charging_status:
      friendly_name: "EV Charging"
      value_template: &gt;-
        {{ is_state('switch.tesla_ble_your_device_id_charger_switch', 'on') }}
      attribute_templates:
        power: &gt;-
          {% if is_state('switch.tesla_ble_your_device_id_charger_switch', 'on') %}
            {% set amps = states('number.tesla_ble_your_device_id_charging_amps') | float %}
            {{ (amps * 240) | round }}
          {% else %}
            0
          {% endif %}

    home_battery_charging_status:
      friendly_name: "Battery Charging"
      value_template: &gt;-
        {% set flow = states('sensor.dongle_device_id_batteryflow_live') | float %}
        {{ flow &gt; 0 }}
      attribute_templates:
        power: &gt;-
          {% set flow = states('sensor.dongle_device_id_batteryflow_live') | float %}
          {{ flow if flow &gt; 0 else 0 }}
</code></pre><p><strong>Note:</strong> To customize code on this page for your own use, you'll need to replace:</p><ul><li><code>your_device_id</code>&nbsp;with your <a href="https://danielraffel.me/til/2024/11/27/how-i-set-up-an-esp32-ble-key-with-my-tesla-and-home-assistant-for-solar-charging-optimization/" rel="noreferrer">Tesla BLE device ID</a> (e.g., if your device ID is abc123, use&nbsp;<code>tesla_ble_abc123_charging_amps</code>)</li><li><code>dongle_device_id</code>&nbsp;with your device ID (e.g., if your <a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/" rel="noreferrer">dongle ID</a> is dongle-XX:XX:XX:XX:XX:XX, use&nbsp;<code>dongle_XX_XX_XX_XX_XX_XX_soc</code>)</li></ul><h3 id="input-boolean-for-manual-charging">Input Boolean for Manual Charging</h3><p>To manually control the charging process and prevent automations from interfering.</p><p><strong>Create <code>input_boolean.yaml</code>:</strong></p><pre><code class="language-yaml">tesla_manual_charging_active:
  name: Tesla Manual Charging Active
  initial: off
</code></pre><hr><h2 id="the-automation-workflows">The Automation Workflows</h2><p>In Home Assistant, the following two automations run independently: one is manual, and the other is automatic. They are specifically designed to avoid interfering with each other, even if triggered simultaneously. Additionally, they include some mechanisms to recover and resume functionality if Home Assistant is interrupted or restarted.</p><h3 id="charge-tesla-without-draining-home-battery">Charge Tesla Without Draining Home Battery</h3><p>This automation is triggered manually and is designed to ensure that charging the Tesla doesn't deplete the home battery below desired thresholds. It also includes recovery logic to reset settings if charging is interrupted. Currently, I’m allowing my batteries to help charge the car when I’m in a rush, though this may not be a long-term solution due to efficiency losses—around 3% for DC-to-DC conversion and another 10% for DC-to-AC. My reasoning for doing this now is that I installed the system in winter and won’t start earning credits by selling energy back until summer.</p><h4 id="business-logic">Business Logic</h4><ul><li><strong>Manual Activation</strong>: This automation is manually triggered when you need to charge the car at maximum power, such as before a long trip.</li><li><strong>Home Battery Protection</strong>:<br>My home batteries are set to discharge to 10% when connected to the grid.<ul><li>If the home battery's state of charge (SOC) is above 51%, it allows the battery to discharge down to 50%.</li><li>If the SOC is below 50%, it sets the end of discharge (EOD) to 50% to prevent the battery from draining further. When EV charging is complete the EOD is set back to 10%.<ul><li><em><strong>Note: </strong>I’ve heard that the chips in the EG4 inverter use NOR flash, which may have a write cycle lifespan of 10,000 to 100,000 cycles. 
To preserve their longevity, I’ve been cautious about minimizing the frequency of register writes.</em></li></ul></li></ul></li><li><strong>Charging Process</strong>:<ul><li>Wakes up the car and sets the charging amperage to 32A (maximum).</li><li>Turns on the charger switch to start charging.</li></ul></li><li><strong>Monitoring and Adjustment</strong>:<ul><li>Checks if the charger is still on and adjusts the EOD based on the home battery SOC.</li><li>Stops when the car reaches its charging limit or if charging is manually stopped.</li></ul></li><li><strong>Recovery Logic</strong>:<ul><li>If charging is interrupted, the automation resets the home battery EOD to its initial value.</li><li>Turns off the manual charging indicator to allow other automations to run when complete even if there's a reboot mid process.</li></ul></li></ul><h4 id="adjustable-parameters">Adjustable Parameters</h4><ul><li><strong>Home Battery Thresholds</strong>:<ul><li>Adjust&nbsp;<code>tesla_high_battery_threshold</code>&nbsp;and&nbsp;<code>tesla_low_battery_threshold</code>&nbsp;to set when the automation should adjust the home battery EOD.</li></ul></li><li><strong>Car Charging Limit</strong>:<ul><li>Set via the car's settings; the automation respects the car's charging limit (e.g. 80%, etc.)</li></ul></li></ul><p><strong>Automation YAML:</strong></p><pre><code class="language-yaml">alias: Charge Tesla Don't Kill Home Battery
description: &gt;
  Charges Tesla while managing home battery discharge levels with recovery logic.
  Ensures the home battery doesn't drain excessively during car charging.

trigger:
  - platform: event
    event_type: call_service
    event_data:
      domain: automation
      service: trigger
      service_data:
        entity_id: automation.charge_tesla_dont_kill_home_battery
  - platform: homeassistant
    event: start
  - platform: state
    entity_id: switch.tesla_ble_your_device_id_charger_switch
    to: "off"

condition:
  - condition: state
    entity_id: input_boolean.tesla_manual_charging_active
    state: "on"
  - condition: template
    value_template: &gt;
      {{ (states('number.tesla_ble_your_device_id_battery_level') | float) &lt;
      (states('number.tesla_ble_your_device_id_charging_limit') | float) }}

action:
  - service: input_boolean.turn_on
    target:
      entity_id: input_boolean.tesla_manual_charging_active

  - service: switch.turn_on
    target:
      entity_id: switch.tesla_ble_your_device_id_ble_connection

  - delay: "00:00:01"

  - service: button.press
    target:
      entity_id: button.tesla_ble_your_device_id_wake_up

  - delay: "00:00:02"

  - service: number.set_value
    target:
      entity_id: number.tesla_ble_your_device_id_charging_amps
    data:
      value: 32

  - variables:
      initial_eod: "{{ states('number.dongle_device_id_eod') }}"

  - choose:
      - conditions:
          - condition: numeric_state
            entity_id: sensor.dongle_device_id_soc
            above: 51
        sequence:
          - service: number.set_value
            target:
              entity_id: number.dongle_device_id_eod
            data:
              value: 10
      - conditions:
          - condition: numeric_state
            entity_id: sensor.dongle_device_id_soc
            below: 50
        sequence:
          - service: number.set_value
            target:
              entity_id: number.dongle_device_id_eod
            data:
              value: 50

  - service: switch.turn_on
    target:
      entity_id: switch.tesla_ble_your_device_id_charger_switch

  - repeat:
      sequence:
        - condition: state
          entity_id: switch.tesla_ble_your_device_id_charger_switch
          state: "on"
        - choose:
            - conditions:
                - condition: numeric_state
                  entity_id: sensor.dongle_device_id_soc
                  below: 50
              sequence:
                - service: number.set_value
                  target:
                    entity_id: number.dongle_device_id_eod
                  data:
                    value: 50
        - delay: "00:00:30"  # re-check every 30 seconds while charging
        - condition: template
          value_template: &gt;
            {{ (states('number.tesla_ble_your_device_id_battery_level') | float) &lt;
            (states('number.tesla_ble_your_device_id_charging_limit') | float) }}
      until:
        - condition: or
          conditions:
            - condition: template
              value_template: &gt;
                {{ (states('number.tesla_ble_your_device_id_battery_level') | float) &gt;=
                (states('number.tesla_ble_your_device_id_charging_limit') | float) }}
            - condition: state
              entity_id: switch.tesla_ble_your_device_id_charger_switch
              state: "off"

  - service: input_boolean.turn_off
    target:
      entity_id: input_boolean.tesla_manual_charging_active

  - service: number.set_value
    target:
      entity_id: number.dongle_device_id_eod
    data:
      value: "{{ initial_eod }}"
</code></pre><p><em><strong>Note:</strong> Replace <code>your_device_id</code> with the name of your </em><a href="https://danielraffel.me/til/2024/11/27/how-i-set-up-an-esp32-ble-key-with-my-tesla-and-home-assistant-for-solar-charging-optimization/" rel="noreferrer"><em>ESPHome device</em></a><em> and <code>dongle_device_id</code> with the name of your </em><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/" rel="noreferrer"><em>dongle device macID</em></a><em>.</em></p><p><strong>Key Modifications and Choices:</strong></p><ul><li><strong>Manual Charging Indicator</strong>: Uses <code>input_boolean.tesla_manual_charging_active</code> to prevent other automations from interfering.</li><li><strong>Recovery Logic</strong>: If charging is interrupted (e.g., charger switch turns off), the automation resets the home battery's EOD to its initial value.</li><li><strong>Dynamic EOD Adjustment</strong>: Adjusts the home battery EOD based on its current SOC.</li><li><strong>Repeat Loop with Exit Conditions</strong>: Continues to adjust EOD and monitor charging until the car is fully charged or charging stops.</li></ul><hr><h3 id="charge-tesla-from-pv-excess">Charge Tesla from PV Excess</h3><p>This automation is triggered automatically and is designed to dynamically adjust the Tesla's charging amperage based on excess solar production and the state of the home battery.</p><h4 id="business-logic-1">Business Logic</h4><ul><li><strong>Automatic Operation</strong>: Runs automatically during daylight hours when the car is plugged in and excess solar power is available.</li><li><strong>Charging Logic</strong>:<ul><li><strong>Car Range Priority</strong>:<ul><li>If car range is below the priority threshold (e.g., 99 miles), prioritize charging the car with up to 95% of the excess solar power.</li><li>If car range is between the priority threshold and the general threshold (e.g., 99–120 miles), allocate 50–70% of 
excess power to the car based on home battery SOC.</li><li>If car range is above the general threshold (e.g., above 120 miles), allocate less power to the car, prioritizing the home battery.</li></ul></li><li><strong>Home Battery SOC</strong>:<ul><li>Adjusts the allocation of excess power based on the home battery's SOC.</li><li>If the home battery is low, more power is allocated to it; if it's high, more power can go to the car.</li></ul></li></ul></li><li><strong>Amperage Calculation</strong>:<ul><li>Calculates the available amperage for charging based on excess solar power.</li><li>Applies a safety limit to ensure the amperage stays between 0 and 32 A.</li></ul></li><li><strong>Validation Period</strong>:<ul><li>Includes a validation period (e.g., 3 minutes) after the calculated amperage falls below the sustainable minimum, to confirm conditions have recovered before attempting to re-engage the charger.</li><li>This is one of the few stopgaps to prevent frequent on/off cycles that could lead to <a href="https://www.tesla.com/support/range?ref=danielraffel.me#:~:text=In%20some%20cases%2C%20you%20may,using%20those%20features%20when%20possible." rel="noreferrer">vampire drain</a>.</li></ul></li></ul><h4 id="adjustable-parameters-1">Adjustable Parameters</h4><ul><li><strong>Thresholds</strong>:<ul><li><code>tesla_range_threshold</code>: Above this range, car charging is deprioritized.</li><li><code>tesla_priority_range_threshold</code>: Below this range, car charging is prioritized.</li><li><code>tesla_high_battery_threshold</code>&nbsp;and&nbsp;<code>tesla_low_battery_threshold</code>: Define the SOC levels for the home battery to adjust power allocation.</li></ul></li><li><strong>Home Power Buffer</strong>:<ul><li><code>tesla_home_power_buffer</code>: Watts reserved to account for rapid fluctuations in home consumption.</li></ul></li><li><strong>Validation Period</strong>:<ul><li><code>tesla_validation_period</code>: Time in minutes that the amperage must be above the minimum sustainable level (e.g. 
zero) before starting charging.</li></ul></li></ul><p><strong>Automation YAML:</strong></p><pre><code class="language-yaml">alias: 🔌 Charge Tesla From PV Excess
description: &gt;
  Optimizes Tesla charging based on solar production and home battery state.
  Includes a validation period to prevent frequent adjustments.

trigger:
  - platform: state
    entity_id: binary_sensor.tesla_plugged_in
    to: "on"
  - platform: time_pattern
    minutes: "/1"  # re-evaluate charging conditions once per minute

condition:
  - condition: state
    entity_id: input_boolean.tesla_manual_charging_active
    state: "off"
  - condition: state
    entity_id: binary_sensor.tesla_plugged_in
    state: "on"
  - condition: template
    value_template: &gt;
      {{ states('number.tesla_ble_your_device_id_battery_level') | float &lt; 
      states('number.tesla_ble_your_device_id_charging_limit') | float }}
  - condition: numeric_state
    entity_id: sensor.dongle_device_id_pall
    above: 1000
  - condition: sun
    before: sunset
    after: sunrise
  - condition: template
    value_template: &gt;
      {% set solar_production = states('sensor.dongle_device_id_pall') | float %}
      {% set home_consumption = states('sensor.dongle_device_id_pload') | float %}
      {% set home_buffer_watts = states('input_number.tesla_home_power_buffer') | float %}
      {% set solar_excess = solar_production - home_buffer_watts - home_consumption %}
      {{ solar_excess &gt; 0 }}

action:
  - variables:
      calculated_amps: &gt;
        {# Retrieve thresholds #}
        {% set range_threshold = states('input_number.tesla_range_threshold') | float %}
        {% set priority_range_threshold = states('input_number.tesla_priority_range_threshold') | float %}
        {% set high_battery_threshold = states('input_number.tesla_high_battery_threshold') | float %}
        {% set low_battery_threshold = states('input_number.tesla_low_battery_threshold') | float %}
        {% set home_buffer_watts = states('input_number.tesla_home_power_buffer') | float %}

        {# Calculate available power #}
        {% set solar_production = states('sensor.dongle_device_id_pall') | float %}
        {% set home_consumption = states('sensor.dongle_device_id_pload') | float %}
        {% set solar_excess = solar_production - home_buffer_watts - home_consumption %}
        {% set excess_pv_energy = solar_excess / 240 %} {# watts to amps, assuming a 240 V charging circuit #}
        {% set car_range = states('sensor.tesla_rated_battery_range_mi') | float %}
        {% set house_battery_soc = states('sensor.dongle_device_id_soc') | float %}

        {# Charging logic #}
        {% if car_range &lt; range_threshold %}
          {# Below range threshold #}
          {% if car_range &lt; priority_range_threshold %}
            {% set max_amps = (excess_pv_energy * 0.95) | int %}
          {% elif house_battery_soc &gt; high_battery_threshold %}
            {% set max_amps = (excess_pv_energy * 0.70) | int %}
          {% else %}
            {% set max_amps = (excess_pv_energy * 0.50) | int %}
          {% endif %}
        {% else %}
          {# Above range threshold #}
          {% if house_battery_soc &lt; low_battery_threshold %}
            {% set max_amps = (excess_pv_energy * 0.30) | int %}
          {% elif house_battery_soc &lt; high_battery_threshold %}
            {% set max_amps = (excess_pv_energy * 0.40) | int %}
          {% else %}
            {% set max_amps = (excess_pv_energy * 0.60) | int %}
          {% endif %}
        {% endif %}

        {# Safety limits #}
        {% if max_amps &gt; 32 %}32{% elif max_amps &lt; 0 %}0{% else %}{{ max_amps }}{% endif %}

  - condition: template
    value_template: &gt;
      {% set current_time = now() %}
      {% set last_below_min = states('input_datetime.tesla_last_below_min_amps') %}
      {% set validation_period_minutes = states('input_number.tesla_validation_period') | float %}
      {% set has_sufficient_amps = calculated_amps | int &gt;= min_sustainable_amps %}

      {% if has_sufficient_amps %}
        {% if not last_below_min %}
          true
        {% else %}
          {% set time_diff = (as_timestamp(current_time) - as_timestamp(last_below_min)) / 60 %}
          {{ time_diff &gt;= validation_period_minutes }}
        {% endif %}
      {% else %}
        false
      {% endif %}

  - choose:
      - conditions:
          - condition: template
            value_template: "{{ calculated_amps | int &lt; min_sustainable_amps }}"
        sequence:
          - service: input_datetime.set_datetime
            target:
              entity_id: input_datetime.tesla_last_below_min_amps
            data:
              timestamp: "{{ now().timestamp() }}"

  - choose:
      - conditions:
          - condition: template
            value_template: "{{ calculated_amps | int &gt;= min_sustainable_amps }}"
        sequence:
          - service: switch.turn_on
            target:
              entity_id: switch.tesla_ble_your_device_id_ble_connection

          - delay: "00:00:01"

          - service: button.press
            target:
              entity_id: button.tesla_ble_your_device_id_wake_up

          - delay: "00:00:02"

          - service: number.set_value
            target:
              entity_id: number.tesla_ble_your_device_id_charging_amps
            data:
              value: "{{ calculated_amps }}"

          - service: switch.turn_on
            target:
              entity_id: switch.tesla_ble_your_device_id_charger_switch

      - conditions:
          - condition: template
            value_template: "{{ calculated_amps | int &lt; min_sustainable_amps }}"
          - condition: state
            entity_id: switch.tesla_ble_your_device_id_charger_switch
            state: "on"
        sequence:
          - service: switch.turn_off
            target:
              entity_id: switch.tesla_ble_your_device_id_charger_switch

variables:
  min_sustainable_amps: 1  # lowest amperage worth keeping the charger engaged

mode: single          # don't start a second run while one is in progress
max_exceeded: silent  # skip extra triggers without logging a warning
</code></pre><p><em><strong>Note:</strong> Replace <code>your_device_id</code> with the name of your </em><a href="https://danielraffel.me/til/2024/11/27/how-i-set-up-an-esp32-ble-key-with-my-tesla-and-home-assistant-for-solar-charging-optimization/" rel="noreferrer"><em>ESPHome device</em></a><em> and <code>dongle_device_id</code> with the name of your </em><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/" rel="noreferrer"><em>dongle device macID</em></a><em>.</em></p><p><strong>Key Features:</strong></p><ul><li><strong>Dynamic Amperage Calculation</strong>: Adjusts charging amperage based on solar excess, car range, and battery SOC.</li><li><strong>Validation Period</strong>: Includes a validation period (adjustable via the dashboard) to ensure the charging current remains above a minimum sustainable level before re-engaging the charger.</li><li><strong>Priority Logic</strong>: Prioritizes car charging or home battery charging based on defined thresholds.</li><li><strong>Safety Checks</strong>: Ensures amperage stays within safe limits (1–32 A).</li></ul><p><strong>Key Modifications and Choices:</strong></p><ul><li><strong>Avoiding Frequent BLE Communication</strong>: The validation period attempts to reduce the frequency of Bluetooth communications with the car, which can be resource-intensive.</li><li><strong>Manual Charging Override</strong>: Checks if manual charging is active to prevent automation conflicts.</li><li><strong>Use of Variables</strong>: Improves readability and maintainability by using variables for thresholds and calculations.</li></ul><hr><h2 id="custom-dashboard-setup">Custom Dashboard Setup</h2><p>I created a custom dashboard in Home Assistant to monitor and adjust the system parameters.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/12/image.png" class="kg-image" alt="" loading="lazy" width="784" 
height="698" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image.png 600w, https://danielraffel.me/content/images/2024/12/image.png 784w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">When the "Charge Tesla from PV Excess" automation is running I have some visibility into what's happening and why. And, I can see if the Manual grid charging automation is running.</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/11/image-24.png" class="kg-image" alt="" loading="lazy" width="778" height="878" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-24.png 600w, https://danielraffel.me/content/images/2024/11/image-24.png 778w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">I've exposed most of the parameters in the "Charge Tesla from PV Excess" automation so that I can easily tweak them</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/11/image-26.png" class="kg-image" alt="" loading="lazy" width="766" height="126" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-26.png 600w, https://danielraffel.me/content/images/2024/11/image-26.png 766w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">To make it easy to fast charge from the grid I added a "button" to manually trigger the automation to "Charge Tesla Without Draining Home Battery" and added a shortcut on my phone</span></figcaption></figure><p><strong>Dashboard YAML Configuration:</strong></p><pre><code class="language-yaml">title: Tesla PV Charging
views:
  - title: Main
    path: main
    badges: []
    cards:
      - type: markdown
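        # assumes the card-mod custom component (installable via HACS) is present -
        # the per-card "style" block below is card-mod syntax, not core Lovelace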
        style: |
          ha-card {
            height: auto;
            min-height: 100px;
            padding: 16px;
            margin: 8px;
            background: var(--card-background-color);
            opacity: 0;
            animation: fadeIn 0.5s ease-in forwards;
            animation-delay: 0.5s;
          }
          @keyframes fadeIn {
            to {
              opacity: 1;
            }
          }
          ha-markdown {
            font-size: 14px !important;
            line-height: 1.8 !important;
            min-height: 400px;
          }
        content: |-
          {% if states('sensor.tesla_rated_battery_range_mi') != 'unavailable' 
             and states('sensor.dongle_device_id_soc') != 'unavailable' %}
            {% set range_threshold = states('input_number.tesla_range_threshold') | float(120) %}
            {% set priority_range_threshold = states('input_number.tesla_priority_range_threshold') | float(100) %}
            {% set high_battery_threshold = states('input_number.tesla_high_battery_threshold') | float(80) %}
            {% set low_battery_threshold = states('input_number.tesla_low_battery_threshold') | float(20) %}
            {% set solar_production = states('sensor.dongle_device_id_pall') | float(0) %}
            {% set home_consumption = states('sensor.dongle_device_id_pload') | float(0) %}
            {% set home_buffer_watts = states('input_number.tesla_home_power_buffer') | float(500) %}
            {% set solar_excess = solar_production - home_buffer_watts - home_consumption %}
            {% set excess_pv_energy = solar_excess / 240 %}
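            {# assumes a 240 V charging circuit: e.g. 2400 W of excess solar / 240 V = 10 A available for the car #}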
            {% set car_range = states('sensor.tesla_rated_battery_range_mi') | float(0) %}
            {% set house_battery_soc = states('sensor.dongle_device_id_soc') | float(0) %}
            {% set actual_charging_current = states('sensor.tesla_charger_actual_current') | float(0) %}
            🚙 Range: {{ car_range | round(1) }} mi
            🔋 Added: {{ states('sensor.tesla_charge_energy_added') | float(0) | round(1) }} kWh{% if car_range &lt; priority_range_threshold %}
            ⚠️ Car: Priority Charging needed (below {{ priority_range_threshold }}mi){%- elif car_range &lt; range_threshold -%}
            📊 Car: Normal Charging mode (below {{ range_threshold }}mi){% else %}
            ⛽ Car: Conservative Charging (above {{ range_threshold }}mi){% endif %}
            🚗 Set Amps: {{ states('number.tesla_ble_your_device_id_charging_amps') | float(0) }}A{% set manual_charging = is_state('input_boolean.tesla_manual_charging_active', 'on') %}
            ⚡ Manual Grid Charging: {{ "On" if manual_charging else "Off" }}
            👀 EV Charging: {% if actual_charging_current &gt; 0 %}Yes {{ (actual_charging_current * 240) | round }}W{% else %}No{% endif %}

            🏡 Battery: {{ house_battery_soc | round(1) }}%
            🪫 Discharge Level: {{ states('number.dongle_device_id_eod') | float(10) }}%{% if house_battery_soc &lt; low_battery_threshold %}
            ⚠️ House: Prioritizing Home Battery (below {{ low_battery_threshold }}%){% elif house_battery_soc &lt; high_battery_threshold %}
            📊 House: Moderate Charging mode (below {{ high_battery_threshold }}%){% else %}
            ✅ House: Increased Car Charging enabled (above {{ high_battery_threshold }}%){% endif %}
            👀 Battery Charging: {% if is_state('binary_sensor.home_battery_charging_status', 'on') %}Yes {{ state_attr('binary_sensor.home_battery_charging_status', 'power') | float(0) | round }}W{% else %}No{% endif %}

            🌞 Harvest: {{ solar_production | round }}W
            🏠 Use: {{ home_consumption | round }}W
            🛟 Buffer: {{ home_buffer_watts | round }}W
            ➕ Excess: {{ solar_excess | round }}W
            🔌 Available for car: {{ excess_pv_energy | round(1) }}A
          {% else %}
            Loading data...
          {% endif %}
      - type: entities
        title: System Thresholds
        entities:
          - entity: input_number.tesla_range_threshold
            name: Range Threshold
          - entity: input_number.tesla_priority_range_threshold
            name: Priority Range
          - entity: input_number.tesla_high_battery_threshold
            name: High Battery
          - entity: input_number.tesla_low_battery_threshold
            name: Low Battery
          - entity: input_number.tesla_home_power_buffer
            name: Home Power Buffer
          - entity: input_number.tesla_validation_period
            name: Wait Time &gt;= 1A
            control_mode: box
      - type: entities
        entities:
          - type: button
            name: 🔌 Tesla 🚙 Don't Kill 🏡 🔋
            icon: mdi:car-electric-outline
            tap_action:
              action: call-service
              service: automation.trigger
              target:
                entity_id: automation.charge_tesla_dont_kill_home_battery
      - type: horizontal-stack
        cards:
          - type: gauge
            name: Car Range
            entity: sensor.tesla_rated_battery_range_mi
            min: 0
            max: 300
            severity:
              green: 120
              yellow: 99
              red: 0
          - type: gauge
            name: Home Battery
            entity: sensor.dongle_device_id_soc
            unit: '%'
            min: 0
            max: 100
            severity:
              green: 65
              yellow: 50
              red: 0
      - type: history-graph
        title: Power Distribution (24h)
        hours_to_show: 24
        entities:
          - entity: sensor.dongle_device_id_pall
            name: Solar Production
          - entity: sensor.dongle_device_id_pload
            name: Home Consumption
          - entity: number.tesla_ble_your_device_id_charging_amps
            name: Car Charging Rate</code></pre><p><strong>Key Features:</strong></p><ul><li><strong>Dynamic Data Display</strong>: The markdown card shows real-time data and status messages.</li><li><strong>Adjustable Thresholds</strong>: Entities card allows you to adjust system thresholds on the fly.</li><li><strong>Manual Control</strong>: A button to manually trigger the "Charge Tesla Without Draining Home Battery" automation.</li><li><strong>Visual Gauges</strong>: Quick view of car range and home battery status.</li><li><strong>Historical Graph</strong>: Monitor power distribution over the last 24 hours.</li></ul><p><strong>Note:</strong> Replace <code>your_device_id</code> and <code>dongle_device_id</code> with your actual device IDs in the dashboard configuration.</p><h2 id="conclusion">Conclusion</h2><p>By integrating data into Home Assistant via MQTT from Teslamate, my solar inverter using the Monitor My Solar dongle, and my Tesla using an ESP32 chip running ESPHome Tesla BLE, I created a dynamic system that optimizes Tesla charging based on real-time solar production and home energy demands. </p><p>The customizable dashboard provides monitoring and adjustment of parameters, ensuring efficient management of both the car and home battery. Additionally, when rapid EV charging is needed, the system avoids depleting the home battery. </p><p>Although I learned a lot and already had many prerequisites in place, which made the configuration relatively quick, I find the setup fragile and likely to be annoying to maintain. I plan to keep it running and hope it won’t cause too many headaches. 
However, I intend to be cautious about investing significant additional effort into optimizing it unless the improvements are clearly worthwhile.</p><p><strong>Benefits of this setup:</strong></p><ul><li><strong>Energy Efficiency</strong>: Maximizes the use of excess solar energy.</li><li><strong>Cost Savings</strong>: Reduces reliance on grid electricity, lowering energy bills.</li><li><strong>Environmental Impact</strong>: Enhances the use of renewable energy sources.</li><li><strong>Affordability:</strong>&nbsp;Extremely budget-friendly, with the ESP32 chip costing $6.</li><li><strong>Customization and Control:</strong>&nbsp;Allows you to control the logic and easily tweak the setup to meet specific needs.</li></ul><p>I’d love to hear your thoughts, feedback, or experiences if you actually try using this approach. Hopefully, we’ll see simpler aftermarket solutions from vehicle manufacturers integrating this functionality into their chargers in an open and accessible way. </p><p><strong>Happy charging!</strong></p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Set Up an ESP32 BLE Key with my Tesla and Home Assistant for Solar Charging Optimization ]]></title>
        <description><![CDATA[ I set up an ESP32 chip to connect to my Tesla over BLE. With this integration, I can automate adjusting charging amperage and starting/stopping charging based on surplus solar energy. ]]></description>
        <link>https://danielraffel.me/til/2024/11/27/how-i-set-up-an-esp32-ble-key-with-my-tesla-and-home-assistant-for-solar-charging-optimization/</link>
        <guid isPermaLink="false">67455249e347f9035084cf35</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 26 Nov 2024 21:22:25 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/11/DALL-E-2024-11-26-21.14.29---A-minimalistic-and-modern-illustration-in-the-style-of-Paul-Rand--depicting-an-ESP32-M5NanoC6-development-kit-chip-wirelessly-connecting-to-a-Tesla-ca.png" medium="image"/>
<content:encoded><![CDATA[ <h3 id="tldr">TL;DR</h3><p>I set up an <a href="https://shop.m5stack.com/products/m5stack-nanoc6-dev-kit?srsltid=AfmBOoqG-X4jcqD1in4r5g6P7ZXyeCDMhJb3RZUBIgWi-vN2orgWMkmr&ref=danielraffel.me" rel="noreferrer">ESP32 M5NanoC6</a> development kit chip with the <a href="https://github.com/yoziru/esphome-tesla-ble?ref=danielraffel.me" rel="noreferrer">esphome-tesla-ble</a> project to connect to my Tesla over Bluetooth. This setup allows the chip to communicate with the car when it’s nearby, using the <a href="https://esphome.io/components/api.html?ref=danielraffel.me" rel="noreferrer">ESPHome native API</a> to share data with Home Assistant over Wi-Fi. With this integration, I can automate tasks like adjusting charging amperage and starting/stopping charging based on surplus solar energy from my inverter. For now, this guide covers flashing the chip and integrating it with Home Assistant. I’ll follow up with a post on <a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/" rel="noreferrer">automations designed to optimize charging from excess solar</a>.</p><hr><h3 id="thanks">Thanks</h3><p>This wouldn't be possible without <a href="https://yasirekinci.com/?ref=danielraffel.me" rel="noreferrer">Yasir Ekinci</a>, who created the <a href="https://github.com/yoziru/tesla-ble?ref=danielraffel.me">tesla-ble</a> library for communicating with Tesla vehicles locally via the BLE API. 
And, for integrating that work into <a href="https://github.com/yoziru/esphome-tesla-ble?ref=danielraffel.me">esphome-tesla-ble</a> so that it works with <a href="https://esphome.io/?ref=danielraffel.me" rel="noreferrer">ESPHome</a> and exposes dashboards in <a href="https://www.home-assistant.io/?ref=danielraffel.me" rel="noreferrer">Home Assistant</a>.</p><hr><h3 id="background-and-objective">Background and Objective</h3><p>This project uses the <a href="https://github.com/yoziru/esphome-tesla-ble?ref=danielraffel.me" rel="noreferrer">esphome-tesla-ble</a> library to enable local BLE communication between an ESP32 and a Tesla, providing an offline alternative to the official Tesla API while avoiding rate limits. The goal is to dynamically manage charging by creating <a href="https://www.home-assistant.io/?ref=danielraffel.me" rel="noreferrer">Home Assistant</a> automations, such as adjusting amperage or stopping charging based on solar production. Since the car defaults to the mobile wall charger’s maximum of 32 amps, regulating amperage is essential to prevent exceeding available solar capacity and I needed a way to automate that.</p><hr><h3 id="tested-hardware-and-environment">Tested Hardware and Environment</h3><ol><li><strong>Hardware</strong>: <a href="https://shop.m5stack.com/products/m5stack-nanoc6-dev-kit?srsltid=AfmBOoqG-X4jcqD1in4r5g6P7ZXyeCDMhJb3RZUBIgWi-vN2orgWMkmr&ref=danielraffel.me" rel="noreferrer">ESP32 M5NanoC6</a> development kit (other models work, but this is the one I used and can confirm is compatible), Tesla Vehicle Key Card.</li><li><strong>Environment</strong>: Apple Silicon MacBook Pro running macOS Sequoia.</li><li><strong>Software</strong>: <a href="https://github.com/yoziru/esphome-tesla-ble?ref=danielraffel.me" rel="noreferrer">ESPHome Tesla BLE</a> repo by Yoziru, local <a href="https://www.home-assistant.io/?ref=danielraffel.me" rel="noreferrer">Home Assistant</a> instance, and <a 
href="https://github.com/esphome/home-assistant-addon?ref=danielraffel.me" rel="noreferrer">ESPHome add-on</a> for Home Assistant.</li></ol><hr><h3 id="flashing-the-esp32">Flashing the ESP32</h3><p>The ESP32 chip needs to be flashed twice.</p><p><strong>About the First Flash: Enable the BLE Listener</strong><br>The initial flash enables a BLE (Bluetooth Low Energy) listener on the ESP32. This listener’s purpose is to scan for nearby BLE devices and capture the Tesla’s unique BLE MAC ID. The Tesla emits this MAC ID when you place your key card on the car’s designated holder, signaling the car to start broadcasting BLE data. This step is critical because the Tesla’s BLE MAC ID is required to configure the ESP32 as a BLE key. Without discovering this ID, the ESP32 cannot interact with the Tesla via BLE.</p><p><strong>About the Second Flash: Configure and Finalize the ESP32</strong><br>Once the Tesla’s BLE MAC ID is obtained and added to the configuration (secrets.yaml), the BLE listener is no longer needed. The second flash:</p><p>1. Disables the BLE listener to stop scanning for devices (which would interfere with normal operations).</p><p>2. Configures the ESP32 to use the discovered Tesla BLE MAC ID to authenticate and act as a BLE key.</p><p>3. Finalizes the firmware, allowing the ESP32 to communicate with the Tesla and control features such as charging and amperage settings.</p><hr><p><em>The following steps assume you already have Home Assistant set up and have an ESP32 M5NanoC6 chip.</em></p><h3 id="flashing-the-chip-and-integrating-it-with-home-assistant">Flashing the chip and integrating it with Home Assistant</h3><p><strong>1. Set up your Python environment and clone the ESPHome Tesla BLE repository:</strong></p><pre><code class="language-bash"># Create a new Python 3.11 virtual environment named 'esphome-tesla-ble'
python3.11 -m venv esphome-tesla-ble

# Change into the newly created virtual environment directory
cd esphome-tesla-ble

# Activate the virtual environment
source bin/activate

# Clone the yoziru/esphome-tesla-ble repository from GitHub
git clone https://github.com/yoziru/esphome-tesla-ble

# Change into the cloned repository directory
cd esphome-tesla-ble

# Install the required Python dependencies from the requirements file
pip install -r requirements.txt</code></pre><p><strong>2. Configure secrets.yaml</strong></p><p>Before compiling, create and populate a&nbsp;<code>secrets.yaml</code> file at <code>esphome-tesla-ble/esphome-tesla-ble</code>&nbsp;with your Wi-Fi credentials, an API encryption key, a placeholder BLE MAC address, and your Tesla VIN:</p><pre><code class="language-yaml">wifi_ssid: "my_wifi"
wifi_password: "my_password"
wifi_hotspot_password: "my_hotspot_password"
ota_password: ""
api_encryption_key: "my_encryption_key" # random key from https://esphome.io/components/api.html#configuration-variables
ble_mac_address: "A0:B1:C2:D3:E4:F5"
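# placeholder value - replaced with the car's real BLE MAC address discovered in the listener step below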
tesla_vin: "my_tesla_vin"</code></pre><p><strong>3. Enable the BLE Listener</strong></p><p>In the&nbsp;<code>tesla-ble-m5stack-nano6.yml</code>&nbsp;configuration file, enable the BLE listener by uncommenting the line so it looks like this:</p><pre><code class="language-bash">listener: !include packages/listener.yml</code></pre><p>This instructs the ESP32 to run in “listening mode” after the first flash.</p><p><strong>4. Compile the Firmware</strong></p><p>Run the following command inside <code>esphome-tesla-ble/esphome-tesla-ble</code> to compile the firmware for the M5NanoC6 board:</p><pre><code class="language-bash">make compile BOARD=m5stack-nanoc6</code></pre><p>This will generate the firmware binary tailored to your ESP32 board with the BLE listener enabled. </p><p><em>Note: If you're using a different chip than mine, you can try pointing at the YAML file you're using and compiling like this: <code>esphome compile tesla-ble-esp32-generic.yml</code></em></p><p><strong>5. Flash the ESP32 the first time</strong></p><p>Now upload the compiled firmware to the ESP32 from <code>esphome-tesla-ble/esphome-tesla-ble</code>:</p><ul><li>Connect the ESP32 to your computer via USB.</li><li>Run the upload command</li></ul><pre><code class="language-bash">make upload</code></pre><ul><li>When prompted, select the USB port for your ESP32 (e.g.,&nbsp;/dev/cu.usbmodem1101).</li></ul><p><strong>6. In your terminal note down the ESP32 Device Name Suffix</strong></p><p>After flashing, the ESP32 will reboot and appear with a name like&nbsp;tesla-ble-&lt;suffix&gt;. The&nbsp;&lt;suffix&gt;&nbsp;is a unique identifier, such as&nbsp;5ae9cc. Note this suffix, as you’ll use it to monitor logs in a future step.</p><p><strong>7. 
Locate the ESP32’s IP Address on Your Network</strong></p><p>Using a network scanner, or by logging in to your router’s admin interface, check for the ESP32 device on your Wi-Fi network and note its IP address, as you may need it to add the device manually to Home Assistant if it’s not auto-discovered:</p><ul><li>Manufacturer:&nbsp;<strong>Espressif Inc.</strong></li><li>MAC Address: May start with&nbsp;<code>40:4C:CA</code></li><li>Name: Might appear as&nbsp;tesla-ble-&lt;suffix&gt;.lan&nbsp;(though this may not resolve depending on your network config, e.g., if you use Tailscale).</li></ul><p><em>Note: If you end up adding the ESP32 manually by IP address, consider giving it a reserved IP address on your router.</em></p><p><strong>8. Install the ESPHome add-on to Home Assistant</strong></p><p>In Home Assistant, I navigated to&nbsp;<code>Settings &gt; Devices &amp; Services</code>, tapped <code>Add Integration</code>, and searched for and installed the ESPHome add-on.</p><p><strong>9. Add the ESPHome device to Home Assistant and Configure</strong></p><p>My ESP32 immediately appeared as a device ready to be added. If you don't see yours, you can add it by its IP address.</p><p>After adding the device, I entered my encryption key when prompted.</p><p>Then, I scrolled to the configured devices, selected ESPHome, and enabled <code>debug logging</code>. This lets me debug from the logs in Home Assistant, though I stuck with my terminal for debugging. 
</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/image-11.png" class="kg-image" alt="" loading="lazy" width="1576" height="678" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-11.png 600w, https://danielraffel.me/content/images/size/w1000/2024/11/image-11.png 1000w, https://danielraffel.me/content/images/2024/11/image-11.png 1576w" sizes="(min-width: 720px) 720px"></figure><p>Under the&nbsp;<code>1 device</code>&nbsp;section, I accessed the diagnostics menu, enabled the <code>BLE Connection</code>, and confirmed that pressing the physical button on the ESP32 temporarily toggled the diagnostics Button status to “On.”</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/image-17.png" class="kg-image" alt="" loading="lazy" width="530" height="1468"></figure><p><strong>10. Monitor Logs to Discover Tesla BLE MAC ID</strong></p><p>Open a new terminal tab. With the ESP32 in listening mode, you can now monitor its logs to find your Tesla’s BLE MAC ID:</p><ul><li>Run the following command in the terminal:</li></ul><pre><code class="language-bash">make logs HOST_SUFFIX=-&lt;suffix&gt;</code></pre><p><em>Note: Replace&nbsp;&lt;suffix&gt;&nbsp;with the unique identifier from Step 6 (e.g., mine would be&nbsp;<code>make logs HOST_SUFFIX=-5ae9cc</code>).</em></p><p><strong>11. Take your computer, ESP32, and Tesla key card to your vehicle</strong></p><p>Sitting inside the car (making sure your chip can still reach Wi-Fi), place your Tesla key card on the armrest holder, and monitor the logs. The Tesla should begin to broadcast its BLE MAC ID, which will be displayed in the terminal output.</p><pre><code class="language-bash">[20:04:45][I][tesla_ble_listener:050]: Found Tesla vehicle | Name: [VIN_REDACTED]
MAC: BO:F8:XX:XX:XX</code></pre><p><strong>12. Update Secrets.yaml with BLE macID</strong></p><p>Update the <code>secrets.yaml</code> file at <code>esphome-tesla-ble/esphome-tesla-ble</code> and replace the ble_mac_address field with your macID (e.g., <code>BO:F8:XX:XX:XX</code>):</p><pre><code class="language-yaml">ble_mac_address: "BO:F8:XX:XX:XX"</code></pre><p><strong>13. Disable the BLE Listener</strong></p><p>In the&nbsp;<code>tesla-ble-m5stack-nano6.yml</code>&nbsp;configuration file, disable the BLE listener by commenting the line so it looks like this:</p><pre><code class="language-bash"># listener: !include packages/listener.yml</code></pre><p>This instructs the ESP32 to stop running in “listening mode” since it's no longer necessary.</p><p><strong>14. Flash the ESP32 the second time</strong></p><pre><code class="language-bash">make clean
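# clean first so the now-disabled listener isn't carried over from cached build artifacts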
make compile BOARD=m5stack-nanoc6
make upload
</code></pre><p>When prompted, select the USB port for the ESP32 (e.g., <code>/dev/cu.usbmodem1101</code>).</p><p><strong>15. Pair the ESP32 as a Tesla Key</strong></p><ul><li>Return to Home Assistant and navigate to the ESPHome diagnostics screen.<ul><li><code>Settings &gt; Devices &amp; Services</code>, scroll to the configured section and find ESPHome and tap on <code>1 device</code> </li></ul></li></ul><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/image-14.png" class="kg-image" alt="" loading="lazy" width="786" height="396" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-14.png 600w, https://danielraffel.me/content/images/2024/11/image-14.png 786w" sizes="(min-width: 720px) 720px"></figure><ul><li>On the ESPHome diagnostics card tap the text that says Press next to "Pair BLE Key".</li></ul><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/image-13.png" class="kg-image" alt="" loading="lazy" width="474" height="92"></figure><ul><li>Place your Tesla key card on the armrest again and accept the pairing request from the car. After accepting, rename the key to something memorable like <code>ESPHome Key</code> since it will otherwise be called <code>Unknown device</code>.</li></ul><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/11/image-22.png" class="kg-image" alt="" loading="lazy" width="1490" height="1217" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-22.png 600w, https://danielraffel.me/content/images/size/w1000/2024/11/image-22.png 1000w, https://danielraffel.me/content/images/2024/11/image-22.png 1490w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">M5nanoC6 ESP chip paired as a keycard with Tesla</span></figcaption></figure><p><strong>16. 
Confirm integration works in Home Assistant</strong></p><ul><li>On the ESPHome diagnostics screen attempt to adjust the controls such as charging amps and confirm it updates in realtime in the car charging UX.</li></ul><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/image-16.png" class="kg-image" alt="" loading="lazy" width="674" height="632" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-16.png 600w, https://danielraffel.me/content/images/2024/11/image-16.png 674w"></figure><hr><h3 id="results">Results</h3><p>After completing these steps, the ESP32 successfully connected to Home Assistant and began sending and receiving data from the Tesla. It's now possible to view Tesla sensors and controls in Home Assistant:</p><ul><li>Wake Vehicle</li><li>Set charging amps</li><li>Set charging limit (percent)</li><li>Turn on/off charging</li><li>BLE information sensors<ul><li>Asleep / awake</li><li>Doors locked / unlocked</li><li>User present / not present</li><li>Charging flap open / closed (only when vehicle is awake)</li><li>BLE signal strength</li></ul></li></ul><hr><h3 id="next-steps">Next Steps</h3><ul><li><a href="https://danielraffel.me/2024/11/30/how-i-optimized-charging-my-ev-using-excess-solar-power-withhome-assistant/" rel="noreferrer"><strong>Create Automations</strong></a>: Set up Home Assistant automations to dynamically manage Tesla charging based on solar production.<br> </li></ul> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Hummingbird Fuels ]]></title>
        <description><![CDATA[ Lisa and Richard are familiar figures in the SF cycling community. Today, I learned they have started a new company called Hummingbird Fuels, set to launch in January 2025. I couldn&#39;t be more thrilled for them. ]]></description>
        <link>https://danielraffel.me/2024/11/18/hummingbird-fuels/</link>
        <guid isPermaLink="false">673b7413295cb5034a676c36</guid>
        <category><![CDATA[ 🛒 Buying ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 18 Nov 2024 09:27:44 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/11/DALL-E-2024-11-18-09.27.02---A-dynamic-and-vibrant-illustration-of-a-hummingbird-in-the-style-of-Paul-Rand--utilizing-a-gradient-color-palette-inspired-by-blue--green--yellow--ora.png" medium="image"/>
        <content:encoded><![CDATA[ <p><a href="https://www.hummingbirdfuels.com/lisa-richard-about?ref=danielraffel.me" rel="noreferrer">Lisa and Richard</a> are familiar figures in the SF cycling community. I've joined a few group rides where I had the pleasure of meeting them briefly and witnessing their incredible passion, generosity, and unwavering support for fellow athletes.</p><p>Today, my wife returned from a <a href="https://www.fatcake.cc/rides?ref=danielraffel.me" rel="noreferrer">Fatcake ride</a> with something exciting: their new 60 for 60 formula carb sports drink! It turns out Lisa and Richard have started a new company called <a href="https://www.hummingbirdfuels.com/?ref=danielraffel.me" rel="noreferrer">Hummingbird Fuels</a>, set to launch in January 2025. I couldn't be more thrilled for them.</p><p>I absolutely ❤️ this from their site:</p><blockquote>We created this product because we saw a gap in the market. Too many brands creating complicated formulas, celebrating “<em>bro science”</em> and promoting the lie that you need to suffer, to be considered an athlete.<br><br><strong>THIS IS NOT OUR VIBE.</strong></blockquote><p>As my wife rode home with the package sticking out of her jersey, a fellow cyclist noticed it and exclaimed, "You’ve got Hummingbird Fuels?! I love Lisa and Richard!" Clearly, we’re not the only ones excited about their launch. 
Keep an eye out.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/IMG_4320.png" class="kg-image" alt="" loading="lazy" width="856" height="1142" srcset="https://danielraffel.me/content/images/size/w600/2024/11/IMG_4320.png 600w, https://danielraffel.me/content/images/2024/11/IMG_4320.png 856w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://www.hummingbirdfuels.com/?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Hummingbird Fuels</div><div class="kg-bookmark-description">Carbohydrate sports drink delivering 60 grams of carbs for 60 minutes of activity. Our 60 for 60 formula makes fueling your activities easy. Swimming, running, cycling, whatever. Drink your fuel. Fuel your fun.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/favicon-1.ico" alt=""><span class="kg-bookmark-author">Hummingbird Fuels</span><span class="kg-bookmark-publisher">Skip to Content</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/HBFbirdicon.png" alt="" onerror="this.style.display = 'none'"></div></a></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Nachobursts: A New Chapter for Les Pauls ]]></title>
        <description><![CDATA[ Nacho Guitars, renowned for their meticulous recreations of early 1950s Telecasters—dubbed “Nachocasters”— have now expanded to include Gibson Les Paul replicas, known as “Nachobursts.” ]]></description>
        <link>https://danielraffel.me/2024/11/15/nachobursts-a-new-chapter-for-les-pauls/</link>
        <guid isPermaLink="false">6737cd5c5faa6305d6acb02e</guid>
        <category><![CDATA[ 🛒 Buying ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 15 Nov 2024 15:16:57 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/11/DALL-E-2024-11-15-15.14.46---An-artistic-illustration-in-the-style-of-Paul-Rand-featuring-Nacho-Ba-os-and-Billy-Gibbons-of-ZZ-Top.-The-scene-highlights-their-connection-through-ic.png" medium="image"/>
        <content:encoded><![CDATA[ <p><a href="https://www.nachoguitars.com/?ref=danielraffel.me" rel="noreferrer">Nacho Guitars</a>, renowned for their meticulous recreations of early 1950s Telecasters—dubbed “Nachocasters”—have now expanded to include Gibson Les Pauls, known as “<a href="https://www.nachoguitars.com/portfolio-item/galery02/?ref=danielraffel.me" rel="noreferrer">Nachobursts</a>.” First hinted at in a 2023 <a href="https://www.instagram.com/nachoguitars/p/C0mh9rrM3Ya/?img_index=1&ref=danielraffel.me" rel="noreferrer">Instagram post</a>, this evolution reflects founder Nacho Baños’s dedication to authentically reproducing iconic vintage guitars.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/11/image-10.png" class="kg-image" alt="" loading="lazy" width="1125" height="1280" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-10.png 600w, https://danielraffel.me/content/images/size/w1000/2024/11/image-10.png 1000w, https://danielraffel.me/content/images/2024/11/image-10.png 1125w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Billy Gibbons and Nacho Baños</span></figcaption></figure><p>Baños’s passion and expertise are well-documented through his publications, <a href="https://reverb.com/item/16339043-fender-telecaster-book-the-blackguard-by-nacho-banos?ref=danielraffel.me" rel="noreferrer">The Blackguard Book</a> and <a href="https://pinecasterbook.com/?ref=danielraffel.me" rel="noreferrer">The Pinecaster Book</a>, which explore the history and construction of early solid-body electric guitars. 
His commitment to detail ensures that each guitar he recreates captures the essence of the original instruments, appealing to enthusiasts seeking vintage aesthetics and sound.</p><p>To achieve this level of authenticity for the Burst series, Nacho Guitars has developed custom pickguards, pickup rings, nickel-plated brass hardware parts (saddles, thumbwheels, posts), NAF pickups (Nacho Applied For), vintage-spec 500k Nacho-pots, lacquer formulas, and more. Every detail is crafted using their original Gibson collection as a reference, with guidance and support from a board of Gibson Les Paul experts.</p><p>Nacho Guitars has attracted fans ranging from <a href="http://www.julianlage.com/?ref=danielraffel.me" rel="noreferrer">Julian Lage</a> to Billy Gibbons of ZZ Top, both of whom have publicly praised their craftsmanship. Gibbons’s endorsement of the Burst series underscores the quality and authenticity of these instruments. In <a href="https://billygibbons.com/2022/01/interview-by-nacho-banos-author-of-the-pinecaster/?ref=danielraffel.me" rel="noreferrer">interviews</a>, Gibbons has shared his appreciation for Nacho Guitars and their faithful reproductions of vintage classics. </p><p>I can only hope to experience one of these masterpieces in person someday.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/image-9.png" class="kg-image" alt="" loading="lazy" width="1334" height="982" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-9.png 600w, https://danielraffel.me/content/images/size/w1000/2024/11/image-9.png 1000w, https://danielraffel.me/content/images/2024/11/image-9.png 1334w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Install Weather Underground on Home Assistant and Configure a Simple Dashboard ]]></title>
        <description><![CDATA[ Home Assistant offers a custom integration specifically for Weather Underground personal weather station users. It includes a native Home Assistant weather entity along with various weather sensors. To set it up, follow these steps, ensuring you have your Weather Underground API key and Station ID.

 1. On your Home Assistant ]]></description>
        <link>https://danielraffel.me/til/2024/11/14/how-to-install-weather-underground-on-home-assistant-and-configure-a-simple-dashboard/</link>
        <guid isPermaLink="false">67357c55ee9b01034ba86382</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 13 Nov 2024 21:12:32 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/11/DALL-E-2024-11-13-21.08.01---A-minimalistic-and-abstract-Paul-Rand-inspired-design-for-a-weather-dashboard--focusing-on-Home-Assistant-s-features-for-Weather-Underground-personal-.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Home Assistant offers a custom integration specifically for <a href="https://www.wunderground.com/pws/overview?ref=danielraffel.me" rel="noreferrer">Weather Underground personal weather station</a> users. It includes a native Home Assistant weather entity along with various weather sensors. To set it up, follow these steps, ensuring you have your <a href="https://www.wunderground.com/member/api-keys?ref=danielraffel.me" rel="noreferrer">Weather Underground API key and Station ID</a>.</p><ol><li>On your Home Assistant VM, create a script to install the Weather Underground integration:</li></ol><pre><code class="language-bash">vi install_wundergroundpws.sh</code></pre><pre><code class="language-bash">#!/bin/bash
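
# Hardening not in the original script: abort immediately if any step
# below fails (e.g. a bad download URL), rather than continuing
set -e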

# Step 1: Create a temporary directory
mkdir -p /tmp/wundergroundpws_install
cd /tmp/wundergroundpws_install

# Step 2: Download the file
wget https://github.com/cytech/Home-Assistant-wundergroundpws/archive/refs/tags/v2.0.9.tar.gz
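
# Optional check (not in the original script): confirm the archive
# actually downloaded before trying to extract it
[ -f v2.0.9.tar.gz ] || { echo "Download failed" >&2; exit 1; }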

# Step 3: Extract the tar.gz file
tar -xzf v2.0.9.tar.gz

# Step 4: Move the `wundergroundpws` folder to the target directory
sudo mv Home-Assistant-wundergroundpws-2.0.9/custom_components/wundergroundpws /opt/homeassistant_config/custom_components/

# Step 5: Clean up the temporary directory
cd ~
rm -rf /tmp/wundergroundpws_install

echo "wundergroundpws custom component has been moved to /opt/homeassistant_config/custom_components/"</code></pre><p>Note: you will likely need to modify the script to point at the <a href="https://github.com/cytech/Home-Assistant-wundergroundpws/releases/latest?ref=danielraffel.me" rel="noreferrer">latest release</a> and at the path to your <code>.homeassistant</code> directory.</p><ol start="2"><li>Make the script executable:</li></ol><pre><code class="language-bash">chmod +x install_wundergroundpws.sh</code></pre><ol start="3"><li>Run the script, which will download the integration and move it to the correct directory:</li></ol><pre><code class="language-bash">./install_wundergroundpws.sh</code></pre><ol start="4"><li>Restart Home Assistant so that it can recognize the new integration.</li><li>Once rebooted, navigate in Home Assistant to Settings &gt; Devices &amp; Services. Then, tap on <code>Add Integration</code> and search for the <code>wundergroundpws</code> integration.</li></ol><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/image-3.png" class="kg-image" alt="" loading="lazy" width="978" height="462" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-3.png 600w, https://danielraffel.me/content/images/2024/11/image-3.png 978w" sizes="(min-width: 720px) 720px"></figure><ol start="6"><li>Install the integration and, when prompted, enter your Weather Underground <code>API key</code> and your <code>Station ID</code>.</li><li>Once configured, it's time to create a dashboard to expose the data. Here's a simple stab at creating a view of Today's Weather. Replace <code>YOUR_STATION_ID</code> with yours.</li></ol><pre><code class="language-yaml">views:
  - title: Weather Station YOUR_STATION_ID
    path: weather
    cards:
      - type: entities
        title: Current Conditions
        entities:
          - entity: sensor.YOUR_STATION_ID_temperature
            name: Temperature
            icon: mdi:thermometer
          - entity: sensor.YOUR_STATION_ID_heat_index
            name: Heat Index
            icon: mdi:thermometer-high
          - entity: sensor.YOUR_STATION_ID_wind_chill
            name: Wind Chill
            icon: mdi:thermometer-low
          - entity: sensor.YOUR_STATION_ID_dewpoint
            name: Dew Point
            icon: mdi:water-percent
          - entity: sensor.YOUR_STATION_ID_relative_humidity
            name: Humidity
            icon: mdi:water-percent
      - type: weather-forecast
        entity: weather.YOUR_STATION_ID
        show_forecast: true
      - type: entities
        title: Precipitation
        entities:
          - entity: sensor.YOUR_STATION_ID_precipitation_rate
            name: Current Rate
            icon: mdi:weather-pouring
          - entity: sensor.YOUR_STATION_ID_precipitation_today
            name: Today's Total
            icon: mdi:weather-rainy
      - type: glance
        title: Wind Conditions
        entities:
          - entity: sensor.YOUR_STATION_ID_wind_speed
            name: Wind Speed
            icon: mdi:weather-windy
          - entity: sensor.YOUR_STATION_ID_wind_gust
            name: Wind Gust
            icon: mdi:weather-windy-variant
          - entity: sensor.YOUR_STATION_ID_wind_direction_cardinal
            name: Direction
            icon: mdi:compass
          - entity: sensor.YOUR_STATION_ID_wind_direction_degrees
            name: Degrees
            icon: mdi:compass-outline
      - type: history-graph
        title: Temperature History
        hours_to_show: 24
        entities:
          - entity: sensor.YOUR_STATION_ID_temperature
            name: Temperature
          - entity: sensor.YOUR_STATION_ID_heat_index
            name: Heat Index
          - entity: sensor.YOUR_STATION_ID_wind_chill
            name: Wind Chill
      - type: history-graph
        title: Wind History
        hours_to_show: 24
        entities:
          - entity: sensor.YOUR_STATION_ID_wind_speed
            name: Wind Speed
          - entity: sensor.YOUR_STATION_ID_wind_gust
            name: Wind Gust
      - type: entities
        title: Station Information
        entities:
          - entity: sensor.YOUR_STATION_ID_pressure
            name: Barometric Pressure
            icon: mdi:gauge
          - entity: sensor.YOUR_STATION_ID_elevation
            name: Elevation
            icon: mdi:elevation-rise
          - entity: sensor.YOUR_STATION_ID_station_id
            name: Station ID
            icon: mdi:weather-cloudy
          - entity: sensor.YOUR_STATION_ID_neighborhood
            name: Location
            icon: mdi:map-marker
          - entity: sensor.YOUR_STATION_ID_local_observation_time
            name: Last Updated
            icon: mdi:clock-outline
      - type: glance
        title: Solar Conditions
        entities:
          - entity: sensor.YOUR_STATION_ID_solar_radiation
            name: Solar Radiation
            icon: mdi:sun-wireless
          - entity: sensor.YOUR_STATION_ID_uv_index
            name: UV Index
            icon: mdi:sun-clock
</code></pre><p>The above dashboard will look something like this:</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/image-4.png" class="kg-image" alt="" loading="lazy" width="2000" height="1397" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-4.png 600w, https://danielraffel.me/content/images/size/w1000/2024/11/image-4.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/11/image-4.png 1600w, https://danielraffel.me/content/images/size/w2400/2024/11/image-4.png 2400w" sizes="(min-width: 720px) 720px"></figure><ol start="8"><li>I decided to enable weather forecasts, which is an option in the integration. Here's a simple dashboard focused on exposing those attributes. Replace <code>YOUR_STATION_ID</code> with yours.</li></ol><pre><code class="language-yaml">views:
  - title: Weather Forecast
    path: weather
    cards:
      - type: entities
        title: Tonight's Forecast
        entities:
          - entity: sensor.YOUR_STATION_ID_weather_summary_0
            name: Forecast
            icon: mdi:weather-night
          - entity: sensor.YOUR_STATION_ID_precipitation_probability_1n
            name: Precipitation Chance
            icon: mdi:weather-rainy
          - entity: sensor.YOUR_STATION_ID_precipitation_amount_1n
            name: Precipitation Amount
            icon: mdi:water
          - entity: sensor.YOUR_STATION_ID_snow_amount_0
            name: Snow Amount
            icon: mdi:snowflake
      - type: entities
        title: Tomorrow's Forecast
        entities:
          - entity: sensor.YOUR_STATION_ID_weather_summary_1
            name: Forecast
            icon: mdi:weather-sunny
          - entity: sensor.YOUR_STATION_ID_precipitation_probability_2d
            name: Precipitation Chance
            icon: mdi:weather-rainy
          - entity: sensor.YOUR_STATION_ID_precipitation_amount_2d
            name: Precipitation Amount
            icon: mdi:water
          - entity: sensor.YOUR_STATION_ID_snow_amount_1
            name: Snow Amount
            icon: mdi:snowflake
      - type: entities
        title: Tomorrow Night's Forecast
        entities:
          - entity: sensor.YOUR_STATION_ID_precipitation_probability_3n
            name: Precipitation Chance
            icon: mdi:weather-rainy
          - entity: sensor.YOUR_STATION_ID_precipitation_amount_3n
            name: Precipitation Amount
            icon: mdi:water
      - type: entities
        title: Extended Forecast - Nights
        entities:
          - type: section
            label: Friday Night
          - entity: sensor.YOUR_STATION_ID_precipitation_probability_5n
            name: Precipitation Chance
          - entity: sensor.YOUR_STATION_ID_precipitation_amount_5n
            name: Precipitation Amount
          - type: section
            label: Saturday Night
          - entity: sensor.YOUR_STATION_ID_precipitation_probability_7n
            name: Precipitation Chance
          - entity: sensor.YOUR_STATION_ID_precipitation_amount_7n
            name: Precipitation Amount
          - type: section
            label: Sunday Night
          - entity: sensor.YOUR_STATION_ID_precipitation_probability_9n
            name: Precipitation Chance
          - entity: sensor.YOUR_STATION_ID_precipitation_amount_9n
            name: Precipitation Amount
      - type: entities
        title: Extended Forecast - Days
        entities:
          - type: section
            label: Friday
          - entity: sensor.YOUR_STATION_ID_weather_summary_2
            name: Forecast
          - entity: sensor.YOUR_STATION_ID_precipitation_probability_4d
            name: Precipitation Chance
          - entity: sensor.YOUR_STATION_ID_precipitation_amount_4d
            name: Precipitation Amount
          - type: section
            label: Saturday
          - entity: sensor.YOUR_STATION_ID_weather_summary_3
            name: Forecast
          - entity: sensor.YOUR_STATION_ID_precipitation_probability_6d
            name: Precipitation Chance
          - entity: sensor.YOUR_STATION_ID_precipitation_amount_6d
            name: Precipitation Amount
          - type: section
            label: Sunday
          - entity: sensor.YOUR_STATION_ID_weather_summary_4
            name: Forecast
          - entity: sensor.YOUR_STATION_ID_precipitation_probability_8d
            name: Precipitation Chance
          - entity: sensor.YOUR_STATION_ID_precipitation_amount_8d
            name: Precipitation Amount
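          # Note on naming (my reading of the integration's forecast sensors,
          # not spelled out above): the suffixes encode the forecast period,
          # with _<N>d = daytime period N and _<N>n = nighttime period N,
          # counting forward from today, while weather_summary_<N> is indexed
          # by day. Confirm the exact entity IDs in your own entity list.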
    type: masonry
</code></pre><p>The above dashboard will look something like this:</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/11/image-5.png" class="kg-image" alt="" loading="lazy" width="2000" height="1283" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-5.png 600w, https://danielraffel.me/content/images/size/w1000/2024/11/image-5.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/11/image-5.png 1600w, https://danielraffel.me/content/images/size/w2400/2024/11/image-5.png 2400w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Set Up a Local AI Model with Xcode, Ollama, Qwen2.5-Coder &amp; Alex Sidebar ]]></title>
        <description><![CDATA[ I wanted to test out Alex Sidebar (i.e., a desktop client that aims to enable Cursor-like features in Xcode) to explore how well a local custom model could perform during development. ]]></description>
        <link>https://danielraffel.me/til/2024/11/13/how-to-set-up-a-local-ai-model-with-xcode-ollama-qwen2-5-coder-alex-sidebar/</link>
        <guid isPermaLink="false">6735165eee9b01034ba862c1</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 13 Nov 2024 15:39:15 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/11/DALL-E-2024-11-13-15.36.14---Create-an-image-in-a-minimalist--modernist-style-inspired-by-Paul-Rand--depicting-a-digital-setup-without-any-text.-Show-symbols-representing-a-local-.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I wanted to test out <a href="https://alexcodes.app/?ref=danielraffel.me" rel="noreferrer">Alex Sidebar</a> (i.e., a desktop client that aims to enable <a href="https://cursor.sh/?ref=danielraffel.me" rel="noreferrer">Cursor</a>-like features in <a href="https://developer.apple.com/xcode/?ref=danielraffel.me" rel="noreferrer">Xcode</a>) to explore how well a local custom model could perform during development.</p><p>Alex Sidebar lets you set up a custom model that the editor can use for generating code. To try this out, I decided to run <a href="https://qwenlm.github.io/blog/qwen2.5-coder-family/?ref=danielraffel.me" rel="noreferrer">Alibaba’s Qwen2.5-Coder model</a> locally. This new model, developed by Alibaba’s Qwen research team, is open-source, Apache 2.0-licensed, and optimized for coding tasks.</p><p>Here’s how I went about setting it up with <strong>Ollama</strong> as the local server and <strong>ngrok</strong> to make it accessible to Alex Sidebar.</p><hr><h3 id="why-ngrok">Why ngrok?</h3><p><a href="https://ngrok.com/?ref=danielraffel.me" rel="noreferrer">ngrok</a> is essential here because Alex Sidebar, the client application running alongside Xcode, needs a public URL to interact with a local model server. By default, your local server (<a href="https://ollama.com/?ref=danielraffel.me" rel="noreferrer">Ollama</a> in this case) runs on a private <code>localhost</code> address that’s inaccessible from outside your machine. <strong>ngrok solves this by creating a secure tunnel</strong> that links your local server to a public-facing URL, making it easy for external applications, like Alex Sidebar, to connect. 
In the coming weeks, Alex Sidebar will support localhost addresses directly, but for now, all traffic is still routed through their server making this a necessity.</p><hr><h3 id="steps-to-set-up-ollama-qwen25-coder-and-ngrok">Steps to Set Up Ollama, Qwen2.5-Coder, and ngrok</h3><h4 id="1-install-ollama-and-pull-the-model">1. <strong>Install Ollama and Pull the Model</strong></h4><p>To begin, make sure you have <a href="https://ollama.com/?ref=danielraffel.me" rel="noreferrer">Ollama</a> installed, as it will manage and serve the Qwen2.5-Coder model locally.</p><p><strong>Download Qwen2.5-Coder</strong>:<br>Once Ollama is installed, pull the model in your terminal using:</p><pre><code class="language-bash">ollama pull qwen2.5-coder:32b
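
# Optional (my addition): confirm the large download completed;
# the model should appear in the list of local models
ollama list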
</code></pre><p>This will download the Qwen2.5-Coder model to your machine.</p><h4 id="2-run-ollama-server-locally">2. <strong>Run Ollama Server Locally</strong></h4><p>With the model downloaded, start up the Ollama server to make it accessible on <code>localhost</code>:</p><pre><code class="language-bash">ollama serve
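
# In a second terminal (my addition): verify the server is responding;
# /api/tags lists the models Ollama has available locally
curl http://localhost:11434/api/tags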
</code></pre><p>Ollama will begin serving the model at <code>http://localhost:11434</code>, the default port.</p><h4 id="3-install-and-configure-ngrok">3. <strong>Install and Configure ngrok</strong></h4><p>Next, set up <strong>ngrok</strong> to create a public URL for the Ollama server. This public URL will allow Alex Sidebar to connect to the local model server running on your Mac.</p><p><strong>Sign Up and Authenticate ngrok</strong>:<br>Go to <a href="https://ngrok.com/?ref=danielraffel.me">ngrok.com</a> to create a (free) account and get an authentication token from the <a href="https://dashboard.ngrok.com/get-started/your-authtoken?ref=danielraffel.me">dashboard</a>. </p><p><strong>Install ngrok on your machine</strong>:</p><pre><code class="language-bash">brew install ngrok
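
# My addition: confirm the ngrok CLI installed correctly
ngrok version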
</code></pre><p><strong>Then, authenticate ngrok with </strong><a href="https://dashboard.ngrok.com/get-started/your-authtoken?ref=danielraffel.me" rel="noreferrer"><strong>your auth token</strong></a><strong>:</strong></p><pre><code class="language-bash">ngrok config add-authtoken YOUR_AUTH_TOKEN
</code></pre><p>Replace <code>YOUR_AUTH_TOKEN</code> with your unique token. This step ensures your ngrok tunnels are secure and linked to your account.</p><p><em>Optional: At this stage if you want to test the Qwen model in interactive mode in your terminal, run:</em></p><pre><code class="language-bash">ollama run qwen2.5-coder:32b
</code></pre><h4 id="4-expose-the-ollama-server-using-ngrok"><strong>4. Expose the Ollama Server Using ngrok</strong></h4><p>Now that ngrok is authenticated, expose the Ollama server. Since the server is running on port <code>11434</code>, run:</p><pre><code class="language-bash">ngrok http 11434 --host-header="localhost:11434"</code></pre><p>ngrok will generate a public URL, like <code>https://randomsubdomain.ngrok-free.app</code>, which securely tunnels traffic to <code>http://localhost:11434</code>. This URL is what you’ll use in the next step for configuring Alex Sidebar.</p><h4 id="5-configure-alex-sidebar-to-use-the-public-url">5. <strong>Configure Alex Sidebar to Use the Public URL</strong></h4><p>With ngrok up and running, go to Alex Sidebar settings and set up a <strong>custom model</strong>:</p><ul><li><strong>Model ID</strong>: <code>qwen2.5-coder:32b</code> (the name of the model)</li><li><strong>Base URL</strong>: Paste the public URL <a href="https://dashboard.ngrok.com/get-started/setup/macos?ref=danielraffel.me" rel="noreferrer">provided by ngrok</a> and append <code>/v1</code> (e.g., <code>https://randomsubdomain.ngrok-free.app/v1</code>).</li><li><strong>API Key</strong>: Paste the authtoken <a href="https://dashboard.ngrok.com/get-started/your-authtoken?ref=danielraffel.me" rel="noreferrer">provided by ngrok</a></li></ul><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/11/image-1.png" class="kg-image" alt="" loading="lazy" width="1034" height="564" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2024/11/image-1.png 1000w, https://danielraffel.me/content/images/2024/11/image-1.png 1034w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Custom Model in Alex Sidebar Configured with Qwen2.5-coder </span></figcaption></figure><p>This configuration tells Alex Sidebar to route 
its requests to the ngrok public URL, which then tunnels back to your local Ollama server.</p><hr><h3 id="using-alex-sidebar-with-qwen-an-an-m1-mac">Using Alex Sidebar with Qwen on an M1 Mac</h3><p>After setting up Alex Sidebar to use the Qwen model, I closed the application. Next, I launched Xcode and created a new Swift iOS app. Then, I reopened Alex Sidebar and asked it to ‘generate a simple login flow in Swift.’</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe width="200" height="113" src="https://www.youtube.com/embed/82Rz43tUeYA?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen="" title="Generating Swift code on an M1 Max w/ 64 GB RAM using Qwen2.5-Coder-32B with Ollama on Sequoia 15.1"></iframe><figcaption><p><span style="white-space: pre-wrap;">Realtime genAI code on M1 Mac "generate a simple login flow in Swift" using Qwen</span></p></figcaption></figure><p>After one more prompt, I integrated the <code>LoginViewController</code> with the SwiftUI <code>ContentView</code> and got it working. The local model is a bit slow, so I’ll likely stick with hosted models, but this is an incredible free option! 
🤯</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/11/image-2.png" class="kg-image" alt="" loading="lazy" width="2000" height="1298" srcset="https://danielraffel.me/content/images/size/w600/2024/11/image-2.png 600w, https://danielraffel.me/content/images/size/w1000/2024/11/image-2.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/11/image-2.png 1600w, https://danielraffel.me/content/images/size/w2400/2024/11/image-2.png 2400w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Alex Sidebar added the code into my project </span></figcaption></figure><p>Thanks to Daniel Edrisian, founder of Alex Sidebar, for troubleshooting and answering a few questions via Discord! </p><hr><h3 id="additional-notes-on-using-ngrox">Additional Notes on Using ngrok</h3><p>Free ngrok accounts can only have one active tunnel:</p><ul><li>To kill any running tunnels via the web, you can use their <a href="https://dashboard.ngrok.com/agents?ref=danielraffel.me"><u>Dashboard</u></a>, or</li><li>To kill tunnels via the terminal, run: <code>pkill ngrok</code></li></ul><p>To validate that the Qwen model is being served via the tunnel, run the following command in a terminal. Note: update <code>randomsubdomain.ngrok-free.app</code> with the public URL <a href="https://dashboard.ngrok.com/get-started/setup/macos?ref=danielraffel.me" rel="noreferrer">provided by ngrok</a> and replace <code>YOUR_AUTH_TOKEN</code> with the one <a href="https://dashboard.ngrok.com/get-started/your-authtoken?ref=danielraffel.me" rel="noreferrer">provided by ngrok</a>.</p><pre><code class="language-bash">curl -X POST https://randomsubdomain.ngrok-free.app/api/generate \
&nbsp; -H "Content-Type: application/json" \
&nbsp; -H "Authorization: Bearer YOUR_AUTH_TOKEN" \
&nbsp; -d '{
&nbsp; &nbsp; &nbsp; &nbsp; "model": "qwen2.5-coder:32b",
&nbsp; &nbsp; &nbsp; &nbsp; "prompt": "build a little webpage"
&nbsp; &nbsp; &nbsp; }'</code></pre> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Safety Questions About Zoox&#x27;s Autonomous Taxi Rollout in SF ]]></title>
        <description><![CDATA[ Zoox recently announced their plans to launch an autonomous taxi service in San Francisco, and a few aspects of their announcement stood out to me. ]]></description>
        <link>https://danielraffel.me/2024/11/12/safety-questions-about-zooxs-autonomous-taxi-rollout-in-sf/</link>
        <guid isPermaLink="false">6732d8b8ee9b01034ba8627b</guid>
        <category><![CDATA[ 🌁 San Francisco ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 11 Nov 2024 20:46:25 -0800</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/11/DALL-E-2024-11-11-20.44.28---An-illustration-inspired-by-Paul-Rand-to-accompany-a-blog-post-about-the-rollout-of-Zoox-s-autonomous-taxi-in-San-Francisco.-The-vehicle-resembles-the.webp" medium="image"/>
        <content:encoded><![CDATA[ <p>Zoox <a href="https://zoox.com/journal/zoox-robotaxi-in-san-francisco?ref=danielraffel.me" rel="noreferrer">recently announced</a> their plans to launch an autonomous taxi service in San Francisco, and a few aspects of their announcement stood out to me.</p><p>First, Zoox’s vehicle design is custom-made. Since it's not sold directly to consumers, the vehicle hasn't been evaluated by the <a href="https://www.nhtsa.gov/ratings?ref=danielraffel.me" rel="noreferrer">NHTSA</a> to determine how it performs in crash tests. While these autonomous taxis will be navigating city streets where speed limits average around 25 mph, it’s worth noting that speeding is common, and vehicles on the road are getting heavier, whether they’re larger SUVs, trucks, or EVs packed with batteries. I’d like to see Zoox consider creating and sharing detailed crash test safety data for added rider assurance.</p><p>Additionally, Zoox’s CEO, Aicha Evans, stated that they have “passed all critical safety measures,” but the announcement didn’t link to any specific data or sources. This leaves me wondering whether these are internal safety standards, independent third-party verifications, or a mix of both. It would be helpful to know exactly what “critical safety measures” Zoox vehicles have passed.</p><p>I'm probably being a bit cynical, but the announcement struck me as somewhat disingenuous, given Zoox’s claim that they want to engage with the community and answer questions. While they express a desire for open dialogue, they haven’t provided any direct means to contact them with questions or concerns. I’d have preferred to share this feedback privately rather than in a public forum.</p><p>I’m a strong believer that our roads will become safer with the introduction of more automation. 
However, a rollout like this would benefit from greater transparency—especially around vehicle safety data, the standards they’re using to evaluate the fleet’s readiness, and an accessible channel for community feedback. After all, we’re all sharing the road, and building trust requires open communication.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Built a Real-Time Solar Dashboard for My EG4 18kPV Inverter Using Home Assistant ]]></title>
        <description><![CDATA[ I built a Home Assistant dashboard to monitor real-time solar data from my EG4 18kPV inverter. This post details how I used a third-party dongle from MonitorMy.Solar to access my solar data and bypass delays on the manufacturer’s portal. ]]></description>
        <link>https://danielraffel.me/2024/10/28/how-i-built-a-real-time-solar-dashboard-for-my-eg4-18kpv-inverter-using-home-assistant/</link>
        <guid isPermaLink="false">671eee0e9406ea035017dff2</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 28 Oct 2024 12:19:00 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/10/DALL-E-2024-10-28-12.12.29---A-visually-abstract-representation-inspired-by-Paul-Rand-s-style--focusing-on-geometric-shapes--clean-lines--and-playful-minimalism.-The-design-commun.png" medium="image"/>
        <content:encoded><![CDATA[ <p>A few days ago, I installed a solar system and wanted a way to monitor real-time data from my <a href="https://eg4electronics.com/categories/inverters/eg4-18kpv-12lv-all-in-one-hybrid-inverter/?ref=danielraffel.me" rel="noreferrer">EG4 18kPV hybrid solar inverter</a>. However, I found the <a href="http://monitor.eg4electronics.com/WManage/web/monitor/inverter?ref=danielraffel.me" rel="noreferrer">EG4 monitoring portal</a> frustrating, as it often displays data with a delay of around five minutes. To get faster updates, I decided to take matters into my own hands.</p><p>I purchased a third-party dongle from <a href="https://monitormy.solar/detail/13?ref=danielraffel.me" rel="noreferrer">monitormy.solar</a>, which gave me direct access to the inverter's data. This dongle gives me control over where I share data:<br><br>1. <a href="http://monitoring.monitormy.solar/?ref=danielraffel.me">MonitorMy.Solar</a> – a cloud-hosted portal managed by the dongle’s manufacturer<br>2. <a href="http://monitor.eg4electronics.com/WManage/web/monitor/inverter?ref=danielraffel.me" rel="noreferrer">EG4 monitoring portal</a> – the inverter's official cloud-hosted portal<br>3. <a href="https://www.home-assistant.io/integrations/mqtt/?ref=danielraffel.me" rel="noreferrer">Home Assistant via MQTT Broker</a> – a managed or self-hosted Home Assistant instance. <em>Note: For compatibility reasons, you will need to ensure HA is configured to use 24-hour time, MQTT is set up with a username and password, and the MQTT connection does not use SSL.</em> *</p><p>I can choose to share data with any, all, or none of these options. For local-only access, the dongle also provides a real-time monitoring portal. 
This flexibility lets me stop sharing with EG4 or any other platform whenever I want.</p><h2 id="my-current-setup">My Current Setup</h2><p>Now, my solar data flows directly to a Home Assistant instance running on a Google Cloud e2-micro VM, where I’ve built a very simple real-time dashboard.</p><p>For the dashboard design, I used Dante Winters’ <a href="https://github.com/DanteWinters/lux-power-distribution-card?ref=danielraffel.me">Lux Power Distribution Card</a> in Home Assistant to get an interactive power flow diagram. While the default setup for that card relies on a LuxpowerTek integration hosted in a private repo by <a href="https://github.com/guybw?ref=danielraffel.me">Guy Wells</a>, I discovered this isn’t necessary if you have the <a href="https://monitormy.solar/detail/13?ref=danielraffel.me" rel="noreferrer">monitormy.solar</a> dongle. Instead, the sensor data is accessible via the dongle, though it requires a few minor adjustments to point the card at the dongle's sensor states, which I’ve outlined in my <a href="https://github.com/danielraffel/luxpower-dashboard/blob/main/lux.yaml?ref=danielraffel.me" rel="noreferrer"><code>lux.yaml</code></a>.</p><p>If you're curious, the dongle offers <a href="https://docs.google.com/spreadsheets/d/e/2PACX-1vR4tZ2UopYAXOcbB-V2PBKrQcNggaNLVUVmEvsgTGT4QR2sZgZDx9IUCbyvXlMDCiloVYx5vaqqbb4f/pubhtml?ref=danielraffel.me" rel="noreferrer">extensive access to EG4 sensor data,</a> and I highly recommend it for anyone wanting more control and better monitoring. 
Below, I’ve included a screenshot of my real-time dashboard, and the full setup instructions are available in my GitHub repository.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/screenshot.png" class="kg-image" alt="" loading="lazy" width="294" height="639"><figcaption><span style="white-space: pre-wrap;">Today’s: ☀️ Solar Yield | 🔋 Charged | 🪫 Drained | 🏡 Power Consumed</span></figcaption></figure><h2 id="get-the-code">Get the Code</h2><p>You can find my GitHub repository here:</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/danielraffel/luxpower-dashboard/tree/main?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - danielraffel/luxpower-dashboard: Simple realtime Home Assistant dashboard for monitoring an EG4 18kPV Hybrid Inverter solar power system</div><div class="kg-bookmark-description">Simple realtime Home Assistant dashboard for monitoring an EG4 18kPV Hybrid Inverter solar power system - danielraffel/luxpower-dashboard</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://github.githubassets.com/assets/pinned-octocat-093da3e6fa40.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">danielraffel</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://opengraph.githubassets.com/6d35ce4d6cfddc8d465ce1f04529166162e0b3c5fdc12abc02ffff8ee6da9f3f/danielraffel/luxpower-dashboard" alt="" onerror="this.style.display = 'none'"></div></a></figure><p><br>The dongle has been a game-changer, allowing me to monitor my solar power system reliably and without delays. 
If you're looking to bypass the laggy portals and gain full control of your solar data, I hope this guide and my dashboard can help you do the same!</p><p>Huge thanks to <a href="https://monitormy.solar/about?ref=danielraffel.me" rel="noreferrer">Zak</a> at MonitorMySolar for all his support!</p><hr><p>Note: My installer created conduit holes on the side of the EG4, just below the dongle’s attachment point, leaving no room for the antenna. To solve the space issue, I connected the dongle with its antenna to the EG4 using an HDMI cable and a <a href="https://www.amazon.com/AmazonBasics-Female-Coupler-Adapter-Black/dp/B06XR8RBTQ/ref=asc_df_B06XR8RBTQ/?tag=hyprod-20&linkCode=df0&hvadid=693569091832&hvpos=&hvnetw=g&hvrand=17757436991679685470&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9031942&hvtargid=pla-348800026707&psc=1&mcid=e6ff4ec1fbde3bf4acf7cd8cd5d9e25b&ref=danielraffel.me" rel="noreferrer">Female-to-Female HDMI adapter</a>, allowing me to place the dongle on top of the EG4 unit.</p><p>* The GitHub repository details how I configured my dongle to send data to a local MQTT broker on my Tailscale network. That machine then securely relays the data over Tailscale to the cloud-based Home Assistant VM.</p><p>Note: The EG4 uses 24-hour time format, so be sure to set it accordingly—it may not be immediately obvious in the user interface. Otherwise, your data in Home Assistant will be off.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Plan Your Golden Gate Bridge Crossing: Ideal for Cyclists, Runners and Walkers ]]></title>
        <description><![CDATA[ I created a Golden Gate Bridge Live Weather Viewer because I often cycle from San Francisco to Marin County and found myself unprepared for the weather on the bridge. ]]></description>
        <link>https://danielraffel.me/2024/10/22/plan-your-golden-gate-bridge-crossing-ideal-for-cyclists-runners-and-walkers/</link>
        <guid isPermaLink="false">670e2b8325b9e30349c05143</guid>
        <category><![CDATA[ 👨‍💻 Software ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 21 Oct 2024 21:36:23 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/10/ggb.png" medium="image"/>
        <content:encoded><![CDATA[ <p><strong>TL;DR:</strong>&nbsp;I built a site to identify <a href="http://danielraffel.github.io/ggb/?ref=danielraffel.me" rel="noreferrer">The Best Time for Athletes to Cross the Golden Gate Bridge</a>, which is great for cyclists, runners and walkers. </p><p>As someone who regularly cycles from San Francisco to Marin, I often found myself unprepared for the weather on the Golden Gate Bridge, where the temperature can be 10-15 degrees cooler by the time I arrive than when I leave home.</p><p>To address this, I created a site that offers real-time weather conditions—temperature, wind speed, rain probability, and sunset time—so you can dress accordingly before heading out. It also suggests the two best times for your crossing today.</p><p>Given that we’re living in the future, I built the site using <a href="https://www.flowvoice.ai/?ref=danielraffel.me">Flow</a> to quickly dictate prompts to <a href="https://www.cursor.com/?ref=danielraffel.me" rel="noreferrer">Cursor</a>, an AI code editor.&nbsp;<strong>Flow</strong>&nbsp;speeds up writing using voice commands, while&nbsp;<strong>Cursor</strong>&nbsp;handles the coding. The site is built with&nbsp;<strong>HTML5, Tailwind CSS, JavaScript</strong>, and&nbsp;<strong>Axios</strong>, which pulls real-time weather data from the&nbsp;<a href="https://open-meteo.com/en/docs?ref=danielraffel.me" rel="noreferrer"><strong>Open-Meteo API</strong></a>. The project is served through GitHub Pages using HTML files.</p><p>I previously detailed how I use <a href="https://danielraffel.me/til/2024/09/12/how-to-streamline-your-web-tasks-by-integrating-browserless-playwright-and-changedetection-io-with-n8n/" rel="noreferrer">N8N with Browserless and Playwright</a>. 
<a href="https://github.com/danielraffel/ggb/blob/main/n8n_workflow.json?ref=danielraffel.me" rel="noreferrer">This N8N workflow</a> captures a screenshot of the <a href="https://www.goldengate.org/bridge/visiting-the-bridge/current-weather/?ref=danielraffel.me" rel="noreferrer">Golden Gate Bridge webcam</a> every 5 minutes and posts it to the project’s GitHub repo. </p><p>Since athletes in SF often leave home, cross the bridge, and continue with a loop through the Headlands or other parts of Marin, the site includes a key feature: the ability to set time offsets. These offsets account for both the time it takes to reach the bridge from home and the time spent on the other side before returning. The personalized offsets are saved in your browser’s local storage, making it easy to align your typical workout routine with real-time weather forecasts. This allows you to quickly check conditions before heading out, ensuring you’re dressed appropriately for both legs of your trip. Additionally, the suggested crossing times highlight the best windows today, providing potential inspiration for your next outing.</p><p>Whether you walk, run, or cycle, I hope this tool helps you plan your visit to the Golden Gate Bridge better. 
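</p><p>The offset logic itself is simple arithmetic; here's a minimal Python sketch of the idea (my paraphrase, not the site's actual JavaScript):</p>

```python
from datetime import datetime, timedelta

def crossing_windows(departure, to_bridge_min, away_min):
    """Estimate both bridge crossings: outbound = departure plus the
    time to reach the bridge; inbound = outbound plus the time spent
    on the far side before returning."""
    outbound = departure + timedelta(minutes=to_bridge_min)
    inbound = outbound + timedelta(minutes=away_min)
    return outbound, inbound

# Leave home at 8:00, 40 minutes to the bridge, 90 minutes in the Headlands:
out_t, in_t = crossing_windows(datetime(2024, 10, 22, 8, 0), 40, 90)
# out_t is the 8:40 outbound crossing; in_t is the 10:10 return crossing
```

<p>The site keeps these two offsets in local storage and looks up the forecast at both computed crossing times.</p><p>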
As the saying goes, "There’s no bad weather, just bad gear!"</p><hr><p>I developed two slightly different versions of the site:</p><ol><li><a href="http://danielraffel.github.io/ggb/?ref=danielraffel.me" rel="noreferrer">The Best Time for Athletes to Cross the Golden Gate Bridge Today</a> is designed for cyclists and runners who want to:<ol><li>View a real-time image of the bridge</li><li>Check the expected weather for their planned crossings</li><li>Access a daily forecast</li><li>Get recommendations for the best times to cross today</li></ol></li><li><a href="http://danielraffel.github.io/ggb/crossingforecast.html?ref=danielraffel.me" rel="noreferrer">Golden Gate Bridge Weather Forecast for Cyclists and Runners</a> is tailored for athletes who know how long it takes to reach the bridge and simply want to:<ol><li>View a real-time image of the bridge</li><li>Check the expected weather for their planned crossings</li></ol></li></ol><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/IMG_3587.JPG" class="kg-image" alt="" loading="lazy" width="786" height="2732" srcset="https://danielraffel.me/content/images/size/w600/2024/10/IMG_3587.JPG 600w, https://danielraffel.me/content/images/2024/10/IMG_3587.JPG 786w" sizes="(min-width: 720px) 720px"><figcaption><a href="http://danielraffel.github.io/ggb/?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">The Best Time for Athletes to Cross the Golden Gate Bridge Today</span></a></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/IMG_3589.JPG" class="kg-image" alt="" loading="lazy" width="786" height="1552" srcset="https://danielraffel.me/content/images/size/w600/2024/10/IMG_3589.JPG 600w, https://danielraffel.me/content/images/2024/10/IMG_3589.JPG 786w" sizes="(min-width: 720px) 720px"><figcaption><a 
href="http://danielraffel.github.io/ggb/crossingforecast.html?ref=danielraffel.me" rel="noreferrer"><b><strong style="white-space: pre-wrap;">Golden Gate Bridge Weather Forecast for Cyclists and Runners</strong></b></a></figcaption></figure><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://github.com/danielraffel/ggb?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - danielraffel/ggb</div><div class="kg-bookmark-description">Contribute to danielraffel/ggb development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/pinned-octocat-093da3e6fa40.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">danielraffel</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/ggb" alt="" onerror="this.style.display = 'none'"></div></a></figure><p>Note: Just after publishing this I made a sister site to identify <a href="http://danielraffel.github.io/bakarfitness?ref=danielraffel.me" rel="noreferrer">the best time to swim on the rooftop pool at UCSF Bakar Gym</a>.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/IMG_3592.JPG" class="kg-image" alt="" loading="lazy" width="786" height="1954" srcset="https://danielraffel.me/content/images/size/w600/2024/10/IMG_3592.JPG 600w, https://danielraffel.me/content/images/2024/10/IMG_3592.JPG 786w" sizes="(min-width: 720px) 720px"><figcaption><a href="http://danielraffel.github.io/bakarfitness?ref=danielraffel.me" rel="noreferrer"><span style="white-space: pre-wrap;">The best time to swim on the rooftop pool at UCSF Bakar Gym</span></a></figcaption></figure><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" 
href="https://github.com/danielraffel/bakarfitness?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GitHub - danielraffel/bakarfitness</div><div class="kg-bookmark-description">Contribute to danielraffel/bakarfitness development by creating an account on GitHub.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://github.githubassets.com/assets/pinned-octocat-093da3e6fa40.svg" alt=""><span class="kg-bookmark-author">GitHub</span><span class="kg-bookmark-publisher">danielraffel</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://opengraph.githubassets.com/dc3511097a668d322b0805f557a3dd0014cfe8efdad485349a1af2046b1251a0/danielraffel/bakarfitness" alt="" onerror="this.style.display = 'none'"></div></a></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ What&#x27;s Going On at SpeCEOlized? ]]></title>
        <description><![CDATA[ As a Specialized fan, I’m struggling to get excited about their current ebike offerings. The bike industry is certainly struggling right now, and Specialized reflects that instability, with three CEOs in two years and unclear leadership. ]]></description>
        <link>https://danielraffel.me/2024/10/09/whats-going-on-at-specialized/</link>
        <guid isPermaLink="false">67058f5be6ad71034b1c155d</guid>
        <category><![CDATA[ 🤔 Thinking about ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 09 Oct 2024 12:58:37 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/10/68cc0964-0bf1-41cf-9181-48d96e34edf3.png" medium="image"/>
        <content:encoded><![CDATA[ <p>As a <a href="http://specialized.com/?ref=danielraffel.me" rel="noreferrer">Specialized</a> fan considering an eMTB and an urban e-bike commuter, I'm finding it hard to get excited about their offerings right now, even though these segments seem like major growth areas. I recently received an email about the Turbo Vado SL 2 Carbon LTD, with the tagline "<a href="https://milled.com/specialized.com/it-took-50-years-of-innovation-to-create-this-bike-EO0LFGkY6KeoP7h5?ref=danielraffel.me" rel="noreferrer">It Took 50 Years of Innovation to Create This Bike</a>." It raised my expectations, but I was let down to see it was launching with a motor from a year ago that was not particularly inspiring.</p><p>Compared to the latest motors from <a href="https://www.bosch-ebike.com/us/products/drive-units?ref=danielraffel.me" rel="noreferrer">Bosch</a>, <a href="https://pinion.eu/en/?ref=danielraffel.me" rel="noreferrer">Pinion</a>, and <a href="https://www.dji.com/avinox?ref=danielraffel.me" rel="noreferrer">DJI</a>, I consider the Specialized motor outdated. Marketing a bike as cutting-edge while using stale tech makes me question their commitment to leading the market. 
I respect Specialized’s effort to develop custom software for their rebranded <a href="https://www.brose-ebike.com/de-en/?ref=danielraffel.me" rel="noreferrer">Brose</a> motors, but I’m not convinced this approach is working, as these motors are not the best in their class and seem to be falling behind in terms of specs.</p><p>After watching a video comparing some of the latest eMTB motors, I came away convinced that Pinion has the most future potential, and that Bosch and DJI are setting the bar right now.</p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/UyQpMH4NEtI?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen="" title="DJI vs Bosch vs Pinion: Unbelievable Difference!!"></iframe></figure><p>If Specialized wants to retain riders like me who want the latest and greatest tech, I think they need to rapidly step up their offerings or embrace partnerships with different industry leaders. Without this, I suspect many of us will be drawn to other bike brands featuring more enticing technology.</p><p>Additionally, I’ve heard that Specialized has been putting all sorts of pressure on independent bike shops, making it harder for these local businesses to stay afloat, despite relying on them for sales. The company itself seems to be in some disarray, with three CEOs in two years, and even a lack of clarity on LinkedIn about who's actually in charge right now. I really hope they get things together, because Specialized has an amazing history of making great products—but as the bike world electrifies, they seem to be falling behind.</p><p><em>At the time of writing, these are the company’s three publicly listed CEOs. 
While LinkedIn profiles may not be the most critical detail to focus on, they do reflect a certain level of attention to detail.</em></p><figure class="kg-card kg-image-card"><a href="https://www.linkedin.com/in/armin-landgraf-5a19a35/?ref=danielraffel.me"><img src="https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-09-at-12.24.31-PM.png" class="kg-image" alt="" loading="lazy" width="1046" height="586" srcset="https://danielraffel.me/content/images/size/w600/2024/10/Screenshot-2024-10-09-at-12.24.31-PM.png 600w, https://danielraffel.me/content/images/size/w1000/2024/10/Screenshot-2024-10-09-at-12.24.31-PM.png 1000w, https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-09-at-12.24.31-PM.png 1046w" sizes="(min-width: 720px) 720px"></a></figure><hr><figure class="kg-card kg-image-card"><a href="https://www.linkedin.com/in/scott-maguire/?ref=danielraffel.me"><img src="https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-09-at-12.24.19-PM.png" class="kg-image" alt="" loading="lazy" width="1076" height="492" srcset="https://danielraffel.me/content/images/size/w600/2024/10/Screenshot-2024-10-09-at-12.24.19-PM.png 600w, https://danielraffel.me/content/images/size/w1000/2024/10/Screenshot-2024-10-09-at-12.24.19-PM.png 1000w, https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-09-at-12.24.19-PM.png 1076w" sizes="(min-width: 720px) 720px"></a></figure><hr><figure class="kg-card kg-image-card"><a href="https://www.linkedin.com/in/mike-sinyard-b882326/?ref=danielraffel.me"><img src="https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-09-at-12.23.58-PM.png" class="kg-image" alt="" loading="lazy" width="1088" height="614" srcset="https://danielraffel.me/content/images/size/w600/2024/10/Screenshot-2024-10-09-at-12.23.58-PM.png 600w, https://danielraffel.me/content/images/size/w1000/2024/10/Screenshot-2024-10-09-at-12.23.58-PM.png 1000w, 
https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-09-at-12.23.58-PM.png 1088w" sizes="(min-width: 720px) 720px"></a></figure><p><strong>Update November 19, 2024:</strong> After reaching out to&nbsp;<a href="mailto:pr@specialized.com">pr@specialized.com</a>&nbsp;to inquire about the CEO at Specialized and whether they actually had three, it seems they finally did a little LinkedIn profile cleanup. My outsider take is that this is a messy situation. The fact that they've got two folks sharing CEO titles, combined with the founder being the only one with a LinkedIn Pro account, implies to me that things might not be going smoothly over there. How is it possible that they don’t have CEO leadership based out of their HQ in Morgan Hill?! Keep an eye on Specialized—I suspect some housecleaning might be on the horizon.</p><figure class="kg-card kg-image-card kg-card-hascaption"><a href="https://www.linkedin.com/in/armin-landgraf-5a19a35/?ref=danielraffel.me"><img src="https://danielraffel.me/content/images/2024/11/armin-1.png" class="kg-image" alt="" loading="lazy" width="294" height="219"></a><figcaption><span style="white-space: pre-wrap;">Armin is CEO of Specialized from Switzerland. 
HQ is in Morgan Hill, CA.</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><a href="https://www.linkedin.com/in/scott-maguire/?ref=danielraffel.me"><img src="https://danielraffel.me/content/images/2024/11/scott.png" class="kg-image" alt="" loading="lazy" width="294" height="219"></a><figcaption><span style="white-space: pre-wrap;">It appears Scott is CEO of innovation (whatever that means) from Singapore.</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/11/mike.png" class="kg-image" alt="" loading="lazy" width="294" height="219"><figcaption><span style="white-space: pre-wrap;">And, nothing really changed with Mike.</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Resize in Affinity Using Math Formulas ]]></title>
        <description><![CDATA[ I recently switched from Photoshop to Affinity Photo and was exporting an image when I needed to resize it. I discovered that you can simply enter a formula in the parameter to resize it. ]]></description>
        <link>https://danielraffel.me/til/2024/10/08/resize-in-affinity-using-math-formulas/</link>
        <guid isPermaLink="false">6705995ee6ad71034b1c156e</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Tue, 08 Oct 2024 13:51:43 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/10/resizeabstract.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I recently switched from Photoshop to Affinity Photo and was exporting an image when I needed to resize it. I discovered that you can enter a formula like <code>/2</code> or <code>*1.2</code> in a dimension field, and after tabbing out of the parameter, it will resize proportionally.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/image-4.png" class="kg-image" alt="" loading="lazy" width="408" height="62"><figcaption><span style="white-space: pre-wrap;">/2 will resize to 50%</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/image-3.png" class="kg-image" alt="" loading="lazy" width="424" height="70"><figcaption><span style="white-space: pre-wrap;">*1.2 will resize to 120%</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ You can get locked out of your Google account if Authenticator is enabled and you lose access to it, even if 2FA phone verification is set up. Make sure to save those backup codes! ]]></title>
        <description><![CDATA[ I assumed that having a phone number set up as a 2FA method on my Google account would allow me to recover access if needed. However, it turns out that’s not the case if you’ve previously configured Google Authenticator. ]]></description>
        <link>https://danielraffel.me/til/2024/10/04/locked-out-of-google-account-set-up-with-authenticator/</link>
        <guid isPermaLink="false">66ff6354f7adba034e53bdce</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 04 Oct 2024 09:57:58 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/10/Image.png" medium="image"/>
        <content:encoded><![CDATA[ <p><strong>TL;DR:</strong> I assumed that having a phone number set up as a 2FA method on my Google account would allow me to recover access if needed. However, it turns out that’s not the case if you’ve previously configured Google Authenticator.</p><p>Back in 2014, I enabled <a href="https://support.google.com/accounts/answer/185839?hl=en&co=GENIE.Platform%3DAndroid&ref=danielraffel.me" rel="noreferrer">2-step verification</a> on a secondary Google account, configuring it with both <a href="https://support.google.com/accounts/answer/1066447?hl=en&co=GENIE.Platform%3DAndroid&ref=danielraffel.me" rel="noreferrer">Google Authenticator</a> and a 2-Step Verification phone number to receive verification codes. Unfortunately, sometime in the past decade, I reset the Android device with the Authenticator app that generated verification codes for this account. While I likely received backup codes when I first set up 2FA, it seems I didn’t save them.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/image.png" class="kg-image" alt="" loading="lazy" width="1420" height="712" srcset="https://danielraffel.me/content/images/size/w600/2024/10/image.png 600w, https://danielraffel.me/content/images/size/w1000/2024/10/image.png 1000w, https://danielraffel.me/content/images/2024/10/image.png 1420w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">2-Step Verification and Authenticator enabled with a phone number (hidden)</span></figcaption></figure><p>Recently, when logging in with my username and password, I received a 2-step code via SMS to my registered phone number. However, when trying to launch a Virtual Machine on Google Cloud, I was prompted for a verification code from Google Authenticator. Without Authenticator, I had no way to generate a code, and I couldn’t remove Authenticator as a verification option. 
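</p><p>The reason there's no workaround is that authenticator codes are never sent to you at all; they're computed on-device from a shared secret established at enrollment. Google Authenticator implements the standard TOTP algorithm (RFC 6238, built on RFC 4226's HOTP), which a short sketch makes clear. This is the generic algorithm, not Google's code, and losing the device means losing the only copy of the secret:</p>

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC-SHA1 of a counter, dynamically truncated
    to a short decimal code. The secret is the whole ballgame."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret: bytes, step: int = 30) -> str:
    """TOTP (RFC 6238): HOTP keyed by the current 30-second interval."""
    return hotp(secret, int(time.time()) // step)

# RFC 4226's published test secret; counter 0 yields the documented code:
print(hotp(b"12345678901234567890", 0))  # -> 755224
```

<p>Without that secret or the separately issued backup codes, there is simply nothing left to regenerate a code from; the SMS path is a distinct factor, not a substitute.</p><p>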
Every attempt to “try another way” only directed me back to Google Authenticator for verification.</p><p>At this point, the account is effectively dead to me, as privileged actions require an Authenticator code and I have no way to generate one. While I recognize that my own inaction contributed to this situation—and I’m fortunate it’s not a primary account I rely upon—it’s still frustrating.</p><p>I’m certain I received backup codes when I initially set up 2-Step Verification, and it was my own mistake not to save them. Still, I can’t shake the feeling that I’m completely locked out with no way to recover access, and it’s a bit surprising that having a 2FA phone number isn’t even a fallback option. I suppose the reality is that phone numbers are now considered an insecure verification method. I learned the hard way that SMS codes can’t replace Authenticator codes for every use case, so <strong>make sure you save those backup codes!</strong></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-04-at-9.40.09-AM.png" class="kg-image" alt="" loading="lazy" width="1790" height="982" srcset="https://danielraffel.me/content/images/size/w600/2024/10/Screenshot-2024-10-04-at-9.40.09-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2024/10/Screenshot-2024-10-04-at-9.40.09-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/10/Screenshot-2024-10-04-at-9.40.09-AM.png 1600w, https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-04-at-9.40.09-AM.png 1790w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Sign in with password</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-04-at-9.38.36-AM.png" class="kg-image" alt="" loading="lazy" width="1810" height="968" 
srcset="https://danielraffel.me/content/images/size/w600/2024/10/Screenshot-2024-10-04-at-9.38.36-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2024/10/Screenshot-2024-10-04-at-9.38.36-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/10/Screenshot-2024-10-04-at-9.38.36-AM.png 1600w, https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-04-at-9.38.36-AM.png 1810w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Challenged for Google Authenticator code</span></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-04-at-9.38.43-AM.png" class="kg-image" alt="" loading="lazy" width="1802" height="958" srcset="https://danielraffel.me/content/images/size/w600/2024/10/Screenshot-2024-10-04-at-9.38.43-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2024/10/Screenshot-2024-10-04-at-9.38.43-AM.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/10/Screenshot-2024-10-04-at-9.38.43-AM.png 1600w, https://danielraffel.me/content/images/2024/10/Screenshot-2024-10-04-at-9.38.43-AM.png 1802w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">No other verification option such as 2FA via Phone</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ 🍕Quick Detroit Cheese Pizza ]]></title>
        <description><![CDATA[ This same-day pizza dough for Detroit-style pan pies is fantastic because it requires minimal planning. In just 2 hours, you can be enjoying fresh pizza. ]]></description>
        <link>https://danielraffel.me/2024/09/30/quick-detroit-cheese-pizza/</link>
        <guid isPermaLink="false">66f897dd8aaa9d034b332b89</guid>
        <category><![CDATA[ 👨‍🍳 Cooking ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 30 Sep 2024 15:02:13 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/09/detroitpie.png" medium="image"/>
        <content:encoded><![CDATA[ <p>This same-day pizza dough for Detroit-style pan pies is fantastic because it requires minimal planning. In just 2 hours, you can be enjoying fresh pizza.</p><ul><li>Dough prep time: 15 minutes</li><li>Dough rising time: 1 hour 20 minutes</li><li>Baking time: 20 minutes</li><li>Total time with baking: approximately 2 hours</li></ul><h2 id="ingredients">Ingredients</h2><p>5 g yeast<br>255 g water (hot)<br>315 g all-purpose flour<br>5 g salt<br>5 g sugar<br>20 g olive oil</p><h2 id="directions">Directions</h2><ol><li>Dissolve the yeast in hot water in a Pyrex cup, stirring until completely absorbed. In the bowl of a stand mixer, mix the flour, salt, and sugar thoroughly. Next, add the activated yeast, followed by the oil.</li><li>In a stand mixer:</li></ol><ul><li>Mix on&nbsp;<strong>low </strong>for 2 minutes, then stop and scrape down the bowl.</li><li>Mix on&nbsp;<strong>medium-low </strong>for another 2 minutes, scrape down again.</li><li>Mix on&nbsp;<strong>medium </strong>for 2 more minutes.</li></ul><ol start="3"><li><strong>First Rise:</strong> Move dough to a large, oiled bowl, and cover it.&nbsp;</li></ol><ul><li>Proof the dough in an oven set to&nbsp;<strong>100°F for 45 minutes</strong></li><li>Add a generous amount of Extra Virgin Olive Oil (EVOO) on top.</li></ul><p>4. <strong>Second Rise:</strong> Butter the bottom and sides of your <a href="https://www.amazon.com/gp/product/B01FY5PHIK/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&th=1&ref=danielraffel.me" rel="noreferrer">10 × 14 pan</a>, then coat the bottom in a light amount of EVOO.&nbsp;</p><ul><li>Spread the dough gently to cover at least 70% of the pan. Generously oil the top of the dough again. Cover the pan with a towel.&nbsp;</li><li>Place the covered dough in a warm oven set to&nbsp;<strong>100°F for 35 minutes</strong>.</li></ul><p>5. 
<strong>Par-bake:</strong> After the second rise, remove the dough, setting it down gently to avoid encouraging it to collapse.&nbsp;</p><ul><li>Preheat your oven to 550°F with a rack in the middle.&nbsp;</li><li>Generously coat the top of the dough again with EVOO.&nbsp;</li><li>Par-bake the dough for <strong>5 minutes</strong> once the oven reaches temperature, <strong>rotating the pan halfway through</strong>. The dough should appear "dry" but not yet golden brown. This step ensures the crust rises well before adding toppings.</li></ul><p>6. Top the dough with:</p><ul><li>Cheese (0.8 pounds low-moisture mozzarella cut in cubes)</li><li>Sauce</li></ul><p>7. <strong>Final Bake:</strong> Return the pizza to the oven on the middle rack.&nbsp;</p><ul><li>Bake for 10-15 minutes, until the top turns golden and the cheese and oil are visibly sizzling and bubbling.</li></ul><p>8. Let the pizza rest for 5 minutes, or enjoy it later at room temperature.</p><ul><li>To detach the dough from the pan, use an offset spatula to gently chip away the crust from the edges, going around the entire pie. </li><li>Using a grill turner, slide the pizza lengthwise onto a cutting board. Then slice it with a chef's knife or a pizza wheel.</li></ul><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/IMG_3285.jpeg" class="kg-image" alt="" loading="lazy" width="2000" height="1500" srcset="https://danielraffel.me/content/images/size/w600/2024/09/IMG_3285.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2024/09/IMG_3285.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2024/09/IMG_3285.jpeg 1600w, https://danielraffel.me/content/images/size/w2400/2024/09/IMG_3285.jpeg 2400w" sizes="(min-width: 720px) 720px"></figure><p>Adapted from <a href="https://www.tumblr.com/mrgan/655448940618104833/the-fastest-pan-pizza?ref=danielraffel.me" rel="noreferrer">mrgan</a> to better suit the yeast activation process in my kitchen.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Effortlessly Keep Files In Sync Across Apple Devices Using iCloud Drive Without SyncThing or rsync ]]></title>
        <description><![CDATA[ I’m a big fan of SyncThing and use it to seamlessly sync folders across devices. However, now that iCloud Drive in macOS 15 and iOS 18 can automatically keep files and folders up-to-date, I can stop relying on SyncThing to keep my Apple devices in sync. ]]></description>
        <link>https://danielraffel.me/til/2024/09/28/how-to-effortlessly-keep-files-in-sync-across-apple-devices-using-icloud-drive-without-syncthing-or-rsync/</link>
        <guid isPermaLink="false">66f875976b0869034a32aef7</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sat, 28 Sep 2024 15:11:41 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/09/DALL-E-2024-09-28-15.09.03---A-minimalist-illustration-inspired-by-Paul-Rand-s-style--depicting-interconnected-devices-like-a-Mac--iPhone--and-iPad-syncing-files.-The-design-uses-.png" medium="image"/>
        <content:encoded><![CDATA[ <p><strong>TL;DR: On macOS 15 and iOS 18, iCloud Drive can now automatically keep your files and folders up-to-date. Learn how to set it up on </strong><a href="https://support.apple.com/guide/iphone/automatically-files-date-icloud-ipha40cebde0/ios?ref=danielraffel.me" rel="noreferrer"><strong>iOS</strong></a><strong> and </strong><a href="https://support.apple.com/guide/mac-help/work-with-folders-and-files-in-icloud-drive-mchl1a02d711/mac?ref=danielraffel.me" rel="noreferrer"><strong>macOS</strong></a><strong>.</strong></p><p>I'm a big fan of <a href="http://syncthing.net/?ref=danielraffel.me" rel="noreferrer">SyncThing</a>. I use it to seamlessly transfer files across devices. SyncThing's primary selling point for me is its ability to reliably, securely, and quickly keep cross-platform devices up-to-date without relying on a third-party intermediary. It also offers encryption and the ability to easily ignore specific file patterns. For automating file sync between devices without involving a third party, I have found it to be the most robust solution I've ever used.</p><p>While most of my needs only require a one-way sync, there are times when I need two-way sync. Usually that's when I want to download something, move it to another location on a different device, and then have that action remove the file from the original source device. In these cases, I rely on a device that's always on to monitor for new files and then perform actions that ultimately sync back to my iPhone or iPad afterward. For example, I might download a particular file type on one machine, automate uploading it on another machine, and then have that action remove it everywhere. 
I frequently use <a href="http://noodlesoft.com/?ref=danielraffel.me" rel="noreferrer">Hazel</a> on a Mac mini to monitor folders and automate tasks like moving, uploading, and deleting, while SyncThing ensures that the folders stay synchronized across devices.</p><p>Years ago, I tried using iCloud to sync folders across my Apple devices, with Hazel managing rules on my Mac mini. Unfortunately, iCloud didn't support always keeping local folders up-to-date, so it just didn't work. I couldn't rely on my Mac mini to actually download the file I put in a shared iCloud folder like Downloads. Frankly, I found it ridiculous. For a while I worked around this by using a <a href="https://github.com/Obbut/iCloud-Control?ref=danielraffel.me" rel="noreferrer">selective sync extension for macOS</a>, but even that eventually stopped working. </p><p>Thankfully, this has finally been remedied with macOS 15 Sequoia and iOS 18. Now, you can automatically keep your files and folders in iCloud up-to-date on <a href="https://support.apple.com/guide/iphone/automatically-files-date-icloud-ipha40cebde0/ios?ref=danielraffel.me" rel="noreferrer">iOS</a> and <a href="https://support.apple.com/guide/mac-help/work-with-folders-and-files-in-icloud-drive-mchl1a02d711/mac?ref=danielraffel.me" rel="noreferrer">macOS</a>.</p><p>This made me realize I can finally stop using SyncThing for scenarios where I want to share files across my iPhone and iPad. I can now download items to my iCloud Downloads folder on my iPhone, iPad, and Mac, and let Hazel monitor the folder and perform actions that keep everything in sync.</p><p>The timing couldn’t be better, as iOS currently has a bug that makes using the third-party SyncThing app almost impossible. Safari refuses to reliably save files to a custom local folder, and the SyncThing AppStore app is unable to monitor the iCloud Downloads folder. 
No matter how many times I change Safari’s download folder settings on iOS, it keeps reverting to a different location—maddening! </p><p>Anyway, as a result of switching to iCloud Drive for some basic two-way sync, I no longer need the SyncThing app. I've disabled background app refresh for it on my iPhone and iPad, hoping for a little better battery life. All of this is to say that with iCloud Drive, I can finally sync and monitor my folders the way it should have worked from the beginning.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Set Up Last.fm Scrobbling with Spotify and Apple Music circa 2024 ]]></title>
        <description><![CDATA[ It’s been over ten years since I last used Last.fm for scrobbling, but I finally decided to set it up again. Turns out, the official app isn’t well-supported anymore, so folks use third party apps like Finale and Scrobbles. ]]></description>
        <link>https://danielraffel.me/til/2024/09/23/how-to-set-up-last-fm-scrobbling-with-spotify-and-apple-music-after-a-decade-away/</link>
        <guid isPermaLink="false">66e6126db66e7503522a3309</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 23 Sep 2024 10:18:25 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/09/scrobble.adf4c976-3262-4ba2-a632-906d32e588bd.png" medium="image"/>
        <content:encoded><![CDATA[ <p>It's been over a decade since I last used <a href="https://www.last.fm/user/danielraffel?ref=danielraffel.me" rel="noreferrer">Last.fm</a> for scrobbling, but I finally decided to set it up again. While much of my media consumption is tracked and available via APIs, my music listening hasn't been—until now.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://www.last.fm/user/danielraffel?ref=danielraffel.me"><div class="kg-bookmark-content"><div class="kg-bookmark-title">danielraffel’s Music Profile | Last.fm</div><div class="kg-bookmark-description">Listen to music from danielraffel’s library (40,255 tracks played). danielraffel’s top artists: Ludovico Einaudi, Max Richter, Wim Mertens. Get your own music profile at Last.fm, the world’s largest social music platform.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://danielraffel.me/content/images/icon/lastfm_avatar_applemusic.b06eb8ad89be.png" alt=""><span class="kg-bookmark-author">Last.fm</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://danielraffel.me/content/images/thumbnail/4897e8402a4d4994c9cf68c3d6863c06.png" alt="" onerror="this.style.display = 'none'"></div></a></figure><p>Last.fm hasn’t been maintaining all their integrations, so I had to cobble together several methods to scrobble music from all the surfaces where I listen to music.</p><ul><li>On iOS, I’m using <a href="https://finale.app/?ref=danielraffel.me" rel="noreferrer">Finale</a>, an open-source app that scrobbles everything played in the Apple Music app.</li><li>For streaming services, <a href="https://www.last.fm/about/trackmymusic?ref=danielraffel.me#spotify" rel="noreferrer">I connected my Spotify account to Last.fm</a>.  
</li><li>On macOS, the only app that worked with the Apple Music desktop app for me was <a href="https://apps.apple.com/us/app/scrobbles-for-last-fm/id1344679160?mt=12&ref=danielraffel.me" rel="noreferrer">Scrobbles</a>. </li></ul><p>At some point, I'll pull my scrobble data into my journal.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How I Optimized My Development Workflow with Cursor Pro, ChatGPT and GitHub Desktop ]]></title>
        <description><![CDATA[ I recently adopted Cursor and refined some inefficient workflows, such as implementing better use of version control and eliminating copy-pasting between tools. These changes have led to more successful outcomes, making it easier to tackle complex coding projects with confidence. ]]></description>
        <link>https://danielraffel.me/til/2024/09/23/how-i-optimized-my-development-workflow-with-cursor-pro-chatgpt-and-github-desktop/</link>
        <guid isPermaLink="false">66f066d7d9062e034b228c75</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sun, 22 Sep 2024 19:41:17 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/09/laptop3-1.png" medium="image"/>
        <content:encoded><![CDATA[ <p>For the past few decades, I’ve worked as a Product Lead, primarily focused on defining what teams should build and why, rather than doing the coding myself. With the advent of LLM chat tools and coding assistants, I’ve been coding more frequently and tackling more ambitious side projects than I previously would have considered.</p><p>Initially, everything went smoothly—I began with smaller projects and saw quick wins using tools like ChatGPT and Claude. But as I took on more complex challenges, I started hitting roadblocks. After running into <a href="https://en.wikipedia.org/wiki/SNAFU?ref=danielraffel.me" rel="noreferrer">snafus</a> and abandoning some projects prematurely, I realized I needed to revisit my approach.</p><p>Stepping back, I concluded that my main problem, aside from needing to become a better developer, was inefficient workflows. I had grown far too reliant on copying and pasting between different windows, and I wasn’t regularly using version control or creating branches. This meant that when things went awry, I had no simple way to backtrack or return to a stable state. I ended up abandoning projects when they went off course and felt like they required too much effort to get back on track.</p><p>After a conversation with a friend (thanks, <a href="https://x.com/atupem?ref=danielraffel.me" rel="noreferrer">Carl</a>!) who shared his workflow, I decided to reassess my own approach. By incorporating a few best practices, such as regularly versioning my code and adopting tools like <a href="https://desktop.github.com/download/?ref=danielraffel.me" rel="noreferrer">GitHub Desktop</a> and <a href="https://www.cursor.com/pricing?ref=danielraffel.me" rel="noreferrer">Cursor Pro</a>, I’ve already noticed significant improvements. 
While it’s still early days, this new structure is helping me manage more complex projects with greater ease, and I feel much more confident about taking on bigger challenges without burning out. In this post, I’ll walk through the various changes I’ve adopted over the past week.</p><h3 id="streamlining-development-with-branches-and-frequent-small-commits-in-github">Streamlining Development with Branches and Frequent Small Commits in GitHub</h3><p>One of the first changes I implemented was to stop working directly on the main branch of my projects. Instead, I now create a dev branch where I commit more frequently, keeping changes smaller and more manageable. For specific features, I create dedicated branches, merging into the main branch only when I’m confident everything is in good shape. This approach helps me preserve my progress when things get messy. If I don’t want to merge all the changes from a branch, I <a href="https://docs.github.com/en/desktop/managing-commits/squashing-commits-in-github-desktop?ref=danielraffel.me" rel="noreferrer">squash</a> and merge only the relevant commits.</p><p>Switching to the <a href="https://github.com/apps/desktop?ref=danielraffel.me" rel="noreferrer">GitHub Desktop</a> client has also been helpful. It provides me with a much clearer overview of the changes I’m about to commit compared to the built-in source control tools in Cursor or my terminal. While the IDE’s tools are fine for quick commits or branch switching, for larger changes, I prefer the enhanced visibility that GitHub Desktop offers.</p><h3 id="development-planning-and-spec-creation-with-chatgpt">Development Planning and Spec Creation with ChatGPT</h3><p>For more complex tasks, I now start by mapping them out in ChatGPT before diving into coding. This approach ensures that I start coding with a clear sequence of steps and a starting direction, even if the plan changes. 
It also helps me offer more focused guidance to the coding assistant I'm using.</p><p>After outlining a plan, I either start working directly on the dev branch for smaller changes or, for larger tasks, I create one or more GitHub issues where I add my spec and set up a dedicated branch. This approach lets me switch tasks or step away without losing track of important details. Keeping tasks organized and tied to specific branches makes it much easier to pick up where I left off after a break.</p><p>I have found that creating specs using <a href="https://platform.openai.com/docs/models/o1?ref=danielraffel.me" rel="noreferrer">o1-preview</a> is particularly effective when refactoring complex sections of code. When I aim to enhance or expand an MVP with new features and capabilities, having a detailed, step-by-step plan not only streamlines the refactor but also uncovers insights I hadn’t anticipated. Following those steps and committing incremental changes leads to noticeably improved outcomes.</p><h3 id="transitioning-from-web-chat-tools-to-cursor-for-coding">Transitioning from Web Chat Tools to Cursor for Coding</h3><p>The hardest adjustment for me was shifting to doing all of my coding in an IDE, rather than regularly copy-pasting between the IDE and web-based chat tools. Fortunately, a <a href="https://www.jointakeoff.com/courses/cursor?ref=danielraffel.me" rel="noreferrer">course on using Cursor</a> by <a href="https://x.com/mckaywrigley?ref=danielraffel.me">Mckay Wrigley</a> made the transition quick and easy.</p><p>To ease into this new approach, I started using web-based chat assistants for strategic planning—figuring out how to approach a problem—and relying on Cursor’s coding assistant for tactical execution. This separation of tasks allows me to use each tool in a more focused manner. 
While it’s not strictly necessary to work this way, it has been a helpful way for me to transition to spending more of my time in Cursor.</p><p>The end result is a more efficient workflow where I can develop a clear, opinionated approach outside the IDE and use that output to guide how I work in Cursor. In turn, Cursor enables me to write higher quality code in an environment that tracks both code changes and chat history that contributed to development.</p><h2 id="why-cursor-is-a-game-changer-for-me">Why Cursor is a Game Changer for Me</h2><p>Implementing best practices like using Git, tracking issues, and creating specs has been valuable, but adopting Cursor has been the real game changer for me. Although I enjoyed using Copilot in VS Code, I still found myself regularly leaving the IDE for additional coding assistance. Cursor, on the other hand, provides a range of features that enable me to complete my work without ever needing to leave the editor.</p><p>Cursor’s built-in chat assistant <a href="https://docs.cursor.com/advanced/models?ref=danielraffel.me" rel="noreferrer">supports multiple models</a>, allowing me to switch between the best-suited ones for different tasks. I can also integrate third-party <a href="https://docs.cursor.com/advanced/api-keys?ref=danielraffel.me" rel="noreferrer">API keys</a> for added flexibility. By indexing my entire codebase, Cursor makes it much easier to integrate new features or libraries and apply suggested changes efficiently. Additionally, with the ability to @mention files, folders, web resources, or even my entire codebase, I can quickly reference the latest documentation for APIs, SDKs, or languages. 
Cursor is bundled with shortcuts to success.</p><h2 id="key-features">Key Features</h2><p>Cursor offers several features that significantly enhance the coding experience.</p><h3 id="codebase-indexing">Codebase Indexing</h3><p><a href="https://docs.cursor.com/context/codebase-indexing?ref=danielraffel.me" rel="noreferrer">Codebase indexing</a> generates embeddings for each file in my codebase, improving the accuracy of code suggestions. The index stays in sync with my latest changes, ensuring responses are always up-to-date and relevant.</p><h3 id="long-context-chat">Long-Context Chat</h3><p>Codebase indexing shines when combined with <a href="https://docs.cursor.com/chat/overview?ref=danielraffel.me#long-context-chat-beta" rel="noreferrer">long-context chat</a>. <a href="https://docs.cursor.com/advanced/models?ref=danielraffel.me#long-context-only-models" rel="noreferrer">Models with support for larger context windows</a> allow me to include entire folders in conversations, enabling Cursor to offer comprehensive feedback based on broader sections of my codebase.</p><h3 id="mentions-and-chat-history">@Mentions and Chat History</h3><p>What convinced me to switch to Cursor was the ability to <a href="https://docs.cursor.com/context/@-symbols/basic?ref=danielraffel.me" rel="noreferrer">reference specific files, folders, and code</a> directly in chats. By @mentioning them within the IDE, I stay in the flow without unnecessary interruptions. No more copying and pasting between windows. Plus, the <a href="https://docs.cursor.com/chat/overview?ref=danielraffel.me#chat-history" rel="noreferrer">chat history</a> is stored in the IDE, providing a central place to track past discussions and changes.</p><h3 id="adding-docs">Adding Docs</h3><p>One major advantage of using AI coding assistants is no longer needing to search through third-party docs when integrating new tools. Cursor has many third-party libraries indexed, accessible via mentioning them in chat with @Docs. 
For documentation that hasn’t been indexed yet, I can <a href="https://docs.cursor.com/context/@-symbols/@-docs?ref=danielraffel.me#add-custom-docs" rel="noreferrer">easily add it</a> by pasting a URL to the support materials, and Cursor will index it for use in prompts.</p><h3 id="apply-feature">Apply Feature</h3><p>The <a href="https://docs.cursor.com/chat/apply?ref=danielraffel.me" rel="noreferrer">apply</a> feature is another favorite of mine, allowing me to quickly integrate code suggestions from the chat. After applying, I can review the diffs and choose to accept or reject the changes. <br><br>Occasionally, I notice that this feature suggests applying the correct changes to the wrong file. Thankfully, it’s easy to spot when this happens, and I can either manually fix it by copying and pasting the code into the right file or ask the assistant to correct it. </p><p>Numerous features in Cursor require hands-on use to get a proper feel for them. Sometimes this is due to their non-deterministic behavior, while other times it's because they are a bit buggy or just slow. For example, I initially thought the apply feature wasn’t working because it takes a few moments to generate changes in local files, but I was simply being impatient with it. I’ve learned to give certain features in Cursor more than one try to work correctly.</p><h3 id="cursor-tab">Cursor Tab</h3><p><a href="https://www.cursor.com/cpp?ref=danielraffel.me" rel="noreferrer">Cursor Tab</a> is a powerful autocomplete tool that goes beyond simple suggestions.&nbsp;It predicts entire edits based on your recent changes and codebase knowledge, suggesting edits across multiple lines and even fixing your mistakes.</p><h3 id="ai-rules">AI Rules</h3><p>A handy feature is the ability to add <a href="https://docs.cursor.com/context/rules-for-ai?ref=danielraffel.me" rel="noreferrer">custom AI rules</a>, tailoring the assistant’s responses to fit my specific workflow. 
This ensures it focuses on the most helpful actions and avoids less useful suggestions. Apparently, there's a <a href="https://cursor.directory/?ref=danielraffel.me" rel="noreferrer">third-party directory of suggestions</a>, but <a href="https://x.com/mckaywrigley/status/1832936256298520602?ref=danielraffel.me" rel="noreferrer">I currently use this one</a>.</p><h3 id="markdown-files"><strong>Markdown Files</strong></h3><p>Since large language models (LLMs) are trained on extensive markdown (.md) content and have well-structured knowledge of it, markdown files work great with assistants in Cursor for providing actionable, step-by-step instructions. Coding chat assistants can generate relevant context that can be easily acted upon. For example, whenever you save changes to a note in Cursor, you can reference it in chat. Here's how:</p><ul><li>Create a markdown file in Cursor and reference it in the chat.</li><li>By mentioning the markdown file in the chat, the assistant can use it as context and even act upon it. </li></ul><p>For instance, <a href="https://x.com/imrat/status/1826638219733254616?ref=danielraffel.me" rel="noreferrer">Imran</a> realized you could make a markdown file that requests a visual component, uses the assistant to generate a suitable prompt, sends that prompt to <a href="https://v0.dev/?ref=danielraffel.me" rel="noreferrer">v0.dev</a>, and returns a link to view the visual component. If that's unclear, you can see an example of it in action <a href="https://www.youtube.com/watch?v=Ajr7c9o70lM&ref=danielraffel.me" rel="noreferrer">here</a>.</p><p>This method also works for managing projects in Composer and directing chats to update project status once things are implemented. 
You can <a href="https://www.youtube.com/watch?v=bAAbrhb3QoM&t=422s&ref=danielraffel.me" rel="noreferrer">watch how markdown works in Composer</a>.</p><h3 id="composer"><strong>Composer</strong></h3><p>Composer is a feature that lets you use natural language prompts to create new projects from scratch, enhance existing ones, or generate code snippets. With Composer Projects, it takes the functionality even further, offering a huge time-saving advantage when starting a project. You can see Composer in action <a href="https://www.youtube.com/watch?v=9yS0dR0kP-s&ref=danielraffel.me" rel="noreferrer">here</a>. </p><h3 id="final-thoughts">Final Thoughts</h3><p>Although I still have plenty of room to grow as a developer, the changes I’ve adopted this past week have already improved my productivity. Compared to my previous workflow, taking on ambitious projects now feels both simpler and more enjoyable, and I believe it will enable me to see more projects through to completion.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ First impressions of macOS 15 Sequoia and iOS 18 ]]></title>
        <description><![CDATA[ I updated to macOS 15 and iOS 18 on launch day and decided to jot down a few things I stumbled upon and / or tweaked right away—nothing comprehensive, just what caught my eye. ]]></description>
        <link>https://danielraffel.me/til/2024/09/16/how-to-remove-those-annoying-gutter-margins-for-perfect-window-tiling-in-macos-sequoia-15/</link>
        <guid isPermaLink="false">66e87b4fb66e7503522a3327</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 16 Sep 2024 11:49:40 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/09/tilesmacos.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I updated to macOS 15 and iOS 18 on launch day and decided to jot down a few things I stumbled upon and / or tweaked right away—nothing comprehensive, just what caught my eye.</p><h2 id="macos-15-sequoia">macOS 15 Sequoia</h2><h3 id="tiling-tips">Tiling tips</h3><p>You can arrange windows for a pixel-perfect fit in macOS 15. Here’s a quick rundown of how to tile windows:</p><ol><li>Hold down the Option (⌥) key while dragging to start tiling. Drag the window to the screen edges:<br>• Top Center edge: maximizes the window.<br>• Top Left or Right edges: split the screen in half.<br>• Bottom Corners: fills the respective bottom quarters of the screen.</li><li>Hover over the green button in the top-left corner for a point-and-click tiling menu.</li></ol><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/image-2.png" class="kg-image" alt="" loading="lazy" width="540" height="582"></figure><ol start="3"><li>By default, tiled windows have a small gutter around them, which you can disable in <strong>System Settings &gt; Desktop &amp; Dock</strong>:<br>• Scroll to the “Windows” section and uncheck <strong>Tiled windows have margins</strong>. That gutter was driving me crazy! Had to turn it off immediately.</li></ol><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/09/Screenshot-2024-09-16-at-11.38.25-AM.png" class="kg-image" alt="" loading="lazy" width="962" height="630" srcset="https://danielraffel.me/content/images/size/w600/2024/09/Screenshot-2024-09-16-at-11.38.25-AM.png 600w, https://danielraffel.me/content/images/2024/09/Screenshot-2024-09-16-at-11.38.25-AM.png 962w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">That gutter was killing me! 
Had to disable immediately.</span></figcaption></figure><h3 id="messenges-tapback-shortcut">Messages Tapback Shortcut</h3><p>Right-click a text message bubble (or press ⌘T) and you can Tapback with any emoji, not just the default six.</p><h3 id="voice-memo-audio-transcriptions">Voice Memo Audio Transcriptions</h3><p>In the Voice Memos app, you can now transcribe recorded conversations by tapping the transcription icon—it turns blue when enabled.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/image-3.png" class="kg-image" alt="" loading="lazy" width="430" height="72"></figure><h3 id="new-passwords-app">New Passwords App</h3><p>I decided to export my passwords from 1Password to a CSV file and import them into the new Passwords app. I was hoping to have a more up-to-date backup in case I get locked out of 1Password. However, the CSV import wasn't comprehensive, as <a href="https://support.1password.com/export/?ref=danielraffel.me" rel="noreferrer">only the 1Password Unencrypted Export (.1pux) format</a> supports custom fields like security questions, linked items, linked apps, and two-factor authentication backup codes. It seems like <a href="https://support.1password.com/1pux-format/?ref=danielraffel.me" rel="noreferrer">.1pux is proprietary</a> to 1Password, and I couldn’t find an easy way to automate importing the data from that file. 
So much for trying to create a full backup.</p><h3 id="reminders-in-calendar">Reminders in Calendar </h3><p>It feels a bit slower than using the Reminders app, but apparently I can now create reminders directly in the Calendar app.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/image-4.png" class="kg-image" alt="" loading="lazy" width="656" height="986" srcset="https://danielraffel.me/content/images/size/w600/2024/09/image-4.png 600w, https://danielraffel.me/content/images/2024/09/image-4.png 656w"></figure><h3 id="iphone-mirroring">iPhone Mirroring</h3><p>I found this feature really useful—sometimes when I’m working on my Mac, I want to stream music from my iPhone to my AirPlay speakers. Now, I can do it right from my Mac without needing to pick up my phone.</p><h3 id="xcode-16">Xcode 16</h3><p><s>I can't open Xcode 15 and expected to see Xcode 16 in the macOS AppStore. Odder yet, I </s><a href="https://developer.apple.com/download/applications/?ref=danielraffel.me" rel="noreferrer"><s>can't download the update on the Apple Dev Site</s></a><s>.</s> Now available as a <a href="https://apps.apple.com/us/app/xcode/id497799835?mt=12&ref=danielraffel.me" rel="noreferrer">macOS AppStore update</a>.</p><h2 id="ios-18">iOS 18</h2><h3 id="safari">Safari</h3><p>Safari Tools is now accessible as an icon in the top left menu bar, and tapping it reveals options, including the ability to listen to a webpage.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/image-5.png" class="kg-image" alt="" loading="lazy" width="1179" height="1391" srcset="https://danielraffel.me/content/images/size/w600/2024/09/image-5.png 600w, https://danielraffel.me/content/images/size/w1000/2024/09/image-5.png 1000w, https://danielraffel.me/content/images/2024/09/image-5.png 1179w" sizes="(min-width: 720px) 720px"></figure><p>There are other Safari features, like Distraction Control, that are less useful 
to me since I use <a href="https://adguard.com/en/adguard-home/overview.html?ref=danielraffel.me" rel="noreferrer">AdGuard Home</a>, but you never know when you gotta block an annoying element on a webpage.</p><h3 id="disable-tinted-icons">Disable Tinted Icons</h3><p>I'm not a fan of the new tinted icons in iOS 18. To disable them, long-press anywhere on the Home Screen, tap "Edit" in the top left corner, then select "Customize" and choose "Light."</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/image-6.png" class="kg-image" alt="" loading="lazy" width="1179" height="548" srcset="https://danielraffel.me/content/images/size/w600/2024/09/image-6.png 600w, https://danielraffel.me/content/images/size/w1000/2024/09/image-6.png 1000w, https://danielraffel.me/content/images/2024/09/image-6.png 1179w" sizes="(min-width: 720px) 720px"></figure><h3 id="appstore">AppStore</h3><p>To view all reviews for a specific App and sort them by recent or other options, you now need to tap on the "Ratings &amp; Reviews" section header. This is definitely less obvious than it used to be, which benefits apps with a lot of recent negative reviews.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/image-7.png" class="kg-image" alt="" loading="lazy" width="1179" height="470" srcset="https://danielraffel.me/content/images/size/w600/2024/09/image-7.png 600w, https://danielraffel.me/content/images/size/w1000/2024/09/image-7.png 1000w, https://danielraffel.me/content/images/2024/09/image-7.png 1179w" sizes="(min-width: 720px) 720px"></figure><h3 id="view-electricity-usage-and-rates-in-homeapp">View electricity usage and rates in Home.app</h3><p>You can now monitor your home’s electricity usage in the Home app and easily track your consumption over time. 
<a href="https://support.apple.com/guide/iphone/view-electricity-usage-and-rates-iphb93a7973e/18.0/ios/18.0?ref=danielraffel.me" rel="noreferrer">Learn how to connect your utility</a>.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/10/image-1.png" class="kg-image" alt="" loading="lazy" width="1179" height="677" srcset="https://danielraffel.me/content/images/size/w600/2024/10/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2024/10/image-1.png 1000w, https://danielraffel.me/content/images/2024/10/image-1.png 1179w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ High Fives While Cycling ]]></title>
        <description><![CDATA[ Last year, I was riding on a cold, windy day, not exactly feeling up to cycling. As I pedaled along, I noticed a runner and couldn’t help but see a bit of myself in him, imagining he might also be feeling a bit reluctant about working out in this weather. ]]></description>
        <link>https://danielraffel.me/2024/09/13/high-fives-while-cycling/</link>
        <guid isPermaLink="false">66a7c9bc33dba5033472ee1f</guid>
        <category><![CDATA[ 🚴‍♂️ Biking ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 13 Sep 2024 10:53:52 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/09/DALL-E-2024-09-13-10.38.37---An-illustration-in-the-style-of-Paul-Rand--featuring-a-cyclist-and-a-runner-sharing-a-high-five-near-the-Golden-Gate-Bridge.-The-background-is-a-minim.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Late last year, I was riding near the Golden Gate Bridge on a cold, windy day, not exactly feeling up to cycling over to Marin. As I pedaled along, I noticed a runner approaching and couldn’t help but see a bit of myself in him, imagining he might also be feeling a bit reluctant about working out in this weather.</p><p>We made eye contact, I extended my hand out to the side, shared a brief smile, and as our paths crossed, we connected with a high-five. Little did I know that this would become a move on future rides. 🤣✋🚴‍♂️</p><p>A few weeks back, while <a href="https://www.strava.com/activities/12141310180/overview?utm_medium=web_embed&utm_source=activity_embed&strava_deeplink_url=strava%3A%2F%2Factivities%2F12141310180&_branch_match_id=1167627502776502459&_branch_referrer=H4sIAAAAAAAAA3WOywrCMBBFvyYubZMUKYKIIF25dR0m6UhDExvzKv69qS3uhNncOdzHEKMLx6oK0UOGPTi3N%2Fo5Vuqm4Q4tDy95Jqx5gDES1CiSN6dh8RB%2BIawrtznVZIsAFXXWUWMogjLaUE5r2tZFTRl91jgT3qVohcVeJ0v4dUYp0ErsCTssIEzJKyxgy3qvdLf2iB7RLQu%2FS9bfb8q%2F9g%2F%2BcP8H5QAAAA%3D%3D"><u>climbing up to the peak of Tahoe Donner</u></a>, someone cheered me on, and as I approached, I couldn’t resist saying, "Can I get a high five?" They ran over, and we connected. I’ve noticed that after every high-five, I gain a little burst of energy, and my cadence picks up.</p><p>I’ve started jokingly measuring my rides by the number of high-fives I collect. On a good day, I can easily rack up 3 (e.g., a 15-ride!). Of course, not every ride is high-five-worthy, but when those opportunistic moments materialize, it’s rare to be left hanging.</p><p>Occasionally, the people I ask for a high-five are those standing next to their cars, either admiring the scenery or getting ready to take photos. 
Part of me feels like extending a connection to drivers on the road is my way of saying, "Hey, bicyclists are your friends," while subversively implying, "Thanks for watching out for your pals next time you drive past one."</p><p>I don’t know if what I’m feeling is what some describe as <a href="https://en.wikipedia.org/wiki/Spirit_(animating_force)?ref=danielraffel.me"><u>universal energy</u></a>, but in the moment, it sure feels like there’s a shared understanding that we’re all in this together.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Set Up a Shortcut on iOS to Trigger a Photo ID Screenshot When Arriving at a Location ]]></title>
        <description><![CDATA[ I recently joined a gym. While they offer physical ID cards for check-in, most people are directed to use the mobile app to display a static barcode for entry. I set up an Apple Shortcut that automatically displays a screenshot of my gym ID when I arrive at the gym. ]]></description>
        <link>https://danielraffel.me/til/2024/09/13/how-to-set-up-a-shortcut-on-ios-to-trigger-a-photo-id-when-arriving-at-a-location-2/</link>
        <guid isPermaLink="false">66e468c90384c003413f7bdf</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 13 Sep 2024 10:21:59 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/09/DALL-E-2024-09-13-10.19.28---A-minimalist-and-geometric-design-inspired-by-Paul-Rand--representing-an-Apple-Shortcut-automation.-The-visual-features-a-simplified-mobile-phone-with.png" medium="image"/>
<content:encoded><![CDATA[ <p><strong>TL;DR: This Apple Shortcut and Automation automatically displays a screenshot of your photo ID when you approach a set location, saving you the time of manually locating and displaying it.</strong></p><p>I recently joined UCSF Bakar Gym in San Francisco. While they offer physical ID cards for check-in, most people are directed to use the mobile app to display a static UPC barcode for entry. Knowing this could get cumbersome for me, I decided to set up an Apple Shortcut that automatically displays a screenshot of my gym ID when I arrive within 200 meters of the gym.</p><p>I'm not sure how many others might find this useful, but here's a guide on setting up an automation to easily access a location-triggered screenshot of your photo ID.</p><h3 id="step-1-download-the-shortcut">Step 1: Download the Shortcut</h3><p>You can <a href="https://www.icloud.com/shortcuts/7042ea27263a481797404a941392d8f4?ref=danielraffel.me">download my shortcut here</a>. The Apple shortcut performs two simple actions.</p><ul><li><strong>Accesses the photo file</strong>: It retrieves the saved image of your gym ID (in this case, <code>UCSF_GYM_ID.jpg</code>) from your iPhone’s storage. <strong>Note:</strong> Be sure to update the path to your image since it won't find mine. 
😉</li><li><strong>Displays the ID in Quick Look</strong>: It opens a preview of the image so you can quickly show your ID at the gym.</li></ul><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/shortcut.png" class="kg-image" alt="" loading="lazy" width="319" height="693"></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/09/shortcut_ux.png" class="kg-image" alt="" loading="lazy" width="300" height="246"><figcaption><span style="white-space: pre-wrap;">Shortcut automation in the Shortcuts app which shows the image </span></figcaption></figure><h3 id="step-2-set-up-the-automation">Step 2: Set Up the Automation</h3><p>While the Apple Shortcut app doesn't allow sharing automations, I can walk you through how to configure it.</p><ol><li><strong>Location Trigger</strong>:</li></ol><ul><li>Open the Shortcuts app and navigate to the Automation tab.</li><li>Set the trigger to activate when you arrive at the gym's location (in my case, 1675 Owens St, SF, CA).</li><li>Choose a proximity radius (I set mine to 200 meters which is the smallest radius iOS allows) to ensure it triggers only when near the gym.</li></ul><ol start="2"><li><strong>Action</strong>:</li></ol><ul><li>Choose the "UCSF Gym ID" shortcut as the action to run when you arrive at the location.</li></ul><ol start="3"><li><strong>Run Confirmation</strong>:</li></ol><ul><li>Make sure to keep the "Run After Confirmation" setting enabled. 
This way, you’ll receive a notification asking if you want to run the shortcut when you arrive.</li></ul><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/automation_screen.png" class="kg-image" alt="" loading="lazy" width="320" height="693"></figure><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/automation_settings.png" class="kg-image" alt="" loading="lazy" width="320" height="693"></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/09/automation_ux2.png" class="kg-image" alt="" loading="lazy" width="320" height="244"><figcaption><span style="white-space: pre-wrap;">Personal Automation in the Shortcuts app which triggers the Shortcut</span></figcaption></figure><hr><p>Once set up, every time you arrive near the gym, you’ll get a notification on your phone. When you confirm the action, the screenshot of your gym ID will appear in Quick Look, ready to be scanned for entry. For me, I no longer have to manually open the gym's app (and wait for it to load) or search for the screenshot in my photo library, and the image can be easily dismissed after showing it to the staff.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How To Streamline Web Tasks by Integrating Browserless, Playwright and ChangeDetection.io with n8n ]]></title>
        <description><![CDATA[ I recently self-hosted ChangeDetection.io using Browserless and Playwright in Docker. My initial goal was to set up basic monitoring for website changes, with alerts on restocks and price drops. However, this setup unlocked possibilities for more advanced web automation tasks with n8n. ]]></description>
        <link>https://danielraffel.me/til/2024/09/12/how-to-streamline-your-web-tasks-by-integrating-browserless-playwright-and-changedetection-io-with-n8n/</link>
        <guid isPermaLink="false">66c0d0ece28a390342791830</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 12 Sep 2024 16:40:22 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/09/n8nchangedetection.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I recently self-hosted <a href="https://github.com/dgtlmoon/changedetection.io?ref=danielraffel.me">ChangeDetection.io</a> using <a href="https://docs.browserless.io/?ref=danielraffel.me">Browserless</a> and <a href="https://playwright.dev/?ref=danielraffel.me">Playwright</a> in Docker. My initial goal was straightforward: set up basic website monitoring with ChangeDetection.io's GUI. I figured I might find it useful to have a way to monitor websites for changes and <a href="https://changedetection.io/tutorial/vevor-tools-easily-monitor-restock-and-get-alerts-discounts?ref=danielraffel.me" rel="noreferrer">get alerts on restocks and price drops</a>.</p><p>After setting it up, I realized that I had an environment capable of executing Browserless API calls and running Playwright scripts. This opened the door to advanced web automation tasks like generating PDFs, capturing screenshots, and more—all integrated seamlessly with <a href="https://n8n.io/?ref=danielraffel.me">n8n</a>.</p><h2 id="why-integrate-changedetectionio-browserless-playwright-and-n8n">Why Integrate ChangeDetection.io, Browserless, Playwright and n8n?</h2><ul><li><a href="https://changedetection.io/?ref=danielraffel.me" rel="noreferrer">ChangeDetection.io</a> provides a simple frontend for monitoring website changes.</li><li><a href="http://browserless.js.org/?ref=danielraffel.me" rel="noreferrer">Browserless</a> offers a hosted browser solution, enabling headless Chrome automation.</li><li><a href="http://playwright.dev/?ref=danielraffel.me" rel="noreferrer">Playwright</a> is a powerful library for automating browser interactions.</li><li><a href="https://n8n.io/?ref=danielraffel.me">n8n</a> is a workflow automation tool that connects various services and APIs.</li></ul><p>By integrating these tools, it's trivial to have a self-hosted system for web monitoring and automation.</p><h2 id="setting-up-changedetectionio-with-browserless-and-playwright">Setting Up 
ChangeDetection.io with Browserless and Playwright</h2><p>I stumbled upon a <a href="https://www.reddit.com/r/selfhosted/comments/1afkymd/updated_my_setup_so_changedetectionio_works_with/?ref=danielraffel.me" rel="noreferrer">Reddit post</a> that linked to <a href="https://gist.github.com/benuski/f60c973424cb7c061ac103ea1a02ca96?ref=danielraffel.me" rel="noreferrer">this Gist</a> containing a useful <code>docker-compose.yml</code> file. This configuration integrates Browserless with Playwright, making it easy to get started.</p><h3 id="securing-your-setup-with-a-custom-token">Securing Your Setup with a Custom Token</h3><p>Security is important when exposing services like Browserless and Playwright. I modified the Docker Compose file to include a custom token for authentication. Here's how you can do it:</p><ol><li><strong>Generate a Token</strong>: Create a secure token that you'll use for authentication.</li><li><strong>Update <code>docker-compose.yml</code></strong>: Replace <code>YOUR_PLAYWRIGHT_TOKEN</code> with your generated token in the environment variables.</li></ol><h3 id="docker-compose-configuration">Docker Compose Configuration</h3><p>Below is the modified <code>docker-compose.yml</code> file with a custom token:</p><pre><code class="language-yaml">services:
  changedetection:
    image: ghcr.io/dgtlmoon/changedetection.io:latest
    container_name: changedetection
    hostname: changedetection
    volumes:
      - /home/path/files:/datastore  # Update path to your data volume
    environment:
      - PORT=20400  # Port for changedetection
      - PUID=1000  # Preferred user ID (avoid root/0 unless necessary in a managed environment)
      - PGID=1000  # Preferred group ID (avoid root/0 unless necessary in a managed environment)
      - PLAYWRIGHT_DRIVER_URL=ws://playwright-chrome:3000/chrome?token=YOUR_PLAYWRIGHT_TOKEN&amp;launch={"headless":false}  # WebSocket connection for Playwright
    ports:
      - 20400:20400  # Port mapping for changedetection
    restart: unless-stopped
    depends_on:
      - playwright-chrome

  playwright-chrome:
    hostname: playwright-chrome
    image: ghcr.io/browserless/chrome
    restart: unless-stopped
    environment:
      - TOKEN=YOUR_PLAYWRIGHT_TOKEN # Update using a secure token
      - SCREEN_WIDTH=1920
      - SCREEN_HEIGHT=1024
      - SCREEN_DEPTH=16
      - ENABLE_DEBUGGER=true
      - TIMEOUT=600000
      - CONCURRENT=15
    ports:
      - 20450:3000  # Port mapping for Playwright WebSocket connection</code></pre><h2 id="exploring-browserless-api-capabilities">Exploring Browserless API Capabilities</h2><p>With Browserless and Playwright set up, I explored the <a href="https://docs.browserless.io/docs/api.html?ref=danielraffel.me">Browserless API documentation</a> to see what was possible. Here are some of the key endpoints and their capabilities:</p><ul><li><strong>Load and Render HTML</strong>: Use <code>/content</code> to load a URL or HTML content and retrieve the fully rendered HTML after JavaScript execution.</li><li><strong>Download Files</strong>: Use <code>/download</code> to execute browser interactions and download files.</li><li><strong>Execute Code in Browser</strong>: Use <code>/function</code> to run custom Playwright code in a browser context, perfect for web scraping tasks.</li><li><strong>Generate PDFs</strong>: Use <code>/pdf</code> to create PDF documents from URLs or HTML content with customizable options.</li><li><strong>Capture Screenshots</strong>: Use <code>/screenshot</code> to take screenshots of webpages, supporting full-page captures and customizations.</li><li><strong>Scrape Data</strong>: Use <code>/scrape</code> to extract specific data using CSS or XPath selectors.</li></ul><h2 id="integrating-with-n8n-for-workflow-automation">Integrating with n8n for Workflow Automation</h2><p>In a <a href="https://danielraffel.me/self-hosting-n8n-google-cloud">previous post</a>, I detailed how to self-host n8n on Google Cloud. 
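</p><p>Independent of n8n, you can exercise any of these endpoints directly. Below is a minimal Python sketch of how a <code>/chrome/screenshot</code> call is shaped; the host, port, and token are hypothetical placeholders mirroring the Docker Compose file above, and it assumes Browserless's query-string token auth (header auth, as used in the n8n workflows later in this post, works too):</p><pre><code class="language-python">import json

# Hypothetical placeholders -- substitute your own host, port, and token
# (they mirror the docker-compose example above).
BROWSERLESS = "http://yourdomain:20450"
TOKEN = "YOUR_PLAYWRIGHT_TOKEN"

def screenshot_request(url, full_page=True):
    """Assemble the endpoint, headers, and JSON body for a screenshot call."""
    endpoint = f"{BROWSERLESS}/chrome/screenshot?token={TOKEN}"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({"url": url, "options": {"fullPage": full_page}})
    return endpoint, headers, body

endpoint, headers, body = screenshot_request("https://www.example.com")
print(endpoint)</code></pre><p>This only assembles the request; POST the body to the endpoint with any HTTP client and Browserless responds with the image bytes.</p><p>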
Building on that, I connected n8n with a self-hosted Browserless and Playwright setup to automate web tasks.</p><h3 id="setting-up-credentials-in-n8n">Setting Up Credentials in n8n</h3><p>To authenticate with your Browserless instance in n8n:</p><ol><li><strong>Create New Credentials</strong>:<ul><li>Go to <strong>Credentials</strong> in n8n.</li><li>Choose <strong>Header Auth</strong>.</li><li>Set the name to <code>Authorization</code> and the value to <code>YOUR_PLAYWRIGHT_TOKEN</code>.</li></ul></li><li><strong>Use Credentials in Workflows</strong>:<ul><li>In your HTTP Request nodes, select the Browserless credentials you just created.</li></ul></li></ol><h3 id="example-n8n-workflows">Example n8n Workflows</h3><p>I've created several workflows utilizing different Browserless API endpoints:</p><ul><li><strong>Screenshotting</strong>: Capture a screenshot of a URL.</li><li><strong>PDF Generation</strong>: Convert a web page into a PDF.</li><li><strong>Content Retrieval</strong>: Extract the HTML content from a specific URL.</li><li><strong>Downloading Files</strong>: Download files (e.g., ZIPs) from provided links.</li><li><strong>Scraping</strong>: Extract elements using CSS or XPath selectors.</li></ul><p>You can copy and paste the following workflow template to import all the examples into your n8n instance (also <a href="https://community.n8n.io/t/automate-screenshots-pdfs-and-more-integrating-n8n-with-self-hosted-browserless-playwright-changedetection-io/53351?ref=danielraffel.me" rel="noreferrer">shared on the n8n forum</a>).</p><pre><code class="language-JSON">{
	"name": "Browserless Examples [Shared]",
	"nodes": [
		{
			"parameters": {
				"content": "## /Screenshot URL\nhttps://docs.browserless.io/HTTP-APIs/screenshot\n\nHardcoded URL Example.com\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nPassing URL Example.com as a variable",
				"height": 552.0382521592817,
				"width": 414.07950814059393
			},
			"id": "3b89c6bb-b9cd-4bcd-8958-4dfbf87e39aa",
			"name": "Sticky Note1",
			"type": "n8n-nodes-base.stickyNote",
			"typeVersion": 1,
			"position": [
				-53.09026656387687,
				-60
			]
		},
		{
			"parameters": {
				"content": "## /content return HTML\nhttps://docs.browserless.io/HTTP-APIs/content\n\nHardcoded URL Example.com\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nPassing URL Example.com as a variable",
				"height": 549.8248303020775,
				"width": 415.7157914610786
			},
			"id": "e069f9c9-a524-4da0-b0e7-9ddb8f7f2317",
			"name": "Sticky Note",
			"type": "n8n-nodes-base.stickyNote",
			"typeVersion": 1,
			"position": [
				400,
				-60
			]
		},
		{
			"parameters": {
				"content": "## /download example\n\nHardcoded URL https://getsamplefiles.com/sample-archive-files/zip and downloads the first ZIP file\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nPassing URL, file type and first/last/all as a variable",
				"height": 543.2718166583338,
				"width": 436.9874746273785
			},
			"id": "5b5bef89-7cb2-4932-86cf-1cb695e1e46c",
			"name": "Sticky Note2",
			"type": "n8n-nodes-base.stickyNote",
			"typeVersion": 1,
			"position": [
				860,
				-60
			]
		},
		{
			"parameters": {
				"content": "## /function examples",
				"height": 543.1338634894774,
				"width": 404.2618082176863
			},
			"id": "f2a40685-047a-4a71-b314-7409ac83cfa3",
			"name": "Sticky Note3",
			"type": "n8n-nodes-base.stickyNote",
			"typeVersion": 1,
			"position": [
				-56.54071590859172,
				580
			]
		},
		{
			"parameters": {
				"content": "## /scrape example\nhttps://docs.browserless.io/HTTP-APIs/scrape\n\nHardcoded URL Example.com\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nPassing URL Example.com as a variable",
				"height": 549.7523374008159,
				"width": 454.9865911527092
			},
			"id": "1bbb57a1-87cc-48bb-8aa4-e7705d6d434f",
			"name": "Sticky Note5",
			"type": "n8n-nodes-base.stickyNote",
			"typeVersion": 1,
			"position": [
				860,
				580
			]
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/function",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "{\n  \"code\": \"export default async function ({ page }) {\\n    try {\\n      await page.goto('https://www.example.com', { \\n        waitUntil: 'networkidle2',\\n        timeout: 60000 // 60 seconds timeout\\n      });\\n      const screenshot = await page.screenshot({ fullPage: true });\\n      return {\\n        data: screenshot.toString('base64'),\\n        type: 'image/png'\\n      };\\n    } catch (error) {\\n      console.error('Error:', error);\\n      return {\\n        data: `Error: ${error.message}`,\\n        type: 'text/plain'\\n      };\\n    }\\n  }\\n\"\n}",
				"options": {
				}
			},
			"id": "8a91be2b-03f6-4060-93c7-bcd4e3c0e06f",
			"name": "HTTP Request5",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				120,
				860
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"jsCode": "// Set the URL in a variable\nconst url = 'https://www.example.com';\n\n// Loop over input items and add the URL to the JSON of each one\nfor (const item of $input.all()) {\n  item.json.url = url;\n}\n\n// Return the updated items\nreturn $input.all();"
			},
			"id": "1233d7c8-6218-4e3c-92ad-b3ed33fde0ba",
			"name": "Code",
			"type": "n8n-nodes-base.code",
			"typeVersion": 2,
			"position": [
				0,
				320
			]
		},
		{
			"parameters": {
				"jsCode": "// Set the URL in a variable\nconst url = 'https://www.example.com';\n\n// Loop over input items and add the URL to the JSON of each one\nfor (const item of $input.all()) {\n  item.json.url = url;\n}\n\n// Return the updated items\nreturn $input.all();"
			},
			"id": "be478e2d-0964-416d-9641-9deda9d92dfb",
			"name": "Code1",
			"type": "n8n-nodes-base.code",
			"typeVersion": 2,
			"position": [
				440,
				320
			]
		},
		{
			"parameters": {
				"jsCode": "// Set the URL in a variable\nconst url = 'https://www.example.com/';\n\n// Loop over input items and add the URL to the JSON of each one\nfor (const item of $input.all()) {\n  item.json.url = url;\n}\n\n// Return the updated items\nreturn $input.all();"
			},
			"id": "cc78803b-e207-4fd1-b650-1ff8f92054f1",
			"name": "Set URL",
			"type": "n8n-nodes-base.code",
			"typeVersion": 2,
			"position": [
				920,
				960
			]
		},
		{
			"parameters": {
				"jsCode": "// Set the URL in a variable\nconst url = 'https://www.example.com';\n\n// Loop over input items and add the URL to the JSON of each one\nfor (const item of $input.all()) {\n  item.json.url = url;\n}\n\n// Return the updated items\nreturn $input.all();"
			},
			"id": "31a1b79b-b10f-48c4-8f43-c30658976ae8",
			"name": "Set URL1",
			"type": "n8n-nodes-base.code",
			"typeVersion": 2,
			"position": [
				420,
				960
			]
		},
		{
			"parameters": {
				"content": "## /pdf example\nhttps://docs.browserless.io/HTTP-APIs/pdf\n\nHardcoded URL Example.com\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nPassing URL Example.com as a variable",
				"height": 549.7523374008159,
				"width": 454.9865911527092
			},
			"id": "5bd19d9b-258b-4a9d-9664-f648d9cbb8af",
			"name": "Sticky Note6",
			"type": "n8n-nodes-base.stickyNote",
			"typeVersion": 1,
			"position": [
				380,
				580
			]
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/screenshot",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "{   \"url\": \"https://www.example.com\",   \"options\": {     \"fullPage\": true   } }",
				"options": {
				}
			},
			"id": "48668f58-fed4-4a74-8b50-57d8dec68eef",
			"name": "HTTP Request: Screenshot",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				0,
				80
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/content",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "{\n  \"url\": \"https://www.example.com\"\n}",
				"options": {
				}
			},
			"id": "c473afb5-3b38-4d8a-9bee-4fbb95584a26",
			"name": "HTTP Request: Content",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				440,
				80
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/download",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "{\n  \"code\": \"export default async function ({ page }) {\\n  try {\\n    await page.goto('https://getsamplefiles.com/sample-archive-files/zip', { waitUntil: 'networkidle2', timeout: 120000 });\\n    console.log('Page loaded successfully');\\n    const zipLink = await page.$eval('a[href$=\\\\\\\".zip\\\\\\\"]', link =&gt; link.href);\\n    console.log(`Processing first ZIP link: ${zipLink}`);\\n    const response = await page.goto(zipLink, { waitUntil: 'networkidle2', timeout: 120000 });\\n    if (response.ok()) {\\n      const buffer = await response.buffer();\\n      return [{\\n        filename: zipLink.split('/').pop(),\\n        data: buffer.toString('base64'),\\n        type: 'application/zip'\\n      }];\\n    } else {\\n      console.error(`Failed to download: ${response.status()} ${response.statusText()}`);\\n      return {\\\"error\\\": `Failed to download: ${response.status()} ${response.statusText()}` };\\n    }\\n  } catch (error) {\\n    console.error('No matching ZIP file found or an error occurred:', error);\\n    return {\\\"error\\\": 'No matching ZIP file found or an error occurred' };\\n  }\\n}\\n\",\n  \"context\": {}\n}",
				"options": {
				}
			},
			"id": "15cf6c0b-458c-4a7a-8fc3-6f7cb666b307",
			"name": "HTTP Request: Download",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				920,
				80
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/function",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "{\n  \"code\": \"export default async function ({ page }) {\\n  try {\\n    const response = await page.goto('https://www.dundeecity.gov.uk/sites/default/files/publications/civic_renewal_forms.zip', { \\n      waitUntil: 'networkidle2',\\n      timeout: 60000 // 60 seconds timeout\\n    });\\n\\n    if (response.ok()) {\\n      const buffer = await response.buffer();\\n      return {\\n        data: buffer.toString('base64'),\\n        type: 'application/zip'\\n      };\\n    } else {\\n      throw new Error(`Failed to download: ${response.status()} ${response.statusText()}`);\\n    }\\n  } catch (error) {\\n    console.error('Error:', error);\\n    return {\\n      data: `Error: ${error.message}`,\\n      type: 'text/plain'\\n    };\\n  }\\n}\\n\"\n}",
				"options": {
				}
			},
			"id": "92c67358-e23b-4954-bd21-01430b255183",
			"name": "HTTP Request: Function",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				120,
				680
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/pdf",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "{\n  \"url\": \"https://www.example.com\",\n  \"options\": {\n    \"displayHeaderFooter\": true,\n    \"printBackground\": false,\n    \"format\": \"A4\"\n  }\n}",
				"options": {
				}
			},
			"id": "2996a1d2-70c1-469f-8610-c122f0cfc95c",
			"name": "HTTP Request: PDF",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				420,
				720
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/scrape",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "{\n  \"url\": \"https://www.example.com/\",\n  \"elements\": [\n    { \"selector\": \"h1\" }\n  ],\n  \"gotoOptions\": {\n    \"timeout\": 10000,\n    \"waitUntil\": \"networkidle2\"\n  }\n}",
				"options": {
				}
			},
			"id": "e6b2daae-b3ba-4cc4-b1f5-b58577c3b4f3",
			"name": "HTTP Request: Scrape",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				920,
				720
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/screenshot",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "={\n  \"url\": \"{{$json[\"url\"]}}\",\n  \"options\": { \"fullPage\": true }\n}",
				"options": {
				}
			},
			"id": "9513a03e-e37c-4d4d-80cb-076e212fcdb9",
			"name": "HTTP Request: Screenshot1",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				160,
				320
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/pdf",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "={\n  \"url\": \"{{$json[\"url\"]}}\",\n  \"options\": {\n    \"displayHeaderFooter\": true,\n    \"printBackground\": false,\n    \"format\": \"A4\"\n  }\n}",
				"options": {
				}
			},
			"id": "4bed427f-0c4e-4c5e-8fb5-611f483e86ed",
			"name": "HTTP Request: PDF1",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				580,
				960
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/content",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "={\n  \"url\": \"{{$json[\"url\"]}}\"\n}",
				"options": {
				}
			},
			"id": "73836d1e-638c-4937-a89a-b0c923bbcd52",
			"name": "HTTP Request: Content1",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				600,
				320
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/scrape",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "={\n  \"url\": \"{{$json[\"url\"]}}\",\n  \"elements\": [\n    { \"selector\": \"h1\" }\n  ],\n  \"gotoOptions\": {\n    \"timeout\": 10000,\n    \"waitUntil\": \"networkidle2\"\n  }\n}",
				"options": {
				}
			},
			"id": "bd0e9889-68d5-4ae0-95b1-96fb937b592d",
			"name": "HTTP Request: Scrape1",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				1080,
				960
			],
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"method": "POST",
				"url": "http://yourdomain:port/chrome/download",
				"authentication": "genericCredentialType",
				"genericAuthType": "httpHeaderAuth",
				"sendHeaders": true,
				"headerParameters": {
					"parameters": [
						{
							"name": "Content-Type",
							"value": "application/json"
						}
					]
				},
				"sendBody": true,
				"contentType": "raw",
				"rawContentType": "application/json",
				"body": "={\n  \"code\": \"export default async function ({ page }) {\\n  try {\\n    await page.goto('{{ $json[\"url\"] }}', { waitUntil: 'networkidle2', timeout: 120000 });\\n    console.log('Page loaded successfully');\\n    const links = await page.$eval('a[href$=\\\\\\\".{{ $json[\"fileType\"] }}\\\\\\\"]', links =&gt; links.map(link =&gt; link.href));\\n    let selectedLinks = [];\\n    if ('{{ $json[\"downloadOption\"] }}' === 'all') {\\n      selectedLinks = links;\\n    } else if ('{{ $json[\"downloadOption\"] }}' === 'first') {\\n      selectedLinks = links.length &gt; 0 ? [links[0]] : [];\\n    } else if ('{{ $json[\"downloadOption\"] }}' === 'last') {\\n      selectedLinks = links.length &gt; 0 ? [links[links.length - 1]] : [];\\n    }\\n    const results = [];\\n    for (const zipLink of selectedLinks) {\\n      console.log(`Processing ${zipLink}`);\\n      const response = await page.goto(zipLink, { waitUntil: 'networkidle2', timeout: 120000 });\\n      if (response.ok()) {\\n        const buffer = await response.buffer();\\n        results.push({\\n          filename: zipLink.split('/').pop(),\\n          data: buffer.toString('base64'),\\n          type: 'application/{{ $json[\"fileType\"] }}'\\n        });\\n      } else {\\n        console.error(`Failed to download: ${response.status()} ${response.statusText()}`);\\n        results.push({\\\"error\\\": `Failed to download: ${response.status()} ${response.statusText()}` });\\n      }\\n    }\\n    return results;\\n  } catch (error) {\\n    console.error('No matching {{ $json[\"fileType\"] }} file found or an error occurred:', error);\\n    return {\\\"error\\\": 'No matching {{ $json[\"fileType\"] }} file found or an error occurred' };\\n  }\\n}\\n\",\n  \"context\": {}\n}",
				"options": {
				}
			},
			"name": "HTTP Request: Download4",
			"type": "n8n-nodes-base.httpRequest",
			"typeVersion": 4.2,
			"position": [
				1120,
				320
			],
			"id": "8ba0010f-fa97-45cd-928e-01751688705d",
			"credentials": {
				"httpHeaderAuth": {
					"id": "NXo0Bj93cIusdtnD",
					"name": "Browserless: Header Auth Account"
				}
			}
		},
		{
			"parameters": {
				"jsCode": "return [\n  {\n    json: {\n      url: 'https://getsamplefiles.com/sample-archive-files/zip',\n      fileType: 'zip',\n      downloadOption: 'first' // Options: 'all', 'first', 'last'\n    }\n  }\n];"
			},
			"id": "8ecdcfac-2d65-45ec-a604-5f49ae66383f",
			"name": "Code4",
			"type": "n8n-nodes-base.code",
			"typeVersion": 2,
			"position": [
				920,
				320
			]
		},
		{
			"parameters": {
				"content": "## Note about the URL field\nReplace yourdomain:port with your actual domain.com:portt# like something.com:24000",
				"height": 80,
				"width": 1341.4849931447006
			},
			"id": "e4e9705d-4b08-447c-85fa-57d260697ad8",
			"name": "Sticky Note4",
			"type": "n8n-nodes-base.stickyNote",
			"typeVersion": 1,
			"position": [
				-60,
				-180
			]
		},
		{
			"parameters": {
			},
			"id": "2e4045a9-cb47-4a65-af34-aabaa47ba16c",
			"name": "When clicking ‘Test workflow’",
			"type": "n8n-nodes-base.manualTrigger",
			"typeVersion": 1,
			"position": [
				1440,
				940
			]
		}
	],
	"pinData": {
	},
	"connections": {
		"Code": {
			"main": [
				[
					{
						"node": "HTTP Request: Screenshot1",
						"type": "main",
						"index": 0
					}
				]
			]
		},
		"Code1": {
			"main": [
				[
					{
						"node": "HTTP Request: Content1",
						"type": "main",
						"index": 0
					}
				]
			]
		},
		"Set URL": {
			"main": [
				[
					{
						"node": "HTTP Request: Scrape1",
						"type": "main",
						"index": 0
					}
				]
			]
		},
		"Set URL1": {
			"main": [
				[
					{
						"node": "HTTP Request: PDF1",
						"type": "main",
						"index": 0
					}
				]
			]
		},
		"Code4": {
			"main": [
				[
					{
						"node": "HTTP Request: Download4",
						"type": "main",
						"index": 0
					}
				]
			]
		}
	},
	"active": false,
	"settings": {
		"executionOrder": "v1"
	},
	"versionId": "186e16fa-c115-441f-8517-1ce16f3f2695",
	"meta": {
		"templateCredsSetupCompleted": true,
		"instanceId": "84c8cadeffb0e45ffb93507bd03ee1ba65b1274dc2bab04cc058f9e6a2a130e1"
	},
	"id": "cqCiCDGyUb37qXGd",
	"tags": [

	]
}</code></pre><p><strong>Note</strong>: Aside from adding your header authentication credentials, the only other change you’ll need to make to the above workflow is to update the domain and port of your Changedetection.io (or Browserless/Playwright) instance in each node, as mentioned in the sticky note within the template.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/12/image-13.png" class="kg-image" alt="" loading="lazy" width="1272" height="1206" srcset="https://danielraffel.me/content/images/size/w600/2024/12/image-13.png 600w, https://danielraffel.me/content/images/size/w1000/2024/12/image-13.png 1000w, https://danielraffel.me/content/images/2024/12/image-13.png 1272w" sizes="(min-width: 720px) 720px"></figure><h2 id="benefits-of-this-setup">Benefits of This Setup</h2><p>This setup offers a self-hosted, cost-effective way to capture screenshots, generate PDFs, extract HTML content, automate file downloads, and scrape elements from web pages, reducing reliance on third-party services. I hope these tips are helpful for others looking to automate mundane tasks with Browserless, Playwright, and n8n.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Illuminate: A Glimpse of Google&#x27;s Cautious Innovation ]]></title>
        <description><![CDATA[ I recently came across the Illuminate &quot;Google Experiment&quot; and decided to give it a try. Illuminate is described as an AI-driven service that creates audio conversations between two AI voices, summarizing key points from selected computer science papers. ]]></description>
        <link>https://danielraffel.me/2024/09/05/illuminate-a-glimpse-of-googles-cautious-innovation/</link>
        <guid isPermaLink="false">66d9e4b9f9e34a05c61dcee2</guid>
        <category><![CDATA[ 🤔 Thinking about ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 05 Sep 2024 10:42:10 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/09/DALL-E-2024-09-05-10.31.38---An-abstract--minimalist-illustration-in-the-style-of-Paul-Rand-featuring-a-bold--geometric-design.-The-image-includes-two-figures-in-conversation--sym.png" medium="image"/>
        <content:encoded><![CDATA[ <p>I recently came across the <a href="https://illuminate.google.com/home?ref=danielraffel.me" rel="noreferrer">Illuminate</a> "Google Experiment" and decided to give it a try. Illuminate is described as an AI-driven service that creates audio conversations between two AI voices, summarizing key points from selected computer science papers. It allows users to search <a href="https://arxiv.org/?ref=danielraffel.me" rel="noreferrer">arxiv.org</a> or paste in URLs and generate up to five audio summaries per day.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/image.png" class="kg-image" alt="" loading="lazy" width="1968" height="660" srcset="https://danielraffel.me/content/images/size/w600/2024/09/image.png 600w, https://danielraffel.me/content/images/size/w1000/2024/09/image.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/09/image.png 1600w, https://danielraffel.me/content/images/2024/09/image.png 1968w" sizes="(min-width: 720px) 720px"></figure><p>While exploring the Home tab, I noticed a Books tab next to the Research Papers section, featuring audio examples from public domain works like Leo Tolstoy's&nbsp;<em>War and Peace</em>. Although you can't upload your own Books, which obviously raises intellectual property concerns, this feature seems far more useful than limiting the service to scientific papers.</p><p>Interestingly, the Books feature isn't mentioned anywhere in the help section which I've copy/pasted down below. 
It feels like the Books feature was quietly slipped in without much explanation or a clear strategy, perhaps to avoid drawing attention—but it certainly highlights how challenging it seems to launch truly compelling products at Google these days.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/09/image-1.png" class="kg-image" alt="" loading="lazy" width="1972" height="1370" srcset="https://danielraffel.me/content/images/size/w600/2024/09/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2024/09/image-1.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/09/image-1.png 1600w, https://danielraffel.me/content/images/2024/09/image-1.png 1972w" sizes="(min-width: 720px) 720px"></figure><p>One of the reasons I was initially excited about working at Google was their bold innovation across tech, product, legal and business in the early 2000s. When I see Google unveiling promising new technology restricted by an unfinished experiment that avoids taking risks, I can’t help but predict that services like Illuminate will soon be discontinued due to product limitations that prevent them from attracting a substantial user base.</p><p>Perhaps another team will eventually find a purpose for the technology and help it succeed, but I wonder why its full potential isn't embraced now and allowed to stand on its own. Google’s leadership seems more inclined toward a cautious, measured approach rather than launching bold initiatives that carry some risk. I hope this mindset shifts, as I’d love to see projects like Illuminate reach their full potential, with Google fully backing the ambitious visions its talented employees likely envision but struggle to get executive support for.</p><hr><p>Via the "How to use Illuminate" help page which I can't link to:</p><blockquote>What is Illuminate<br>Illuminate is an experimental technology that uses AI to adapt content to your learning preferences. 
After you select one or more published papers, Illuminate will generate audio with two AI-generated voices in conversation, discussing the key points of the selected papers. Illuminate is currently optimized for published computer science academic papers.</blockquote><blockquote>Selecting papers and generating an audio discussion</blockquote><blockquote>Once signed in, you can either search for a topic or a paper in&nbsp;<a href="http://arxiv.org/?ref=danielraffel.me">arxiv.org</a>&nbsp;or directly paste in the URL of a PDF from&nbsp;<a href="http://arxiv.org/?ref=danielraffel.me">arxiv.org</a>. Currently, users are permitted up to 5 audio generations per day.</blockquote><blockquote><strong>Search:</strong><br>Type the topic you are interested in the search box<br>- Tap on the search icon<br>- Select the paper you want Illuminate to use with the checkbox next to the title<br>- When your list is ready tap&nbsp;<em>Add</em><br>- Press&nbsp;<em>Generate</em></blockquote><blockquote><strong>PDF URL:</strong><br>- Paste the URL of a paper you want Illuminate to use<br>- Tap&nbsp;<em>Add</em><br>- Repeat the same process if you want to add more papers<br>- When your list is ready press&nbsp;<em>Generate</em></blockquote><blockquote>Depending on traffic and length of the selected papers, audio generation may take a few minutes or longer. In case you close the tab or lose connectivity during audio generation, the process will continue, and you can return to the generated audio conversation when you reload Illuminate.</blockquote><blockquote>When generation has completed, you can press the play button to listen to the new generated audio conversation. Tap the save button to save the generated audio conversation to your personal library. Generated audio conversations are deleted after 30 days unless you save them to your personal library.</blockquote><blockquote>Library<br>You can access the Library by tapping the Library button. 
The Library contains two sections:</blockquote><blockquote><strong>Personal</strong><br>Here you can find and manage all the audio conversations you generated with Illuminate and saved. Audio conversations you generate can be removed from your personal library by tapping&nbsp;<em>Delete</em>&nbsp;in the overflow menu.</blockquote><blockquote><strong>Public</strong><br>In this section, you can listen to a selection of publicly available generated audio conversations.</blockquote><blockquote>Sources<br>In the Library, each generated audio conversation is based on a published academic paper from&nbsp;<a href="http://arxiv.org/?ref=danielraffel.me">arxiv.org</a>. Tap&nbsp;<em>Source</em>&nbsp;to access the original paper and view its details, including title and author.</blockquote><blockquote>Listening to audio discussions<br>Tap&nbsp;<em>Play</em>&nbsp;to start listening to the generated audio conversation. You can control the playback speed as well as provide feedback on the quality of the generated content by tapping the thumbs-up/thumbs-down icons.</blockquote><blockquote>Feedback<br>During audio playback, you can use the thumbs-up and thumb-down buttons to rate the quality of the content. This is optional. If you choose to tap the thumbs-up or thumbs-down, we will ask you to fill out a follow up questionnaire to clarify your feedback. As an experimental product, know that we are continually iterating on the user experience. 
Providing feedback is important to help us improve the quality of Illuminate for all our users and we thank you for taking the time to share your experience with us.</blockquote><blockquote>Terms and Policies<br>Terms</blockquote><blockquote>When you use Illuminate, you are subject to the&nbsp;<a href="https://policies.google.com/terms?ref=danielraffel.me">Google Terms of Service</a>, including the&nbsp;<a href="https://policies.google.com/terms/generative-ai/use-policy?e=-IdentityBoqPoliciesUiGoodallSSAT%3A%3ALaunch%2CIdentityBoqPoliciesUiAdditionalAup%3A%3ALaunch&ref=danielraffel.me">Generative AI Prohibited Use Policy</a>.</blockquote><blockquote>Privacy<br>Our&nbsp;<a href="https://policies.google.com/privacy?ref=danielraffel.me">privacy policy</a>&nbsp;describes how Google handles your data when you interact with Illuminate. Please review it carefully.</blockquote><blockquote>Policies<br>Respect copyright laws. Do not share copyrighted content without authorization or provide links to sites where people can obtain unauthorized downloads of copyrighted content. It is our policy to respond to clear notices of alleged copyright infringement. Repeated infringement of intellectual property rights, including copyright, will result in account termination.</blockquote><blockquote>If you hear content in a generated audio conversation from Illuminate that you believe violates our policies, the law or your rights, let us know via this&nbsp;<a href="https://docs.google.com/forms/d/e/1FAIpQLScyL0G6p3_La-FJBEX6z5tbB4xGU8-Eowm_VVG4mrQccINN6g/viewform?usp=published_options&ref=danielraffel.me">direct link</a>.</blockquote> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Calendar Invites Not Showing Up and How to Fix It ]]></title>
        <description><![CDATA[ Today, someone sent me an invite to a Google Calendar event, but I never received it. At first, I suspected it might be due to a spam prevention feature blocking the invite from appearing on my calendar from an unknown sender. ]]></description>
        <link>https://danielraffel.me/til/2024/08/08/a-potential-fix-to-calendar-invites-not-showing-up-and-how-to-fix-it/</link>
        <guid isPermaLink="false">66b5400e71ed090339141c37</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 08 Aug 2024 16:30:26 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/08/DALL-E-2024-08-08-16.22.41---A-wordless-illustration-inspired-by-Paul-Rand-s-style--representing-the-concept-of-Apple-Calendar-invites-getting-stuck-in-the-Apple-Cloud-and-not-mak.png" medium="image"/>
        <content:encoded><![CDATA[ <p>Today, someone sent me an invite to a Google Calendar event, but I never received it. At first, I suspected it might be due to a spam prevention feature blocking the invite from appearing on my calendar from an <a href="https://support.google.com/calendar/answer/13159188?hl=en&co=GENIE.Platform%3DDesktop&oco=0&ref=danielraffel.me" rel="noreferrer">unknown sender</a>. After some investigation and searching, I stumbled upon a <a href="https://jimmytechsf.com/people-send-calendar-invites-dont-receive/?ref=danielraffel.me" rel="noreferrer">post from eight years ago</a>:</p><blockquote>If someone sends you a calendar invite but you don’t receive it, it’s very likely a very particular problem with a&nbsp;not-so-obvious solution. You likely&nbsp;have an iCloud&nbsp;<em>account</em>&nbsp;but you do not have iCloud&nbsp;<em>Calendar</em>&nbsp;enabled on your devices. When the sender sends an invite from their own iCloud calendar, Apple’s system sees that you (the recipient)&nbsp;is also on iCloud so instead of sending you&nbsp;an email Apple&nbsp;puts the invite right in your Calendar program. It’s very convenient if you are using iCloud Calendar. The problem is, if you don’t use iCloud Calendar (Maybe you’re using Google Calendar or Yahoo Calendar instead) that invite never gets to you and just hangs out in your iCloud account.</blockquote><p>That sounds like exactly what happened to me. My friend shared a screenshot that showed they had created an invite in Apple Calendar using my Gmail address. Since I have an iCloud account associated with that Gmail address but don’t actively use iCloud Calendar, I am left to presume that the invite got "stuck in iCloud" and never found its way to my Google Calendar.</p><p>To resolve this, I followed the instructions in the post and adjusted my iCloud Calendar settings to ensure I receive email notifications for invitations and updates. Hopefully, this will prevent any future mix-ups. 
The Apple iCloud Calendar interface has evolved slightly since that original post, but the solution is still largely the same:</p><ol><li>Visit <a href="https://www.icloud.com/calendar/?ref=danielraffel.me">iCloud Calendar</a> from a web browser and log in with your iCloud account.</li><li>Click the three-dot (overflow) menu in the top left corner, and when the popup appears, choose <strong>Settings</strong>.</li></ol><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/08/image.png" class="kg-image" alt="" loading="lazy" width="896" height="518" srcset="https://danielraffel.me/content/images/size/w600/2024/08/image.png 600w, https://danielraffel.me/content/images/2024/08/image.png 896w" sizes="(min-width: 720px) 720px"></figure><ol start="3"><li>In the Settings window, select <strong>Accounts</strong> and choose to have your invites delivered <strong>via Email</strong> instead of via In-app Notifications. I happen to have a few emails associated with my Apple iCloud account, which is why there are several listed below.</li></ol><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/08/image-1.png" class="kg-image" alt="" loading="lazy" width="1512" height="1170" srcset="https://danielraffel.me/content/images/size/w600/2024/08/image-1.png 600w, https://danielraffel.me/content/images/size/w1000/2024/08/image-1.png 1000w, https://danielraffel.me/content/images/2024/08/image-1.png 1512w" sizes="(min-width: 720px) 720px"></figure><p>This will hopefully help avoid any further issues with missing invites. Thanks to <a href="https://jimmytechsf.com/?ref=danielraffel.me#about" rel="noreferrer">Jimmy Obomsawin</a> for the tip.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Use Google Assistant Voice Commands to Control Devices Linked to Apple HomeKit ]]></title>
        <description><![CDATA[ This week, I was disappointed to discover that my Harmony Hub no longer worked with Google Assistant voice commands. I managed to get it working again by using Homebridge, some third-party Homebridge plugins, HomeKit, Automations and Routines. ]]></description>
        <link>https://danielraffel.me/til/2024/07/27/how-to-use-google-assistant-voice-commands-to-control-devices-linked-to-apple-homekit/</link>
        <guid isPermaLink="false">66a5290a33dba5033472ecbf</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sat, 27 Jul 2024 12:16:10 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/07/DALL-E-2024-07-27-11.59.53---A-stylized-illustration-in-the-style-of-Paul-Rand-showing-Apple-HomeKit-being-connected-to-Google-Assistant.-The-illustration-should-include-icons-or-.png" medium="image"/>
        <content:encoded><![CDATA[ <p>This week, I was disappointed to learn that my <a href="https://support.myharmony.com/en-us/hub?ref=danielraffel.me" rel="noreferrer">Harmony Hub</a> stopped working with Google Assistant voice commands. After some research, I found that <a href="https://www.reddit.com/r/logitechharmony/comments/wufxgu/harmony_home_hub_stopped_working_with_google_home/?ref=danielraffel.me" rel="noreferrer">some folks</a> had success with a hard reset of their Hub. However, since this didn't resolve the issue for everyone, I decided to avoid that potential headache. It's frustrating that third-party smart home products seem to be neglecting their integrations with Google.</p><p>To be fair, Harmony Hub <a href="https://support.myharmony.com/en-rs/harmony-remote-manufacturing-update?ref=danielraffel.me" rel="noreferrer">discontinued</a> their remote products in 2021. However, as of the time of this post, they clearly state on their website that they will continue to support these products and their integrations with Alexa and Google.</p><blockquote>I use my Harmony with Alexa and Google, will the integrations with 3rd parties change?<br><br>This decision only impacts the manufacturing of new Harmony remotes. <strong>We plan to continue to offer service and support.</strong></blockquote><p>My Harmony Hub uses IR to turn on my living room TV, set it to the correct HDMI port, turn on my amp, and set it to the correct input setting and volume level. I also use it to turn off these devices. For listening to music, I use it to turn on my amp and set it to the right input setting, and I use it to turn off the amp as well.</p><p>Having been a product lead who helped launch Google Assistant, home automation integrations, and routines, I have numerous smart home devices and Google speakers. 
Back in 2015, one of the first smart home integrations I enabled was the command, "Hey Google, turn on the TV," which coordinated actions across multiple devices. When this doesn't work, it's always been a nuisance to debug.</p><p>My household primarily relies on Apple devices and services. Since leaving Google, I've transitioned to using HomeKit and the Apple Home app for my smart home needs. However, I still depend on Google Assistant for a few custom voice commands because I have Google speakers and smart displays in every room in my house. </p><p>Since the Harmony Hub integration with Apple HomeKit is functioning properly, I decided to explore how to connect HomeKit to Google Assistant. <strong>My goal was to use Google Assistant voice commands to control devices linked to HomeKit. </strong>Here’s the process I followed to make it work.</p><h3 id="pre-requisites">Pre-requisites</h3><ul><li>Harmony Hub configured with the desired activities for voice command triggers.</li><li><a href="https://github.com/homebridge/homebridge?ref=danielraffel.me" rel="noreferrer">Homebridge</a> server connected to Apple HomeKit.</li><li>Harmony Hub connected via Homebridge using <a href="https://github.com/nicoduj/homebridge-harmony?ref=danielraffel.me">homebridge-harmony</a>. 
</li><li>An account set up with the Google Home App.</li><li>Device(s) that can respond to Hey Google voice commands.</li></ul><h3 id="this-post-explains-how-to">This post explains how to:</h3><ul><li>Configure these additional Homebridge plugins: <ul><li><a href="https://github.com/nfarina/homebridge-dummy?ref=danielraffel.me#readme" rel="noreferrer">Dummy Switches</a></li><li><a href="https://github.com/oznu/homebridge-gsh?ref=danielraffel.me" rel="noreferrer">Homebridge Google Smart Home</a></li></ul></li><li>Create some Apple Automations to trigger several Harmony activities.</li><li>Set up some Personal Routines in the Google Home app to use voice commands to trigger various Apple Home automations.</li></ul><h3 id="homebridge-server-dummy-switch-plugin-configuration">Homebridge Server Dummy Switch Plugin Configuration</h3><p>I already run a <a href="https://github.com/homebridge/homebridge?ref=danielraffel.me" rel="noreferrer">Homebridge</a> server and have been using the <a href="https://github.com/nfarina/homebridge-dummy?ref=danielraffel.me#readme" rel="noreferrer">Dummy Switches</a> plugin for other purposes, which I found useful in this situation. 
</p><p>I configured two stateful Dummy Switches on Homebridge: one called <code>DSwitch - TV</code> and the other <code>DSwitch - Stereo</code>.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/dswitchtv.png" class="kg-image" alt="" loading="lazy" width="900" height="982" srcset="https://danielraffel.me/content/images/size/w600/2024/07/dswitchtv.png 600w, https://danielraffel.me/content/images/2024/07/dswitchtv.png 900w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/dswitchstereo.png" class="kg-image" alt="" loading="lazy" width="900" height="992" srcset="https://danielraffel.me/content/images/size/w600/2024/07/dswitchstereo.png 600w, https://danielraffel.me/content/images/2024/07/dswitchstereo.png 900w" sizes="(min-width: 720px) 720px"></figure><h3 id="apple-home-app-automation-configuration">Apple Home App Automation Configuration</h3><p>I already have my Harmony Hub connected to the Apple Home app via Homebridge using <a href="https://github.com/nicoduj/homebridge-harmony?ref=danielraffel.me">homebridge-harmony</a>. I imported a few Harmony Hub activities to the Apple Home app to <code>switch on the TV</code> and <code>listen to music</code>.</p><p>Next, I configured a few accessory-controlled automations in the Apple Home app. 
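Before wiring up those automations, it may help to see how the two Dummy Switches described above are declared. A minimal sketch of the accessories section of Homebridge's config.json, based on the homebridge-dummy README (the switch names match my setup; adjust them to yours):

```json
"accessories": [
    {
        "accessory": "DummySwitch",
        "name": "DSwitch - TV",
        "stateful": true
    },
    {
        "accessory": "DummySwitch",
        "name": "DSwitch - Stereo",
        "stateful": true
    }
]
```

The `stateful` flag matters here: it keeps each switch in whatever state it was last set to, so the separate "turned on" and "turned off" automations can each fire reliably.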
</p><p>When <code>DSwitch - TV</code> is turned on, it triggers the Harmony Hub's <code>TV-Switch</code> activity, which powers on my TV and amp.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/TVon.png" class="kg-image" alt="" loading="lazy" width="350" height="758"></figure><p>When <code>DSwitch - TV</code> is turned off, it triggers the Harmony Hub to turn off the <code>TV-Switch</code> (which turns off my TV and Amp).</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/TVoff.png" class="kg-image" alt="" loading="lazy" width="350" height="758"></figure><p>When <code>DSwitch - Stereo</code> is turned on, it triggers the Harmony Hub to turn on the <code>listen to music</code> switch (which turns on my Amp).</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/stereoOn.png" class="kg-image" alt="" loading="lazy" width="350" height="758"></figure><p>When <code>DSwitch - Stereo</code> is turned off, it triggers the Harmony Hub to turn off the <code>listen to music</code> switch (which turns off my Amp).</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/stereoOff.png" class="kg-image" alt="" loading="lazy" width="350" height="758"></figure><h3 id="homebridge-server-google-smart-home-plugin-configuration">Homebridge Server Google Smart Home Plugin Configuration</h3><p>The <a href="https://github.com/oznu/homebridge-gsh?ref=danielraffel.me" rel="noreferrer">Homebridge Google Smart Home (GSH</a>) plugin allows adding devices from HomeKit to Google Assistant in the Google Home app. I followed these <a href="https://github.com/oznu/homebridge-gsh?tab=readme-ov-file&ref=danielraffel.me#installation-instructions" rel="noreferrer">instructions to configure the GSH plugin</a>. 
</p><p>It's mentioned in the instructions, but it's crucial to first configure this plugin and <strong>then</strong> add the connection to Homebridge in the Google Home app <strong>after</strong> setting up the Homebridge Dummy Switches and Apple Home app automations.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/gsh.png" class="kg-image" alt="" loading="lazy" width="900" height="532" srcset="https://danielraffel.me/content/images/size/w600/2024/07/gsh.png 600w, https://danielraffel.me/content/images/2024/07/gsh.png 900w" sizes="(min-width: 720px) 720px"></figure><h3 id="google-home-app-homebridge-configuration">Google Home App Homebridge Configuration</h3><p>After setting up the Dummy Switches in Homebridge, configuring the automations in the Apple Home app, and installing the GSH plugin in Homebridge, I followed the <a href="https://github.com/oznu/homebridge-gsh/wiki?ref=danielraffel.me#add-homebridge-to-google-home-app" rel="noreferrer">GSH Wiki instructions</a> to add Homebridge as a connection in the Google Home app. After connecting, I imported the two new Dummy Switches I created. </p><p>I believe I had to repeat the Homebridge setup process in the Google Home app to import the second Dummy Switch. 
On the <code>Add devices</code> screen, I needed to long press on the Homebridge integration and select <code>Check for new devices</code> to access the flow for selecting more devices to import.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/IMG_2266.png" class="kg-image" alt="" loading="lazy" width="350" height="758"></figure><h3 id="google-home-app-routines-configuration">Google Home App Routines Configuration</h3><p>Once the Dummy Switches appeared in the Google Home app, I moved them to my Living Room and created a few <a href="https://support.google.com/googlenest/answer/7029585?hl=en&co=GENIE.Platform%3DiOS&oco=0&ref=danielraffel.me" rel="noreferrer">Google Personal routines</a> with specific voice commands. </p><p>When I say to Google Assistant, "Hey Google, turn on the TV," it triggers the <code>DSwitch - Living Room TV</code> dummy switch in Homebridge (which triggers the Apple Home automation, turning on my TV and Amp).</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/gTurnonTV-1.png" class="kg-image" alt="" loading="lazy" width="350" height="758"></figure><p>When I say to Google Assistant, "Hey Google, turn off the TV," it triggers the <code>DSwitch - Living Room TV</code> dummy switch in Homebridge (which triggers the Apple Home automation, turning off my TV and Amp).</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/gTurnoffTV.png" class="kg-image" alt="" loading="lazy" width="350" height="758"></figure><p>When I say to Google Assistant, "Hey Google, turn on the Stereo," it triggers the <code>DSwitch - Stereo</code> dummy switch in Homebridge (which triggers the Apple Home automation, turning on my Amp).</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/gTurnonStereo.png" class="kg-image" alt="" loading="lazy" width="350" height="758"></figure><p>When I say 
to Google Assistant, "Hey Google, turn off the Stereo," it triggers the <code>DSwitch - Stereo</code> dummy switch in Homebridge (which triggers the Apple Home automation, turning off my Amp).</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/gTurnoffStereo.png" class="kg-image" alt="" loading="lazy" width="350" height="758"></figure><h3 id="conclusion">Conclusion</h3><p>Although this setup is <em>definitely</em> brittle, it succeeds at connecting Google Assistant to HomeKit. My Harmony Hub now responds to Google Assistant voice commands as it did before the Harmony integration with Google Home stopped working.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Adjust the Spring Tension on Shimano SPD-SL Pedals ]]></title>
        <description><![CDATA[ I&#39;ve been using Shimano SPD pedals for decades, but it never occurred to me that I could adjust the tension to make clipping in and out easier or harder. Today, I discovered a 2.5mm adjustment bolt on the top of the pedal and a tension indicator on the rear binding of each pedal. ]]></description>
        <link>https://danielraffel.me/til/2024/07/25/how-to-adjust-the-spring-tension-on-shimano-spd-sl-pedals/</link>
        <guid isPermaLink="false">66a28b8733dba5033472ec34</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 25 Jul 2024 10:56:58 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/07/DALL-E-2024-07-25-10.50.12---A-Shimano-bike-pedal-being-adjusted--focusing-on-the-tension-bolt--illustrated-in-the-style-of-Paul-Rand.-The-image-features-clean-lines--bold-colors-.png" medium="image"/>
<content:encoded><![CDATA[ <p>I've been using Shimano SPD pedals for decades, but it never occurred to me that I could adjust the tension to make clipping in and out easier or harder. Today, I discovered a 2.5mm adjustment bolt on the top of the pedal. Turning this bolt clockwise increases the tension, while turning it counterclockwise decreases it. There's a tension indicator on the rear binding of each pedal.</p><p>As you turn the adjustment bolt, it clicks, changing the tension one step at a time, with four clicks per turn. This allows you to fine-tune the spring force, setting how firmly the binding holds the cleat and how much effort it takes to release. To ensure equal spring tension on both pedals, you can refer to the tension indicators and count the number of turns of the adjustment bolts.</p><p>The <a href="https://ride.shimano.com/blogs/technologies/how-to-adjust-shimano-spd-sl-spring-tension?ref=danielraffel.me" rel="noreferrer">Shimano website</a> provides further insights into the benefits of higher vs. lower tension settings:</p><blockquote><strong>Why Opt for More Spring Tension?&nbsp;</strong><br><br>A higher pedal spring tension will provide the most secure connection between your cleat and the&nbsp;<a href="https://ride.shimano.com/collections/pedals-spdsl?ref=danielraffel.me">SPD-SL</a>&nbsp;pedal. This ensures maximum power transfer and a "locked-in" feel for aggressive and explosive riding. Think riders looking to get every iota of efficiency out of their bikes as they seek ultimate speed.<br><br>Keep in mind a higher spring tension will require better ankle mobility and strength to turn the foot and disengage from the pedal. More experienced riders who have lots of practice getting in and out of clipless pedals will often opt for this firmer spring tension.<br><br>Ultimately, you’ll most likely find the higher spring tension settings in WorldTour road races and the fastest local group rides. 
This is where the pros and racers prize absolute connectivity and efficiency above all else, especially during explosive power outputs.&nbsp;<br><br><strong>Why Opt for Less Spring Tension?&nbsp;</strong><br><br>A lower SPD-SL spring tension still provides a connected feel between rider and bike but allows for easier disengagement with less effort. These settings are perfect for those who are new to clipless technology or who are gaining confidence with clipping in and out.<br><br>The reduced effort to clip out means riders with less strength or mobility in their ankles will appreciate a lighter spring setting, even those with years of riding experience. Ultimately, this lower SPD-SL spring tension is well-suited for riders whose pedaling dynamics remain consistent without great bursts of power.&nbsp;&nbsp;</blockquote><p>The <a href="https://si.shimano.com/en/pdfs/dm/PD0002/DM-PD0002-07-ENG.pdf?ref=danielraffel.me" rel="noreferrer">dealer's manual for Shimano pedals</a> includes a helpful graphic that clearly explains how the adjustment mechanism works.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/07/Screenshot-2024-07-25-at-10.43.23-AM.png" class="kg-image" alt="" loading="lazy" width="1076" height="1406" srcset="https://danielraffel.me/content/images/size/w600/2024/07/Screenshot-2024-07-25-at-10.43.23-AM.png 600w, https://danielraffel.me/content/images/size/w1000/2024/07/Screenshot-2024-07-25-at-10.43.23-AM.png 1000w, https://danielraffel.me/content/images/2024/07/Screenshot-2024-07-25-at-10.43.23-AM.png 1076w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">This applies to a large range of Shimano SPD-SL pedals: PD-9000/PD-6800/PD-5800/PD-5700-C/ PD-R550/PD-R540-LA.</span></figcaption></figure><p>Now that I'm up to date on pedal tension adjustments, I plan to follow <a 
href="https://bike.shimano.com/en-EU/technologies/apparel-accessories/footwear/shoe-fitting-cleat-setting.html?ref=danielraffel.me" rel="noreferrer">Shimano's tips for shoe fitting and cleat placement</a>.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Fix Garmin IQ Sync Issues on iOS ]]></title>
        <description><![CDATA[ I have a Garmin Edge 1030 Plus, which I use for cycling. I discovered that I can use a ConnectIQ app called EDGE MapFields to add significantly more data fields to the ClimbPro screen on my Garmin making it much more useful. While setting this up, I ran into sync issues that were tricky to resolve. ]]></description>
        <link>https://danielraffel.me/til/2024/07/24/garmin-iq-sync-issues-on-ios/</link>
        <guid isPermaLink="false">66a1519d8e6fb703b66400c3</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 24 Jul 2024 12:51:07 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/07/DALL-E-2024-07-24-12.49.17---A-modern-and-minimalist-illustration-depicting--Garmin-Issues--in-the-style-of-Paul-Rand.-The-design-features-bold--geometric-shapes-and-a-limited-col.png" medium="image"/>
<content:encoded><![CDATA[ <p>I have a Garmin Edge 1030 Plus, which I use for cycling. After <a href="https://www.youtube.com/watch?v=9XasUZUVn30&ref=danielraffel.me" rel="noreferrer">watching a video from Shane Miller</a>, I discovered that I can use a ConnectIQ app called <a href="https://apps.garmin.com/en-US/apps/30fdcbb5-a0f2-4832-9157-b7dcd81e7a0f?ref=danielraffel.me" rel="noreferrer">EDGE MapFields</a> to add significantly more data fields to the ClimbPro screen on my Garmin. This is quite handy because I also <a href="https://www.youtube.com/watch?v=1pAONWnjYOk&ref=danielraffel.me" rel="noreferrer">learned from another video of Shane's</a> that I can get wind and weather data on my Garmin using the <a href="https://apps.garmin.com/apps/a924715a-2d53-42a1-a8f9-e365bc849840?ref=danielraffel.me" rel="noreferrer">Windfield app</a>. Combining these apps provides a much more useful ClimbPro page with a wealth of data in one place. I am now able to view my power, cadence, heart rate, speed, and weather data on the ClimbPro page.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/IMG_2187.png" class="kg-image" alt="" loading="lazy" width="400" height="654"></figure><p>While setting this up, I encountered several issues. <strong>The ConnectIQ app on my iOS device wouldn't allow me to update a few third-party apps I had installed, leaving them stuck in my "Download Queue" no matter how many times I successfully synced.</strong></p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/IMG_2188.png" class="kg-image" alt="" loading="lazy" width="400" height="867"></figure><p>I eventually worked around the stuck ConnectIQ apps by unpairing my Edge device from both the Garmin app on iOS <strong>and</strong> my Bluetooth settings, then re-pairing the Edge device via the Garmin app. 
After completing the pairing process, I force-quit the ConnectIQ iOS app and was finally able to successfully sync and update the queued third-party apps using the Connect app.</p><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/IMG_2192.png" class="kg-image" alt="" loading="lazy" width="400" height="867"></figure><p>Before this, I had tried syncing on macOS using the Garmin Express app, even unpairing my Edge device, <a href="https://support.garmin.com/en-US/?faq=GiLICKFtvE1nV5uq9diX89&ref=danielraffel.me" rel="noreferrer">removing device data</a> from the Express "RegisteredDevices" folder, emptying the trash with the Edge device connected to my Mac, and then re-pairing, but that didn’t work. It appears that the issue is specific to the phone and needs to be addressed there.</p><p>Garmin software has significant room for improvement. Throughout this process I encountered 500 errors on their website, sync issues in the Express macOS software that they are aware of but haven’t fixed, and various other sync issues on their devices. When I chatted with support, they advised me to ignore many of these issues.</p><p>Anyway, I’ve got it working again. Time to get out and ride.</p><hr><p>Below are screenshots of the other errors I encountered. Despite this, <a href="https://connect.garmin.com/status/?ref=danielraffel.me" rel="noreferrer">Garmin System Status</a> indicates everything is functioning normally. 
This poor quality makes it challenging to recommend Garmin bike computers.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/07/IMG_4071.png" class="kg-image" alt="" loading="lazy" width="1254" height="1000" srcset="https://danielraffel.me/content/images/size/w600/2024/07/IMG_4071.png 600w, https://danielraffel.me/content/images/size/w1000/2024/07/IMG_4071.png 1000w, https://danielraffel.me/content/images/2024/07/IMG_4071.png 1254w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Was told to just "ignore" this because they know about it; the file doesn't exist anywhere.</span></figcaption></figure><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/IMG_6135.png" class="kg-image" alt="" loading="lazy" width="2000" height="1435" srcset="https://danielraffel.me/content/images/size/w600/2024/07/IMG_6135.png 600w, https://danielraffel.me/content/images/size/w1000/2024/07/IMG_6135.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/07/IMG_6135.png 1600w, https://danielraffel.me/content/images/2024/07/IMG_6135.png 2000w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/IMG_2191.jpeg" class="kg-image" alt="" loading="lazy" width="1179" height="2556" srcset="https://danielraffel.me/content/images/size/w600/2024/07/IMG_2191.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2024/07/IMG_2191.jpeg 1000w, https://danielraffel.me/content/images/2024/07/IMG_2191.jpeg 1179w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/07/image.png" class="kg-image" alt="" loading="lazy" width="2000" height="1435" srcset="https://danielraffel.me/content/images/size/w600/2024/07/image.png 600w, https://danielraffel.me/content/images/size/w1000/2024/07/image.png 
1000w, https://danielraffel.me/content/images/size/w1600/2024/07/image.png 1600w, https://danielraffel.me/content/images/2024/07/image.png 2000w" sizes="(min-width: 720px) 720px"></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Incognito is (Kinda) Broken in the YouTube App Store Builds ]]></title>
        <description><![CDATA[ I recently used the YouTube iPad app to watch a video and enabled incognito mode to avoid endless recommendations. To my surprise, the feature did not work as expected. ]]></description>
        <link>https://danielraffel.me/til/2024/07/15/incognito-mode-broken-on-youtube-ios-ipados/</link>
        <guid isPermaLink="false">668733ae7321ae03332f6ab9</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Mon, 15 Jul 2024 14:53:00 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/07/DALL-E-2024-07-15-14.44.06---A-stylized-image-in-the-manner-of-Paul-Rand--showcasing-a-broken-incognito-mode-on-YouTube-iOS_iPadOS.-The-image-features-the-YouTube-app-icon-with-a-.png" medium="image"/>
<content:encoded><![CDATA[ <p>I recently watched a video on my iPad using the <a href="https://apps.apple.com/us/app/youtube-watch-listen-stream/id544007664?ref=danielraffel.me" rel="noreferrer">YouTube</a> app and wanted to avoid getting related recommendations, so I briefly enabled <a href="https://support.google.com/youtube/answer/9209643?hl=en&co=GENIE.Platform%3DiOS&ref=danielraffel.me" rel="noreferrer">Incognito</a>. To my surprise, the option to disable Incognito had its text cut off in both landscape and portrait modes. Instead of displaying "Turn off incognito," it displayed "Turn o...incognito."</p><p>Today, I searched my YouTube history for a video I watched on my iPad and couldn't find it. I realized that even after tapping the "Turn o...incognito" button to disable Incognito, none of the videos I watched since were saved to my history. This suggests there may be two bugs:</p><ol><li>The call-to-action text to disable Incognito is truncated.</li><li>Once Incognito is disabled, the app continues to behave as if it's still in Incognito by not saving watched videos on that device to the user's history.</li></ol><p>Software bugs are common, and teams often ship unintentional mistakes. If you're experiencing this issue and found this via search, it's not you.</p><p>That doesn't change the fact that I am disappointed that one of the most popular and highest-grossing apps in the App Store hasn't properly implemented a privacy feature. It reflects a disregard for user experience and attention to detail. Given that I pay $23 per month for a YouTube family plan, this leaves a sour taste in my mouth.</p><p>I reported these issues via email to an iOS tech lead at YouTube, hoping they are aware and planning to fix them, as the second issue also occurs on iOS. 
As someone who has steadily reduced my reliance on Google products, I hope they make it a priority to deliver higher quality experiences to the Apple ecosystem one day.</p><p><strong>Update 7/16/2024:</strong> I think I have a better understanding of why my watch history on iOS/iPadOS isn't updating.<br><br>I'm using AdGuard Home, and it turns out that&nbsp;<a href="http://s.youtube.com/?ref=danielraffel.me">s.youtube.com</a>&nbsp;is being blocked by a filter. This blockage prevents YouTube's video history from updating on iOS devices. I confirmed this by disabling AdGuard, watching a YouTube video on the iPad app (making sure I wasn't in incognito mode), and seeing that the watch history updated as expected. When I re-enabled AdGuard, the videos I watched no longer appeared in the watch history.<br><br>Interestingly, watching a video on&nbsp;<a href="http://youtube.com/?ref=danielraffel.me">youtube.com</a>&nbsp;in my browser on macOS with AdGuard enabled updates the watch history correctly. It seems YouTube's API implementation for propagating watch history <em>may</em> differ across platforms, causing unexpected behavior for those who modify their DNS rules. Given that <a href="https://www.searchenginejournal.com/global-ad-blocker-trends/398133/?ref=danielraffel.me" rel="noreferrer">up to 27% of people</a> use some form of ad blocker, YouTube might have millions of impacted users. FWIW I suspect the % of ad block users is higher on desktop devices and significantly lower on mobile devices.<br><br>The truncated text on the iPad is still a bug. However, upon further investigation, it’s difficult to classify the watch history issue as a bug. Instead, the inconsistency in watch history propagation could&nbsp;<em>potentially</em>&nbsp;be improved by using a consistent API/domain across platforms, if that’s not already the case. 
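</p><p>In the meantime, if you also run AdGuard Home and want your watch history to sync, one workaround is to allowlist the blocked domain with an exception rule (standard adblock-style syntax, where <code>@@</code> marks an unblocking rule; in AdGuard Home this goes in the custom filtering rules, though menu names may vary by version):</p>

```
@@||s.youtube.com^
```

<p>After adding the rule, the same check as above applies: watch a video outside incognito and confirm it shows up in your watch history.</p><p>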
If the API and domain are indeed consistent across macOS/iOS, it might be worth YouTube investigating whether the API calls made when enabling and disabling incognito mode on iOS are performing unique actions that could be getting blocked by network filters, such as making a call to something like&nbsp;<a href="http://s.youtube.com/?ref=danielraffel.me">s.youtube.com</a>. Maybe the web behaves differently. I’ve shared this update with folks at YouTube and wanted to also reflect what I learned in this post in case others encounter perplexing watch history behavior on iOS.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://danielraffel.me/content/images/2024/07/IMG_1528.png" class="kg-image" alt="" loading="lazy" width="1668" height="2388" srcset="https://danielraffel.me/content/images/size/w600/2024/07/IMG_1528.png 600w, https://danielraffel.me/content/images/size/w1000/2024/07/IMG_1528.png 1000w, https://danielraffel.me/content/images/size/w1600/2024/07/IMG_1528.png 1600w, https://danielraffel.me/content/images/2024/07/IMG_1528.png 1668w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">Truncated text to turn off Incognito in the YouTube app on iPadOS</span></figcaption></figure> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Climate Change Impacts on Organic Farming ]]></title>
        <description><![CDATA[ I recently visited a renowned Amarone winery in Valpolicella, Veneto, in northeastern Italy. Hearing about their struggles with a fungus was the first time I truly considered the impact of climate change on agricultural practices and its implications for the future of organic certification. ]]></description>
        <link>https://danielraffel.me/2024/06/27/climate-change-impacts-on-organic-farming/</link>
        <guid isPermaLink="false">667db368e43ee7033bd6d15a</guid>
        <category><![CDATA[ 👀 Discovering ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Thu, 27 Jun 2024 12:51:52 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/06/IMG_0462-1.JPG" medium="image"/>
<content:encoded><![CDATA[ <figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/06/IMG_0462.JPG" class="kg-image" alt="" loading="lazy" width="2000" height="1500" srcset="https://danielraffel.me/content/images/size/w600/2024/06/IMG_0462.JPG 600w, https://danielraffel.me/content/images/size/w1000/2024/06/IMG_0462.JPG 1000w, https://danielraffel.me/content/images/size/w1600/2024/06/IMG_0462.JPG 1600w, https://danielraffel.me/content/images/size/w2400/2024/06/IMG_0462.JPG 2400w" sizes="(min-width: 720px) 720px"></figure><p>I recently visited a winery in Valpolicella, Veneto, in northeastern Italy, renowned for its <a href="https://en.wikipedia.org/wiki/Amarone?ref=danielraffel.me" rel="noreferrer">Amarone</a>. During the tour, I learned that the region has been experiencing significantly more rain in recent years. This increased rainfall has led to widespread outbreaks of the <a href="https://en.wikipedia.org/wiki/Plasmopara_viticola?ref=danielraffel.me" rel="noreferrer">Plasmopara viticola</a> fungus, which attacks grapevines' leaves and fruits, causing <a href="https://en.wikipedia.org/wiki/Downy_mildew?ref=danielraffel.me" rel="noreferrer">downy mildew</a>. This has devastated vines in many Italian regions and has led to a double-digit reduction in wine production nationwide in recent years.</p><p>To combat this fungus and protect their harvests, I was told that many organic wineries in the <a href="https://en.wikipedia.org/wiki/Valpolicella?ref=danielraffel.me" rel="noreferrer">Valpolicella</a> region have attempted to use organically certified copper fungicides. However, overuse of these fungicides has apparently resulted in copper accumulating in the soil, harming beneficial organisms and stressing the vines, which in turn affects grape quality and flavor. 
Additionally, overuse of copper has led to residues ending up in the wine when the grapes are pressed, rendering it unsuitable for consumption.</p><p>The winery I visited, while very supportive of organic and biodynamic winemaking, mentioned that after trying numerous organic techniques, they felt they had to switch to non-organic pest and fungus management strategies to mitigate these risks and maintain the health of their grapevines. They also noted that most of the region is moving away from organic farming and is unlikely to switch back.</p><p>Hearing this story was the first time I truly considered the impact of climate change on agricultural practices and its implications for the future of organic certification.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Poolside Reflections ]]></title>
        <description><![CDATA[ While writing this, I was sitting at a hotel pool 6,000 miles from home. It reminded me of advice I have given friends who are starting new companies: choose endeavors you&#39;d be excited to share with someone you just met at a hotel pool. ]]></description>
        <link>https://danielraffel.me/2024/06/05/poolside-reflections/</link>
        <guid isPermaLink="false">665742b29cb8df0336660d1a</guid>
        <category><![CDATA[ 🪩 Reflecting on ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Wed, 05 Jun 2024 03:20:28 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/06/IMG_9077-1.jpeg" medium="image"/>
        <content:encoded><![CDATA[ <figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/06/IMG_9077.jpeg" class="kg-image" alt="" loading="lazy" width="2000" height="1500" srcset="https://danielraffel.me/content/images/size/w600/2024/06/IMG_9077.jpeg 600w, https://danielraffel.me/content/images/size/w1000/2024/06/IMG_9077.jpeg 1000w, https://danielraffel.me/content/images/size/w1600/2024/06/IMG_9077.jpeg 1600w, https://danielraffel.me/content/images/2024/06/IMG_9077.jpeg 2000w" sizes="(min-width: 720px) 720px"></figure><p>While writing this, I was sitting at a hotel pool 6,000 miles from home. It reminded me of advice I have given friends who are starting new companies: choose endeavors you'd be excited to share with someone you just met at a hotel pool.</p><p>If you’re working on something you'd be enthusiastic to discuss with a stranger while on vacation, there's a good chance you’re oriented in the right direction.</p><p>Reflecting on when I have offered this advice, it's usually because I sense someone might be pursuing an idea that makes sense on paper and that they can sell, but doesn't truly resonate with them.</p><p>Being excited about your work enhances performance, positively influences outcomes, and inspires others. When you’re passionate about a vision you are committed to realizing, vacations can recharge your energy, but they can't fix a lack of genuine enthusiasm.</p> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ Apple’s Potential Strategy with OpenAI ]]></title>
        <description><![CDATA[ Like others, I&#39;m reading about the alleged Apple and OpenAI deal. Reflecting on Apple&#39;s past, particularly a rumored $1 billion deal with Nuance for a perpetual license to self-host and privately fork their speech technology for Siri&#39;s iOS launch, a pattern emerges. ]]></description>
        <link>https://danielraffel.me/2024/05/31/apples-potential-strategy-with-openai/</link>
        <guid isPermaLink="false">665a00059cb8df0336660d3b</guid>
        <category><![CDATA[ 🤔 Thinking about ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Fri, 31 May 2024 10:51:22 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/05/11EA3B4E-AEB9-4035-8417-1817C8557982.webp" medium="image"/>
<content:encoded><![CDATA[ <figure class="kg-card kg-image-card"><img src="https://danielraffel.me/content/images/2024/05/11EA3B4E-AEB9-4035-8417-1817C8557982-1.webp" class="kg-image" alt="" loading="lazy" width="1024" height="1024" srcset="https://danielraffel.me/content/images/size/w600/2024/05/11EA3B4E-AEB9-4035-8417-1817C8557982-1.webp 600w, https://danielraffel.me/content/images/size/w1000/2024/05/11EA3B4E-AEB9-4035-8417-1817C8557982-1.webp 1000w, https://danielraffel.me/content/images/2024/05/11EA3B4E-AEB9-4035-8417-1817C8557982-1.webp 1024w" sizes="(min-width: 720px) 720px"></figure><p>Like others, I'm reading about the alleged Apple and OpenAI deal. Reflecting on Apple's past, particularly a rumored $1 billion deal with Nuance for a perpetual license to self-host and privately fork their speech technology for Siri's iOS launch, a pattern emerges. When Apple needed to quickly catch up, they licensed and integrated advanced technologies. This strategy allowed them to provide a cutting-edge assistant with voice recognition and speech synthesis to iPhone users. Given this history, it wouldn't be surprising if Apple is taking a similar approach with OpenAI to offer cloud-based AI features. 
In the lead-up to WWDC announcements, it would be in character for Apple to have negotiated a perpetual license to self-host and fork as much OpenAI software as possible and to potentially license custom models.</p><p><strong>6/11/2024 Update:</strong> According to <a href="https://www.theinformation.com/articles/apple-eyes-deals-with-google-and-anthropic-after-openai-apple-grades-its-llms-while-databricks-grades-ai-usage?ref=danielraffel.me" rel="noreferrer">The Information</a> my original post seems to be wrong:</p><blockquote>The features OpenAI is powering for Apple are running exclusively on Microsoft's&nbsp;<strong>Azure</strong>&nbsp;cloud, so we aren't sure which of Apple’s AI features will run on Apple’s own "private cloud," which Federighi said would handle tasks requiring bigger Apple AI models than the ones that will reside on Apple devices themselves. Apple is staying mum about which features are going to be processed in those cloud servers, using Apple’s custom chips.&nbsp;</blockquote><blockquote>For OpenAI-powered features, Apple said the startup (and presumably&nbsp;<strong>Microsoft</strong>) won't have visibility into which Apple customer is making a request and won't be able to keep a record of the query either. So OpenAI may not be able to improve its tech from Apple customers’ queries the way it can from customers of ChatGPT.</blockquote> ]]></content:encoded>
    </item>
    <item>
        <title><![CDATA[ How to Configure a Travel eSIM with Apple Messages to Avoid Missing iMessages While Abroad ]]></title>
        <description><![CDATA[ When traveling internationally with my iPhone, I often purchase an eSIM for data. I finally figured out how to receive iMessages sent to my primary phone number while using an eSIM to avoid using my primary line for data. ]]></description>
        <link>https://danielraffel.me/til/2024/05/25/how-to-configure-a-travel-esim-with-apple-messages-to-avoid-missing-imessages-while-abroad/</link>
        <guid isPermaLink="false">66443bb5e865b50335d1c400</guid>
        <category><![CDATA[ 💡Today I Learned ]]></category>
        <dc:creator><![CDATA[ Daniel Raffel ]]></dc:creator>
        <pubDate>Sat, 25 May 2024 11:12:46 -0700</pubDate>
        <media:content url="https://danielraffel.me/content/images/2024/05/DALL-E-2024-05-20-14.40.59---A-simpler--vibrant-illustration-in-the-style-of-Paul-Rand--depicting-the-process-of-configuring-a-Travel-eSIM-on-an-iPhone-and-getting-Apple-Messages-.png" medium="image"/>
<content:encoded><![CDATA[ <p>When traveling internationally with my iPhone, I often purchase a third-party eSIM for data. In the past, before leaving the United States, I’ve made the mistake of disabling my primary line to avoid international cellular data overages. However, this has meant missing messages sent to my phone number while I'm away and facing a flood of them upon my return. I’ve finally figured out why this happens and found a way to avoid using my primary line for data while traveling abroad. I can also receive iMessages sent to my primary phone number while using an eSIM abroad.</p><p>My understanding is that disabling my primary line in my home country deactivates my phone number's registration with iMessage, and in my case, my phone number can only be re-registered with iMessage when I re-enable my primary line and reconnect to my carrier in my home country. The trick seems to be to keep my primary line active but disable data roaming, set the data plan to my travel eSIM, and disable cellular data switching.</p><p>To receive inbound messages in iMessage sent to your primary phone number and to make and receive phone calls over Wi-Fi while abroad using a data eSIM without incurring data overages on your primary line, follow these steps.</p><h2 id="before-leaving-home-country"><strong>Before Leaving Home Country</strong></h2><p>Keep your primary line enabled (<strong>DO NOT DISABLE IT</strong>); however…</p><ul><li>To prevent international cellular data use while abroad&nbsp;<strong>disable data roaming on your primary line in Settings &gt; Cellular &gt; Primary</strong></li><li>To prevent accidental cellular data use on your primary line while roaming abroad&nbsp;<strong>disable cellular data switching in Settings &gt; Cellular &gt; Cellular data</strong></li><li>To allow Wi-Fi calling on your primary line while abroad you must&nbsp;<strong>enable “Wi-Fi Calling on This iPhone” before leaving your home country in Settings &gt; Cellular 
&gt; Primary &gt; Wi-Fi Calling</strong></li><li>To use Wi-Fi Calling when you connect to Wi-Fi abroad using your primary line you must&nbsp;<strong>enable "Prefer Wi-Fi When Roaming" in Settings &gt; Cellular &gt; Primary &gt; Wi-Fi Calling</strong></li><li>To prevent accidental costs abroad from sending SMS messages&nbsp;<strong>disable SMS in Settings &gt; Messages by turning off 'Send as SMS'</strong></li></ul><h2 id="upon-arriving-in-new-country"><strong>Upon Arriving in New Country</strong></h2><ul><li>Enable your new eSIM (by scanning a QR code, etc.); this often requires being on Wi-Fi</li><li>To use your international cellular data plan&nbsp;<strong>enable cellular data access in Settings &gt; Cellular Data &gt; select the name of your Travel eSIM</strong> (your primary network must NOT be selected)</li><li>To use data roaming on your international data plan&nbsp;<strong>enable Data Roaming in Settings &gt; Cellular &gt; select the name of your Travel eSIM</strong></li></ul><h2 id="upon-returning-home"><strong>Upon Returning Home</strong></h2><ul><li>Enable&nbsp;<strong>data roaming on your primary line in Settings &gt; Cellular &gt; Primary</strong></li><li>Enable&nbsp;<strong>SMS sending in Settings &gt; Messages by turning on 'Send as SMS'</strong></li><li><strong>Delete the secondary line in Settings &gt; Cellular &gt; the name of your Travel eSIM</strong></li></ul> ]]></content:encoded>
    </item>

</channel>
</rss>