go back to the home page

clover’s log

these are like mini blog posts, but more generally just what the heck i’m up to.

2026-03-20

tags: react markdown

i published a new library named @clo/react-markdown. originally, i wanted to have my own version of the streamdown package from vercel, but i didn’t realize how much of a loser company they all are. i wasn’t even trying and i made a library like a hundred times better and simpler than theirs. because of the awesome success here, instead of calling mine “memo markdown”, i just said “yea, this covers every markdown use case for react” and called it React Markdown.

you can install it from the JSR:

npx jsr add @clo/react-markdown
pnpm add jsr:@clo/react-markdown

there are two main features that i deliver on:

copying some architecture notes from the readme, the memoizer gets its performance from the following tricks:

2026-03-18

tags: progress.ts

laser hair removal is awesome btw. organizing things at home slowly. more work on the progress library, trying to handle every edge case possible for the log widget system.

when it is done, a code snippet like this will work.

using node = progress.start("some action");
for await (const token of stream) {
  // stderr writes are intercepted so they don't clobber the progress widget
  process.stderr.write(token);
}

the log system injects into process to ensure it plays nice, but there is only a fast path on stdout (since the main log messages use stdout). stderr gets to use the crazy getDrawLock API i’m cooking up internally.
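the injection idea, roughly: wrap a stream’s write so every chunk passes through a lock first. a minimal sketch, where withDrawLock is a hypothetical stand-in for the real (internal, unreleased) getDrawLock API:

```typescript
// hypothetical sketch of stream injection; `withDrawLock` stands in for the
// real (internal) getDrawLock API.
type WriteFn = (chunk: string | Uint8Array) => boolean;

function injectWrite(
  stream: { write: WriteFn },
  withDrawLock: (fn: () => void) => void,
): () => void {
  const original = stream.write.bind(stream);
  stream.write = (chunk) => {
    // hold the draw lock while the chunk is written, so progress redraws
    // never interleave with the output
    withDrawLock(() => original(chunk));
    return true;
  };
  // return a function that restores the original write
  return () => {
    stream.write = original;
  };
}
```

the restore function matters for playing nice: anything else that saved a reference to the patched write keeps working, and un-injecting is a single call.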

after all of this works and is reliable, there are some more things i have to tidy, but then the progress blog post can be written and reviewed for realsies.

2026-03-16

just chilling. at work i replaced bun install with pnpm. pretty peak. in the night, i worked on this page, and worked a bit on [hexiflare]'s local llm system.

2026-03-15

tags: clover creative control

this is sort of the new version of what i’ve called “creative toolkit.” it’s now a TypeScript library that controls REAPER and the DaVinci Resolve Speed Editor. the two layers are the library layer and the configuration layer. the config layer is meant to be trivial, editable, and instantly reloadable at a moment’s notice.

export default config.forApp("com.cockos.reaper", ({ speededitor: se, mac }) => {
  const reaper = new Reaper();

  // light up the cut-mode LEDs while REAPER is recording
  reaper.on("transport", (transport) => {
    se.leds.cut = transport.recording;
    se.leds.dissolve = transport.recording;
    se.leds.smoothCut = transport.recording;
  });

  // speed editor key bindings
  se.onPress("stopPlay", () => {
    if (reaper.transport.recording) {
      reaper.runAction("transport-stop-save-all-recorded-media");
    } else {
      reaper.runAction("transport-play-stop");
    }
  });
  se.onPress("cut", () => {
    if (reaper.transport.recording) {
      reaper.runAction("transport-stop-delete-all-recorded-media");
      reaper.runAction("transport-record");
    } else {
      reaper.runAction("transport-record");
    }
  });
  se.onPress("dissolve", () => {
    reaper.runAction("transport-stop-delete-all-recorded-media");
  });
  se.onPress("smoothCut", () => {
    if (reaper.transport.recording) {
      reaper.runAction("transport-stop-save-all-recorded-media");
      reaper.runAction("transport-record");
    }
  });
  se.onPress("videoOnly", () => {
    if (mac.window?.title.startsWith("FX:") && mac.mainWindow) {
      mac.focusMainWindow();
      return;
    }
    reaper.runAction("track-view-fx-chain-for-current-last-touched-track");
  });
});

the winning workflow here for recording audio has been the bottom three keys being mapped to

this is better than just having keys on the main keyboard: no modifiers! and i don’t lose out on any existing keybinds.

i used to have a setup like this with multiple external keyboards, but i am keeping that on hold until i run out of macro keys. what’s also interesting to note is each app gets its own layer of bindings.

2026-03-14

tags: [album]

i’ve made some beautiful progress on the album, i finished another song. it is one of the best creative works i’ve ever done. i’m personally very happy with my singing and vocal editing skills improving. will be holding it captive except in some small circles (who have all said it is amazing).

2026-03-11

tags: react mutation

my coworker was working on something that needed a feature for react mutation. i also needed his approval on some code i was working on, but he wouldn’t give me any attention. so as a bribe, i released a new patch of the library. my PR was reviewed, and everything was great. the changelog

2026-03-10

tags: [hexiflare]

with the hexi bot eating my entire at-work anthropic subscription with just two users, i was messing with getting hexi to work as much on local models as possible. the results are very promising, but it is very much becoming a manually designed and written project, which is good (slop is bad). but that does mean i can’t spend a lot of time on it, as i have other desires.

anyways, the local model chat system works by passing messages through a qwen model with an insane “personality” prompt, and then through another qwen model with an equally insane “review” prompt. the results are pretty good.

you> what u been up to?
hexi> not much lol
hexi> messing w/ some css animations rn breaking them is kinda fun though
you> omg can i see?
hexi> hold on lemme whip up a lil demo page real quick
[triggers sandboxed agent task]

benefit of doing it this way instead of through a stupid wrapper: the latency can be brought down a ton. but the full system isn’t wired up, and i frankly don’t trust anyone but myself to do this task.
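the two-pass shape, as a rough sketch. callModel here is a hypothetical stand-in for whatever local inference call is actually used, and the prompt plumbing is illustrative, not the real hexi code:

```typescript
// hypothetical sketch of the two-pass local chat pipeline; `callModel`
// stands in for the real local inference call.
type CallModel = (systemPrompt: string, input: string) => Promise<string>;

async function chatTurn(
  callModel: CallModel,
  personalityPrompt: string,
  reviewPrompt: string,
  userMessage: string,
): Promise<string> {
  // pass 1: the "personality" model drafts an in-character reply
  const draft = await callModel(personalityPrompt, userMessage);
  // pass 2: the "review" model sees both sides and rewrites the draft
  // before anything is sent to the user
  return callModel(
    reviewPrompt,
    `user: ${userMessage}\ndraft reply: ${draft}`,
  );
}
```

since both passes are plain local calls with no wrapper process in between, the second pass adds one model invocation of latency rather than a whole agent round-trip.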

2026-03-05

tags: [work], [hexiflare]

i wrote a beautiful dialog component for use at work and in the hexiflare template. the usage is like this:

showTextDialog({
  title: "Hello hiii heelloooo.",
  description: "Enter something important",
  onConfirm: async (reason) => {
    // handle the submitted text here
  },
  onCancel: "close",
});

it’s certainly better than copying the Radix Dialog everywhere. that component is a great primitive but it isn’t nice to spam it.
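one way such an imperative show* API can sit on top of a component primitive (not necessarily how mine works, and the option shape here is illustrative) is a module-level subscription that a single mounted host component listens to:

```typescript
// hypothetical sketch: an imperative show* API backed by one subscribed host
// component. names and option shape are illustrative, not the real library.
interface TextDialogOptions {
  title: string;
  description: string;
  onConfirm: (value: string) => void | Promise<void>;
  onCancel?: "close" | (() => void);
}

type Listener = (opts: TextDialogOptions) => void;
let listener: Listener | null = null;

// called once by the host component that actually renders the dialog
export function subscribeTextDialog(fn: Listener): () => void {
  listener = fn;
  return () => {
    listener = null;
  };
}

// the imperative entry point used everywhere else in the app
export function showTextDialog(opts: TextDialogOptions): void {
  if (!listener) throw new Error("no dialog host mounted");
  listener(opts);
}
```

the payoff is exactly the one described above: the primitive gets rendered in one place, and every call site just calls a function.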

2026-02-25

tags: [hexiflare]

i was starting to work on an ai project, a chatbot that is absurdly tuned to the format of texting. in addition to this, it has full access to a (sandboxed) computing environment where it can run any command, use a browser, and deploy web apps.

this whole thing is based on a friend’s project, named “clungus”, which was literally just a wrapper around claude code. his could deploy static websites, which was really cool since you could just ask it for some little thing and it would just make it. and you send your bug reports as text messages.

in both bots, the interface is much more natural than using claude code, chatgpt, or another application. and with mine particularly, i was working on making the base app template as high quality as possible, so that the generated apps start from something solid. so at a minimum, we now have one of the best tanstack start templates ever. at best, i might be able to automate some apps i never would’ve cared to write manually.

2026-02-18

tags: this site, markodown

i finally closed issue #1 on my website, eliminating the MDX compiler in its entirety.

i don’t think MDX is a great format to work with. and on top of that, it adds 111 transitive dependencies to my website. and for what? a middling markdown language. markdown is inherently concise, so pairing it with Marko felt like it would be a nicer format to work in. the answer: it is! i made “markodown” using rust and a bit of llm slop, then used it a ton to ensure a good api. the result is beautiful.

my favorite part of this language is automatic header tracking being built in. alongside Marko’s id shorthand, i can type something like this:

<h2#technical-review>A Technical Review: What are Server Components?</>

and the compiler will track this properly, emitting it as <Heading level=2 id="technical-review"> and including it in the generated table of contents. i can then include the table of contents with a <table-of-contents /> element.
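the tracking itself is simple to picture: collect (level, id, text) triples while compiling, then nest them by level. a generic sketch of that nesting step (not markodown’s actual code):

```typescript
// generic sketch of nesting a flat list of tracked headings into a tree;
// not markodown's actual implementation.
interface Heading {
  level: number;
  id: string;
  text: string;
}

interface TocNode extends Heading {
  children: TocNode[];
}

function buildToc(headings: Heading[]): TocNode[] {
  const root: TocNode[] = [];
  // stack of the most recent node at each depth
  const stack: TocNode[] = [];
  for (const h of headings) {
    const node: TocNode = { ...h, children: [] };
    // pop until the top of the stack is a strictly shallower heading
    while (stack.length > 0 && stack[stack.length - 1].level >= h.level) {
      stack.pop();
    }
    if (stack.length === 0) {
      root.push(node);
    } else {
      stack[stack.length - 1].children.push(node);
    }
    stack.push(node);
  }
  return root;
}
```

a single pass over the document in source order is enough, which is why doing it in the compiler is such a natural fit.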

a little bit tough because i did spend a whole week on it. but it’s done, and i love it. after this port, i also axed some other dependencies in favor of lighter alternatives:

and the biggest of all, deprecating my HTML rendering framework, in favor of using Marko. this is technically a complexity increase, except for the part where i already supported Marko.

2026-02-08

tags: HOTEWIG reanimated

i am on the team of people organizing “history of the entire world, i guess - reanimated.” not much news yet, but it will be exciting. please fill out the form if you are interested, and please send it to anyone you may think would be interested.

2026-02-06

tags: this site

spent yesterday and today catching up with this page and preparing to get writing heavy on the blog. i also fixed a bug on the video player where it wouldn’t work on latest chrome on non-av1-accelerated hardware.

there were two bugs:

since i was testing windows 7, i decided to also fix some IE9 bugs. the page works besides the video player and the fonts, which are all things i believe i could easily fix, so i opened new issues for those tasks.

2026-02-04

tags: this site

i did a ton of random stuff to this site, including the RSS feed and random bug fixes on the domain. trying to reduce the issue count on my forgejo.

2026-02-03

tags: history of japan reanimated

immediately after the youtube premiere, i worked on my webpage of it. the live credits were a really fun touch that really completes things. i’m very happy with the final result.

i also had to do a ton of work getting the video player to properly show thumbnails. the biggest part was adding even more hell to the dash-av1 player. an issue that dash.js has is providing the first frame of media before the user presses “play.” this doesn’t happen with the hls or native players. so my workaround… delay attachment of the view.

player.initialize(null, dashFile, false);
player.updateSettings({
  streaming: { cacheInitSegments: true },
});
player.preload();

// attach dash.js to the <video> element only once playback actually starts
video.addEventListener("play", function play() {
  video.removeEventListener("play", play);

  enableSafariAirplay();
  player.attachView(video);
  onCloverVideoInit?.(id, video);
});
video.controls = true;

but on chrome you get cooked because a video without a source is not playable. so a second hack is used to attach a fake source.

const duration = new Promise<number>((resolve) => {
  player.on(dashjs.MediaPlayer.events.MANIFEST_LOADED, (e) => {
    const duration = e.data.mediaPresentationDuration;
    resolve(duration); 
  });
});
mediaSource = new MediaSource();
mediaSource.addEventListener("sourceopen", () => {
  duration.then((duration) => {
    mediaSource = mediaSource!; // re-narrow the captured variable for typescript
    mediaSource.duration = duration;
    const sb = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
    const oneFrame = Uint8Array.from(
      atob("GkXfo59ChoEBQveBAULygQRC84EIQoKEd2VibUKHgQJChYECGFOAZwEAAAAAAAITEU2bdLpNu4tTq4QVSalmU6yBoU27i1OrhBZUrmtTrIHWTbuMU6uEElTDZ1OsggEjTbuMU6uEHFO7a1OsggH97AEAAAAAAABZ" + "A".repeat(119) + "VSalmsCrXsYMPQkBNgIxMYXZmNjIuMy4xMDBXQYxMYXZmNjIuMy4xMDBEiYhAXgAAAAAAABZUrmvIrgEAAAAAAAA/14EBc8WIDV7YG1KxA/CcgQAitZyDdW5kiIEAhoVWX1ZQOIOBASPjg4QCYloA4JCwgQG6gQGagQJVsIRVuYEBElTDZ/tzc59jwIBnyJlFo4dFTkNPREVSRIeMTGF2ZjYyLjMuMTAwc3PWY8CLY8WIDV7YG1KxA/BnyKFFo4dFTkNPREVSRIeUTGF2YzYyLjExLjEwMCBsaWJ2cHhnyKFFo4hEVVJBVElPTkSHkzAwOjAwOjAwLjEyMDAwMDAwMAAfQ7Z11eeBAKOigQAAgBACAJ0BKgEAAQALxwiFhYiFhIg/ggAMDWAA/ua1AKOVgQAoALEBAC8R/AAYABhYL/QAJAAAo5WBAFAAsQEALxH8ABgAGFgv9AAkAAAcU7trkbuPs4EAt4r3gQHxggGj8IED"),
      (x) => x.charCodeAt(0),
    );
    sb.appendBuffer(oneFrame);
  });
});
objectUrl = URL.createObjectURL(mediaSource);

mainSource = document.createElement("source");
mainSource.src = objectUrl;
video.appendChild(mainSource);

the payload within contains a single webm vp8 frame, one pixel by one pixel. this is enough to get chrome to shut up. now the experience is nearly perfect, with browser support all the way from the ancient firefox version on my old laptop to modern Apple Silicon Macs.

2026-01-31

tags: history of japan reanimated

the main section of animation for the project was already done, but i wanted to make some needed adjustments to my scene. so i had been doing those for a few days, as well as animating an extra 25 seconds for the outro section. it was a very fun process, since i actually grabbed an old macbook i had in storage to take the textedit screenshot. more details on this entire process will be on the project page.

2026-01-30

tags: react mutation

at work, one of my coworkers was complaining about my helper functions for TanStack Query’s mutation system. the conclusion was that the mutation system we were building on was flawed and not enjoyable to use. so i spent a few days making an alternative library. it’s on the JSR: @clo/react-mutation.

this project was really fun because of how much the API surface changed as i started staging the library into our actual code. the experience has shaped how i want to go about my upcoming blog post for the better.

2026-01-25

tags: ts lie detector, todo tracker

added a trivial binary to the lie detector, tsld-node, which is like ts-node or tsx but it runs with the lie detector.

for todo tracker, it’s been vibe coded to the point where it gets through all commits of the sitegen repo, but there are still some issues with missing TODOs. i haven’t really had time to prioritize this: even with an ai agent writing most of it, i still have to review its progress, so it’s just not worth my time until other projects pull through.

2026-01-24

tags: git, home infra

i moved forgejo off of sqlite and onto postgres. the only motivation behind this was to easily backdate the repositories i had imported: name paint bot and a scripting language compiler i wrote. i was more comfortable doing this on a real database server than just editing the file.

the migration took a bit for me to figure out, but last year someone named Sven did the same thing and found this pgloader command with the critical data only clause.

echo "LOAD DATABASE
FROM sqlite:///root/forgejo.db
INTO postgresql://forgejo:$POSTGRES_PASSWORD_FORGEJO@postgres/forgejo
WITH data only, reset sequences, prefetch rows = 10000
SET work_mem TO '16MB', maintenance_work_mem TO '512MB';" > ./pgloader-command

pgloader ./pgloader-command

2026-01-18

tags: git, home infra

i finally set up system-integrated ssh push, but with a twist.

the first problem is having two ssh servers on the same machine: one for the host, and another for Forgejo. this means one of them had to live on another port, so i chose to move the host’s. but it kind of just sucks to use. the proper solution is a custom config that directs the git user to the right place.

what made this harder is that i actually have two git instances; the second one is temporary infrastructure i’m running for evil inc until the organization gets dedicated hardware (more about this in a future post). so even if i routed git specially, it still wouldn’t know which git server to go to.

the solution: write a custom “router” script that generates an authorized_keys file based on which git instances actually have the key, then the line within authorized_keys forces a special wrapper that intercepts the targeted git repository and routes it to the git instance that has it.

the sshd config looks like this:

Match User git
    AuthorizedKeysCommand /bin/python /mnt/storage1/apps/home-infra/config/forgejo/ssh/keys.py %u %t %k
    AuthorizedKeysCommandUser git

it is convenient because AuthorizedKeysCommand is run on every connection. it produces a file with zero or one valid keys:

command="/mnt/storage1/apps/home-infra/config/forgejo/ssh/route.sh --clover 1 --evil 2",no-port-forwarding,...,restrict ssh-ed25519 AAAAC3NzaC...

and then the route script does this shit:

# $repo comes from SSH_ORIGINAL_COMMAND and $clover_id/$evil_id from the
# script arguments (parsing not shown)
if [ -d "/mnt/storage1/apps/forgejo/git/repositories/${repo}" ]; then
  [ -z "$clover_id" ] && echo "ssh key is not configured for repository on git.paperclover.net" >&2 && exit 1
  exec sudo docker exec -i -u git forgejo /usr/bin/env SSH_ORIGINAL_COMMAND="$SSH_ORIGINAL_COMMAND" /usr/local/bin/forgejo --config=/custom/conf/app.ini serv key-"$clover_id"
elif [ -d "/mnt/storage1/apps/evil-infra/forgejo/git/repositories/${repo}" ]; then
  [ -z "$evil_id" ] && echo "ssh key is not configured for repository on git.evil.inc" >&2 && exit 1
  exec sudo docker exec -i -u git evil-forgejo /usr/bin/env SSH_ORIGINAL_COMMAND="$SSH_ORIGINAL_COMMAND" /usr/local/bin/forgejo --config=/custom/conf/app.ini serv key-"$evil_id"
else
  echo "repo not found" >&2
  exit 1
fi

it works beautifully. see the whole patch for more info.

2026-01-14

tags: next.js blog post, this site

general housekeeping. getting this activity page up. getting the spanish translation by my good friend trubiso up. things are looking really cozy.

2026-01-12

tags: [album]

i started more music work. i’ve gotten better at lyric writing, phrasing this new song as a sort of “adventure”. felt for one of the first times that i was doing worldbuilding in a song. the imagery is that good.

2026-01-11

tags: progress.ts, todo tracker

i finished streaming io on progress.ts. very proud of it. my git commits describe the tech better than i could by reiterating.

feat(lib/progress): implement streaming wire protocol

resolves #47

encodeByteStream converts these events into a ReadableStream. by batching events together, the stream contents remain small; the code that constructs progress nodes does not have to worry about calling many setters at once, since they get debounced by the serializer. stream backpressure causes larger time-gaps to be batched together (making the stream smaller). this enables servers to respond with rich progress.

const root = new progress.Root();
doActionWithProgress(root).then(root.end, root.error);

if (req.headers.get("Accept")?.includes(progress.contentType))
  return new Response(progress.encodeByteStream(root), {
    headers: { 'Content-Type': progress.contentType },
  });

return Response.json(await root.asPromise());

and decodeByteStream on the client:

const output = document.getElementById("output");
const res = await fetch(...);
if (!res.ok) throw ...;
const root = new progress.Root();
root.on("change", (active) => {
  output.innerText = ansi.strip(progress.formatAnsi(
    performance.now(),
    active,
  ));
});
const result = await progress.decodeByteStream(res.body, root);
output.innerText = JSON.stringify(result);

there are currently no document bindings, but i plan to add them. additionally, a React hook would be very trivial to implement for this – but that is unplanned for this repository. for transports that require JSON or UTF-8, there is encodeEventStream, which returns a ReadableStream of JSON objects that can be compressed at the developer’s discretion.
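the debouncing idea from the first commit, in isolation: setter calls that land close together get flushed as one batch. a rough sketch (none of these names are the real progress.ts internals):

```typescript
// rough sketch of the event-batching/debouncing idea; names are illustrative,
// not the real progress.ts internals.
type Event = { node: number; field: string; value: unknown };

class EventBatcher {
  private pending: Event[] = [];
  private scheduled = false;

  constructor(private flush: (batch: Event[]) => void) {}

  push(event: Event): void {
    this.pending.push(event);
    if (this.scheduled) return;
    this.scheduled = true;
    // all events pushed before the microtask runs land in one batch, so many
    // setter calls in a row produce a single flush (one stream chunk)
    queueMicrotask(() => {
      this.scheduled = false;
      const batch = this.pending;
      this.pending = [];
      this.flush(batch);
    });
  }
}
```

in a real serializer the flush callback would encode the batch and enqueue it onto the ReadableStream; under backpressure the flush simply happens later, which is exactly what makes the batches larger.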

feat(lib/progress): headless rendering + time estimation

node signaling is done by providing a progress.Root to every node and dispatching events to it when the node changes. the root is connected to an observer to construct a UI out of it. there are two apis planned:

additionally, resolves #33 by implementing estimatedTime

i also did a large part of the work to create a “code todo tracking” tool. i would say it’s about half done, since the second half is simply fixing all of the little bugs. most of this code is currently ai-generated, with me manually coming in to write the interfaces and the modular program architecture; then i synthesize the code and the tests. this was basically just going on ambiently while progress.ts was in progress.

2026-01-09

tags: home infra

finished the SSO sub-project. i’m happy with the setup i used to protect internal services, such as pgadmin and qbittorrent. it’s a caddy snippet that i can re-use very easily.

(reverse_proxy_auth) {
  handle /snow.oauth2/* {
    reverse_proxy "http://forward-auth" {
      header_up X-Real-IP {remote_host}
      header_up X-Forwarded-Uri {uri}
    }
  }
  handle {
    forward_auth "http://forward-auth" {
      uri /snow.oauth2/auth
      header_up X-Real-IP {remote_host}
      @error status 401
      handle_response @error {
        redir * /snow.oauth2/sign_in?rd={scheme}://{host}{uri}
      }
      @valid_group header X-Auth-Request-Groups *role:{args[1]}*
      handle_response @valid_group {
        method  {method}
        rewrite {uri}
        reverse_proxy {args[0]} {
          header_up Cookie ([^;]*?)\s*_oauth2_proxy_\d=[^;]*(;?.*) "$1$2"
          {block}
        }
      }
      handle_response {
        rewrite /403.html
        file_server {
          status 403
          root /etc/caddy
        }
      }
    }
  }
}

# usage
pg.{$HOME_DOMAIN} {
  import reverse_proxy_auth "http://pgadmin" admin
}
qbt.{$HOME_DOMAIN} {
  import reverse_proxy_auth "http://qbittorrent" media-manage
}

2026-01-04

tags: home infra

working on SSO for my internal services. for context, i have about 12 self-hosted services running, half of which i allow my friends to access. currently, this is done by manually creating an account on each service (jellyfin, forgejo), but many are handled through a caddy rule. in the interest of making my password manager less confused (ip vs domain, subdomain, etc), i’m slowly reducing this setup to a single sign-in page.

to do this, i am using keycloak, which supports openid connect (how i will configure forgejo and jellyfin), as well as a separate service to provide forward-auth proxying (how i protect services like copyparty, syncthing, pgadmin, and many more). i tried authelia beforehand, but i really do not recommend it due to how hard it is to configure, how annoying passkeys are to set up, and its limited themes. i also don’t recommend authentik, but i couldn’t even figure out how to start using it after i installed it.

keycloak is a bit stupid on config. since all the config lives in the postgres database, i can’t use a config file to set up the primary realm. so instead, i have this huge python script that uses the API to upsert the configuration. this works pretty well, and means that when running the infrastructure locally for testing, i can get the config to be the same (useful if you brick keycloak, which is pretty easy to do).
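the upsert pattern itself is simple: fetch the resource, create it if missing, otherwise overwrite it. here is a typescript sketch of that idea (the real script is python); the endpoint shape roughly follows keycloak’s admin REST API, but treat the details as an assumption:

```typescript
// sketch of the config-upsert idea (the real script is python); the endpoint
// shape follows keycloak's admin REST API, but treat details as an assumption.
// minimal fetch shape so the sketch is self-contained; the real global fetch
// conforms to it.
type Fetch = (url: string, init?: {
  method?: string;
  headers?: Record<string, string>;
  body?: string;
}) => Promise<{ status: number }>;

async function upsertRealm(
  fetchFn: Fetch,
  baseUrl: string,
  token: string,
  realm: { realm: string; [key: string]: unknown },
): Promise<"created" | "updated"> {
  const headers = {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
  };
  const existing = await fetchFn(`${baseUrl}/admin/realms/${realm.realm}`, { headers });
  if (existing.status === 404) {
    // realm does not exist yet: create it
    await fetchFn(`${baseUrl}/admin/realms`, {
      method: "POST",
      headers,
      body: JSON.stringify(realm),
    });
    return "created";
  }
  // realm exists: overwrite it with the desired configuration
  await fetchFn(`${baseUrl}/admin/realms/${realm.realm}`, {
    method: "PUT",
    headers,
    body: JSON.stringify(realm),
  });
  return "updated";
}
```

because every run converges on the same desired state, the same script doubles as a recovery tool when keycloak gets bricked.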

2026-01-02

tags: home infra, git, name paint bot show

i deleted all my github repositories except four: my “readme”, a bug reproduction repo, the mirror for ts lie detector, and a shared private repo with someone that is load bearing. in this process, i’ve moved all the projects to my forgejo instance.

with this, name paint bot, one of my few remaining projects that is still active, moves to that forgejo instance using their github migrator. some of my private projects, like my pet scripting language, were migrated as well. it feels more alive on my site because of the theming and per-repo icons.

after a year of forgejo, i am really happy with how it treats me.