these are like mini blog posts, but more generally just what the heck i'm up to.
2026-02-06
spent yesterday and today catching up with this page and preparing to get
writing-heavy on the blog. i also fixed a bug in the video player where it
wouldn't work on latest chrome on non-av1-accelerated hardware.
there were two bugs:
chrome without av1 and version >=142 would try to play the hls stream
natively. they made video.canPlayType("application/x-mpegURL") return
"maybe", which is total propaganda because chrome does not actually support
this type of stream. it supports some insane subset that doesn't cover
real-world use cases.
the hls polyfill was never being used because of a bundling mistake, so those
users would get the unoptimized original file.
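the fix boils down to never trusting chromium's bare "maybe" for hls. a minimal sketch of that decision, with a hypothetical helper name (not the actual player code):

```typescript
// sketch only: shouldUseHlsPolyfill is a made-up name for illustration.
// the rule: never trust a bare "maybe" from canPlayType for hls on
// chromium, since its native support only covers a narrow subset of
// real-world streams.
function shouldUseHlsPolyfill(
  canPlayHls: string, // result of video.canPlayType("application/x-mpegURL")
  isChromium: boolean,
): boolean {
  // safari answers "maybe"/"probably" and genuinely plays hls natively
  if (canPlayHls !== "" && !isChromium) return false;
  // chromium >= 142 also answers "maybe" but can't handle these streams,
  // so it always goes through the polyfill
  return true;
}
```

with this, the polyfill path gets taken on chromium regardless of what canPlayType claims, while safari keeps its native playback.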
since i was testing on windows 7, i decided to also fix some IE9 bugs. the
page works besides the video player and the fonts, which are all things i
believe i could easily fix, so i opened new issues for those tasks.
immediately after the youtube premiere, i worked on my webpage for it. the
live credits were a really fun touch that really completes things. i'm very
happy with the final result.
i also had to do a ton of work to get the video player to properly show
thumbnails. the biggest fix was adding even more hell to the dash-av1
player. an issue that dash.js has is with providing the first frame of
media before the user presses "play." this doesn't happen with the hls or
native players. so my workaround... delay attachment of the view.
player.initialize(null, dashFile, false);
player.updateSettings({
  streaming: { cacheInitSegments: true },
});
player.preload();
video.addEventListener("play", function play() {
  video.removeEventListener("play", play);
  enableSafariAirplay(); // <-- another important hack
  player.attachView(video);
  onCloverVideoInit?.(id, video);
});
video.controls = true;
but on chrome you get cooked because a video without a source is not playable.
so a second hack is used to attach a fake source.
the payload within contains a single webm vp8 frame, one pixel by one pixel.
this is enough to get chrome to shut up. now the experience is nearly perfect,
with browser support all the way from the ancient firefox version on my old
laptop to modern Apple Silicon Macs.
the main section of animation for the project was already done, but i wanted
to make some needed adjustments to my scene. so i had been doing those for a
few days, as well as animating an extra 25 seconds for the outro section.
that process was very fun, since i actually grabbed an old macbook i had in
storage to take the textedit screenshot. more details on this entire process
will be on the project page.
at work, one of my coworkers was complaining about my helper functions for
TanStack Query's mutation system. the conclusion was that the mutation system
we were building on was flawed and not enjoyable to use. so i spent a few days
making an alternative library. it's on the JSR: @clo/react-mutation.
this project was really fun because of how much the API surface changed as i
started staging the library into our actual code. the experience has shaped how
i want to go about my upcoming blog post for the better.
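for flavor, here is a minimal sketch of the kind of state machine any such mutation helper has to manage underneath. this is not the actual @clo/react-mutation api; all names here are made up:

```typescript
// hypothetical sketch, not the @clo/react-mutation api: the core of a
// mutation helper is a tiny state machine driven by a single async call.
type MutationState<T> =
  | { status: "idle" }
  | { status: "pending" }
  | { status: "success"; data: T }
  | { status: "error"; error: unknown };

// run the mutation and report every state transition to the observer,
// which in a react wrapper would be a setState call
async function runMutation<A, T>(
  fn: (arg: A) => Promise<T>,
  arg: A,
  onChange: (state: MutationState<T>) => void,
): Promise<void> {
  onChange({ status: "pending" });
  try {
    onChange({ status: "success", data: await fn(arg) });
  } catch (error) {
    onChange({ status: "error", error });
  }
}
```

the real library's surface is richer than this, but the pending/success/error shape is the part every design iteration kept.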
added a trivial binary to the lie detector, tsld-node, which is like ts-node
or tsx but runs with the lie detector.
for todo tracker, it's been vibe coded to the point where the sitegen repo gets
through all commits, but there are still some issues with missing TODOs. i
haven't really had time to prioritize this; even with an ai agent writing most
of it, i still have to review its progress, so it's just not worth my time
until other projects pull through.
i moved forgejo off of sqlite and onto postgres. the only motivation behind
this was to easily backdate the repositories i had imported: name paint bot
and a scripting language compiler i wrote. i was more comfortable doing this
on a real database server than just editing the file.
the migration took a bit for me to figure out, but last year someone named
Sven did the same thing and found this pgloader command with the critical
"data only" clause.
echo "LOAD DATABASE
  FROM sqlite:///root/forgejo.db
  INTO postgresql://forgejo:$POSTGRES_PASSWORD_FORGEJO@postgres/forgejo
  WITH data only, reset sequences, prefetch rows = 10000
  SET work_mem TO '16MB', maintenance_work_mem TO '512MB';" > ./pgloader-command
pgloader ./pgloader-command
i finally set up system-integrated ssh push, but with a twist.
the first problem is having two ssh servers on the same machine: one for the
host, and another for Forgejo. this means one of them had to live on
another port, so i chose to move the host's. but that kind of just sucks to
use. the proper solution is to use a custom config to direct the git user to
the right place.
what made this harder is that i actually had two git instances; the second one
is for temporary infrastructure i'm running for evil inc until the
organization gets dedicated hardware (more about this in a future post). so
even if i routed git specially, it still wouldn't know which git server to go
to.
the solution: write a custom "router" script that generates an
authorized_keys file based on which git instances actually have the key, then
the line within authorized_keys forces a special wrapper which intercepts
the targeted git repository, routing the request to the git instance that
owns it.
the sshd config looks like:
Match User git
  AuthorizedKeysCommand /bin/python /mnt/storage1/apps/home-infra/config/forgejo/ssh/keys.py %u %t %k
  AuthorizedKeysCommandUser git
this is convenient because AuthorizedKeysCommand is run on every connection. it produces a file with zero or one valid keys.
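the real keys.py isn't shown here, but the routing idea can be sketched as a pure function. everything below is a hypothetical illustration (instance names, serv commands, option list), not the actual script:

```typescript
// hypothetical sketch of the ssh key router. a git "instance" knows its
// serv wrapper command and the public keys registered with it.
interface GitInstance {
  name: string;
  servCommand: string; // e.g. that instance's git-serv wrapper binary
  keys: string[]; // authorized public keys, one per registered user
}

// given the key sshd is asking about, emit an authorized_keys file with
// zero or one entries. the matching entry forces the owning instance's
// serv command, so the connection can only reach that instance's repos.
function routeKey(presentedKey: string, instances: GitInstance[]): string {
  for (const instance of instances) {
    const index = instance.keys.indexOf(presentedKey);
    if (index === -1) continue;
    const options = "no-port-forwarding,no-agent-forwarding,no-pty";
    return `command="${instance.servCommand} key-${index}",${options} ${presentedKey}\n`;
  }
  return ""; // unknown key: empty output, sshd denies the connection
}
```

because sshd re-runs the command per connection, key additions in either git instance take effect immediately with no sync step.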
general housekeeping. getting this activity page up. getting the spanish
translation by my good friend trubiso up. things are looking really cozy.
2026-01-12
tags: [album]
i started more music work. i've gotten better at lyric writing, phrasing this
new song as a sort of "adventure". felt for one of the first times that i was
doing worldbuilding in a song. the imagery is that good.
encodeByteStream converts these events into a ReadableStream. by
batching events together, the stream contents remain small; that way the
code that constructs progress nodes doesn't have to worry about calling
many setters at once, as it gets debounced by the serializer. stream
backpressure causes larger time-gaps worth of events to be batched
together (fewer, smaller chunks). this enables servers to respond with
rich progress.
const root = new progress.Root();
doActionWithProgress(root).then(root.end, root.error);

// streaming clients indicate a header
if (req.headers.get("Accept")?.includes(progress.contentType))
  return new Response(progress.encodeByteStream(root), {
    headers: { "Content-Type": progress.contentType },
  });

// to support non-streaming clients
return Response.json(await root.asPromise());
and decodeByteStream on the client:
const output = document.getElementById("output");
const res = await fetch(...);
if (!res.ok) throw ...;

const root = new progress.Root();
root.on("change", (active) => {
  output.innerText = ansi.strip(progress.formatAnsi(
    performance.now(),
    active,
  ));
});
const result = await progress.decodeByteStream(res.body, root);
output.innerText = JSON.stringify(result);
there are currently no document bindings, but i plan to add them.
additionally, a React hook is very trivial to implement for this -- but that
is unplanned for this repository. for transports that require JSON or
UTF-8, there is encodeEventStream, which returns a ReadableStream of
JSON objects that can be compressed at the developer's discretion.
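the batching described above can be sketched as a pure function. this is not the real encodeByteStream internals, just the grouping rule (names and the window parameter are made up for illustration):

```typescript
// hypothetical sketch of the batching rule: while the consumer is busy
// (modeled here as a batching window), consecutive events collapse into
// one chunk, so slow readers get fewer, coarser messages instead of one
// message per setter call.
interface TimedEvent {
  at: number; // milliseconds since stream start
  payload: string;
}

function batchEvents(events: TimedEvent[], windowMs: number): TimedEvent[][] {
  const batches: TimedEvent[][] = [];
  for (const event of events) {
    const current = batches[batches.length - 1];
    // join the current batch while we're within its window,
    // otherwise start a new one
    if (current && event.at - current[0].at <= windowMs) current.push(event);
    else batches.push([event]);
  }
  return batches;
}
```

in the real stream the window isn't fixed: backpressure from the reader effectively widens it, which is where the "larger time-gaps get batched" behavior comes from.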
feat(lib/progress): headless rendering + time estimation
node signaling is done by providing a progress.Root to every node,
dispatching events to it when the node changes. the root is connected to
an observer to construct a UI out of it. there are two apis planned:
- attachToScreen binds a root to a TTY screen (via the log.Widget API).
  the primary use of this is to implement the top level progress.start.
- a serialization system that allows transmitting a Root over a wire.
  this commit was going to include it, but it is an unexpectedly large
  component.
potentially also a browser binding like attachToDocument, but that will
not be added in this patch.
additionally, resolves #33 by implementing estimatedTime
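a sketch of one way estimatedTime could work, assuming roughly linear progress (this is an illustration, not necessarily how the resolved issue implements it):

```typescript
// hypothetical sketch: extrapolate remaining time from elapsed time and
// the completed fraction, assuming progress continues at the same rate.
function estimatedTime(elapsedMs: number, progress: number): number | null {
  if (progress <= 0) return null; // nothing to extrapolate from yet
  if (progress >= 1) return 0; // already done
  // rate = progress / elapsed, remaining work = 1 - progress
  return (elapsedMs * (1 - progress)) / progress;
}
```

real implementations usually smooth the rate over a recent window instead of using the whole elapsed time, since early samples skew the estimate.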
i also did a large part of the work to create a "code todo tracking" tool. i
would say it's about half done, since the second half is simply fixing all of
the little bugs there are. most of this code is currently ai-generated, but
with me manually coming in to write the interfaces and the modular program
architecture; then i synthesize the code and the tests. this was basically
going on ambiently while progress.ts was in progress.
finished the SSO sub-project. i'm happy with the setup i used to protect
internal services, such as pgadmin and qbittorrent. it's a caddy snippet that
i can re-use very easily.
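my actual snippet isn't shown here, but the general shape of a reusable caddy forward-auth block looks like this (the auth service address, verify path, and header names are placeholders, not my config):

```caddyfile
# hypothetical example: a snippet that asks the forward-auth service
# before letting any request through to the protected upstream.
(protected) {
	forward_auth auth-proxy:9091 {
		uri /api/verify
		copy_headers Remote-User Remote-Email
	}
}

pgadmin.example.com {
	import protected
	reverse_proxy pgadmin:80
}
```

the nice part is that protecting a new service is just `import protected` in its site block.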
working on SSO for my internal services. for context, i have about 12
self-hosted services running, half of which i allow my friends to access.
currently, access is done through manually creating an account on each
service (jellyfin, forgejo), though many are gated by a caddy rule instead.
in the interest of making my password manager less confused (ip vs domain,
subdomain, etc.), i'm slowly reducing this setup to a single sign-in page.
to do this, i am using https://keycloak.org, which supports openid connect
(how i will configure forgejo and jellyfin), as well as a separate service
that provides forward-auth proxying (how i protect services like copyparty,
syncthing, pgadmin, and many more). i tried authelia beforehand, but i really
do not recommend it due to how hard it is to configure, passkeys being
annoying to set up, and limited theming. i also don't recommend authentik,
but then i couldn't even figure out how to start using it after i installed
it.
keycloak is a bit stupid about config. since all the config lives in the
postgres database, i can't use a config file to set up the primary realm. so
instead, i have this huge python script that uses the API to upsert the
configuration. this works pretty well, and means that when running the
infrastructure locally for testing, i can get the same config (useful if you
brick keycloak, which is pretty easy to do).
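the real script is python, but the upsert idea is small enough to sketch. the client interface below is a made-up abstraction over keycloak's admin REST API (treat the exact endpoints as an assumption); injecting it keeps the logic testable without a running server:

```typescript
// hypothetical sketch of the config upsert loop. AdminClient wraps the
// keycloak admin REST calls (lookup / create / update) behind a tiny
// interface so the decision logic stays pure.
interface AdminClient {
  exists(path: string): Promise<boolean>;
  create(path: string, body: unknown): Promise<void>;
  update(path: string, body: unknown): Promise<void>;
}

// create-or-update, so rerunning the script always converges on the
// same configuration regardless of what the database already holds
async function upsert(
  client: AdminClient,
  path: string,
  body: unknown,
): Promise<"created" | "updated"> {
  if (await client.exists(path)) {
    await client.update(path, body);
    return "updated";
  }
  await client.create(path, body);
  return "created";
}
```

idempotency is the whole point: the same script bootstraps a fresh local keycloak and repairs a bricked production one.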
i deleted all my github repositories except four: my "readme", a bug
reproduction repo, the mirror for the ts lie detector, and a load-bearing
private repo shared with someone. in this process, i've moved all the
projects to my forgejo instance.
with this, name paint bot, one of my few remaining active projects, moves to
that forgejo instance using their github migrator. some of my private
projects, like my pet scripting language, were migrated as well. everything
feels more alive on my site because of the theming and per-repo icons.
after a year of forgejo, i am really happy with how it treats me.