# nagi miniweb: keeping it stupid #tech-project

## Background

Yesterday I was thinking about adding a [[Wander|https://codeberg.org/susam/wander]] console to my site and losing enthusiasm by the minute. It's a cool project, but it's JavaScript-based, and I don't want to add any of that to my site if I can avoid it; I figured if it was *really cool*, I could make an exception. So, I set about trying to extend it to solve a problem raised on the issue tracker: people put the root domain in their site lists, instead of a good post. I figured, how hard can it be to use JS to fetch the document, look for the `<link rel="alternate">` feed tag, then fetch the feed and grab the first post?

Well, probably not very hard if you're good at Computer, but I kept running into problems. Sites can set security policies that my own browser will betray me by respecting. And it seemed like I might realistically need to pull in a dependency to parse the RSS, which I didn't want to do, and I was just... slowly losing interest in the whole thing.

Besides, I didn't even know what sites I would put in my console. That reminded me, I never submitted my site to Kagi Smallweb.[^0] Hey, that's open-source, right? What if I just dump those into my console? It did technically work, but it was slow. And uh... what was even the point?

Then I remembered [[a post I'd seen on the bugtracker|https://github.com/kagisearch/smallweb/issues/700]] complaining that KSW doesn't work in terminal browsers. I tested it in a few, and yeah, it really didn't work at all. And it really *ought* to. Hm, I can do this. How hard could it be?

Many more hours later than it really deserved, I had [[nagi miniweb|/app/nagi/]].

## Deets

The smart way to do this would be to have a program running on your server. When you hit "next", it would pull a random URL from the list and insert it into the document it serves you. The problem is, I don't have easy access to that ability on my current host.
And, just like JavaScript, I'd like to minimize dynamic content in general, so that I can easily pick up and move to a new host, consider hosting it myself without security issues, etc.

So I thought, hey, what if I do it the stupid way? I can just pre-generate all those documents that would normally be generated on the fly. Each one can point to the next site in a circular linked list. It would take up an absurd amount of space, but like... how much? Then I remembered one dynamic feature I *do* have access to: server-side includes. Well, there we go, problem solved! I'll just abuse those to compress each entry down to the bare minimum. It's still dynamic, but it's a very simple, widespread feature, and without it, this stupid project is DOA.

A few shell scripts later and I had my answer to how much space it would take up: about 120MB. That's pretty horrible, but [[Web 1.0 Hosting|https://web1.0hosting.net/]] gives me 500MB, and I'm only using like 30. Fuck it, we'll do it live.[^1]

It worked, but I had three glaring problems:

1. A bunch of the sites would display a security error instead of loading.
2. It still didn't work in any terminal browsers. I turned on iframe support in elinks, and that just caused it to crash.
3. Most of the sites required HTTPS, so it wasn't very useful in old browsers.

To solve #1, I decided to stick with stupidity, since it had served me well enough so far. I looped through the entire list of websites, curled the headers, and grepped them for anti-framing rules. Then I went to bed. When I woke up, it turned out that about 20% of the websites on the list forbid framing. Eh, that's acceptable, let's just delete them. (I'm not sure how Kagi handles this. I was surprised they had *any* on the list that forbid framing, much less such a large number. I guess I could look at the code, but I don't feel like it.)

For #2, when I fruitlessly enabled iframes in elinks, I noticed that it claimed to fully support *frames* (no i).
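
For the curious, the stupid pre-generation scheme looks roughly like this. It's an illustrative sketch, not my actual build script: the file names, SSI variables, and nav include path are all made up, and the sample list is two placeholder URLs instead of the real 27,000.

```shell
#!/bin/sh
# Sketch of the pre-generation step: one .shtml page per site, each
# pointing at the next entry so the whole set forms a circular list,
# with SSI pulling in the shared navigation chrome. All paths and
# variable names here are invented for illustration.
mkdir -p out
printf '%s\n' 'https://example.com/' 'https://example.org/' > out/sites.txt
total=$(wc -l < out/sites.txt)
i=0
while read -r url; do
    next=$(( (i + 1) % total ))          # last entry wraps back to 0
    cat > "out/$i.shtml" <<EOF
<!--#set var="site" value="$url" -->
<!--#set var="next" value="$next.shtml" -->
<!--#include virtual="/nav.html" -->
<iframe src="$url"></iframe>
EOF
    i=$((i + 1))
done < out/sites.txt
```

Each generated page is only a few hundred bytes because the shared chrome lives in the one include — that's the whole "compression" trick.
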
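
The overnight anti-framing audit was equally dumb — something in this vein, though the function name and canned headers below are made up for the example; the real loop fed `curl -sI` output for each site on the list into the check:

```shell
#!/bin/sh
# Sketch of the anti-framing check: read HTTP response headers on stdin
# and succeed if they block embedding (X-Frame-Options, or a CSP with a
# frame-ancestors directive). In the real audit, the input came from
# curling each URL on the list.
forbids_framing() {
    grep -qiE '^(x-frame-options:|content-security-policy:.*frame-ancestors)'
}

# Canned response instead of a live request:
printf 'HTTP/1.1 200 OK\r\nX-Frame-Options: DENY\r\n' \
    | forbids_framing && echo 'would be pruned'   # prints "would be pruned"
```
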
Frames are really arcane, crusty stuff.[^ref] You don't put them in your `<body>`; you put them inside a `<frameset>`, which takes the place of your `<body>`. As far as I can tell, you can't have any other content on the page except for frames. So, instead of my elegantly stupid solution of a single file containing navigation and an iframe, I needed two files: one to hold the frameset and one to hold the navigation. This doubled the disk overhead, but... it worked! Not in offpunk, w3m, or lynx... but it does work in elinks and links2.

Problem #3 is unsolved. Only about one in ten sites loads in Netscape or Firefox 4, and you get a lot of complaints about security issues. I could pare the list down to only sites that support HTTP, but I would expect it to be a very short list. There are fewer than 900 listed with that protocol in smallweb.txt, while the full list is about 34,000 sites (about 27,000 after anti-frame pruning). I might give it a try anyway; maybe I could do an insecure-only spinoff version.

I think I could solve the SSL issue — and probably the frame-blocking issue too — with something like [[Retro Proxy|https://github.com/DrKylstein/retro-proxy]], but again, I would need an actual server with actual powers that I pay actual money for. I haven't ruled it out, though. I was already considering renting the cheapest VPS I can find to host Gopher and Gemini versions of the site, and maybe this is enough to push me over the edge. I could probably even do something fancier to eliminate frames entirely, if I'm playing man-in-the-middle anyway: just insert the nav element at the top of the site. I could even clean the sites up using Retro Proxy, [[readability|https://github.com/mozilla/readability]], [[unmerdify|https://codeberg.org/vjousse/unmerdify]], etc.

## Conclusion

If you appreciate nagi, please let me know! I don't collect any information on visitors, so I won't know unless you tell me. If people do like it, it will increase the odds I work on it.

But do *I* like it?
Eh, I dunno. I had to compromise on the size of the navigation frame, so it takes up more space than I'd like in graphical browsers.[^2] And, honestly, as I've been clicking around testing it out, I've realized I don't necessarily like the KSW dataset that much. There's a lot of like... "hello, my name is John Johnson, here's my résumé and a couple of boring technical posts". A lot of the small/smol/old/retro/indie web is like that. Since I've already eliminated 20% of the set anyway, I might start curating it more.

Remember, this actually started out as an effort to add a Wander console to my site. Wander is all about sharing your personalized recommendations with an ad-hoc, non-commercial network. Kagi is a commercial enterprise that searches the entire internet. It might make more sense to drop the KSW element altogether and just make it pnppl's low-tech stumbler.

And you know how I wanted to add KSW-style display of recent posts to my Wander console? An irony of this project is that nagi doesn't have that feature either. It only sends you to the root of the website. ¯\_(ツ)_/¯

On the plus side, you can bookmark the site you're on to save your progress through the list, ensuring you don't see the same site twice without the need for a cookie. Until I rebuild it, at least. That won't happen very often, though, not least because it takes like an hour to build and upload from my 15-year-old Chromebook and shitty internet connection.

[^0]: After I finished scouring my subscriptions for sites that meet the inclusion requirements — I only came up with three, maybe four; everyone uses garbage silos like Substack now — I discovered that my site had already been added two weeks ago. I guess it got pulled in from one of the other [[2026-03-23_soop#Publicity|places I submitted to]].

[^1]: It was around this point that I realized that my efforts to keep the files small were mostly pointless.
Each one was only about 300 bytes, but the filesystem's overhead meant it took up 4KB. Well, at least I can upload it faster. And my host seems to be using a 1KB block size, so it's probably still a win in the end.

[^2]: It worked beautifully at a smaller size in Netscape and Firefox, but then refused to show up in links2 or elinks until I made it bigger.

[^ref]: Helpful site from 2011: https://www.syntaxsandbox.co.uk/learnhtml/frames.html