# nagi miniweb: keeping it stupid
#tech-project
## Background
Yesterday I was thinking about adding a [[Wander|https://codeberg.org/susam/wander]] console to my site and losing enthusiasm by the minute.
It's a cool project, but it's JavaScript-based, and I don't want to add any of that to my site if I can avoid it; I figured if it was *really cool*, I could make an exception. So, I set about trying to extend it to solve a problem raised on the issue tracker: people put the root domain in their site lists instead of a good post. I figured, how hard can it be to use js to fetch the document, look for the feed `<link>`, then fetch the feed and grab the first post? Well, probably not very hard if you're good at Computer, but I kept running into problems. Sites can set security policies that my own browser will betray me by respecting. And it seemed like I might realistically need to pull in a dependency just to parse the RSS, which I didn't want to do, and I was just... slowly losing interest in the whole thing.
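For what it's worth, the discovery half of the idea is easy to sketch in shell, done server-side with curl (no CORS to fight there). The function name and the regexes are my own hack, and regex-scraping HTML is exactly the kind of thing a real parser (the dependency I didn't want) exists to do properly:

```shell
#!/bin/sh
# first_feed_href: read an HTML page on stdin and print the href of the
# first RSS/Atom <link rel="alternate"> tag it finds. Fragile by design;
# a sketch, not a parser.
first_feed_href() {
  grep -oiE '<link[^>]*application/(rss|atom)\+xml[^>]*>' \
    | grep -oE 'href="[^"]*"' | head -n 1 | sed 's/href="//;s/"//'
}

# Usage sketch: curl -sL "$site" | first_feed_href, then fetch that URL
# and take its first <item>/<entry> link as the "good post".
```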
Besides, I didn't even know what sites I would put in my console. That reminded me, I never submitted my site to Kagi Smallweb.[^0] Hey, that's open-source, right? What if I just dump those into my console? It did technically work, but it was slow. And uh... what was even the point? Then I remembered [[a post I'd seen on the bugtracker|https://github.com/kagisearch/smallweb/issues/700]] complaining that KSW doesn't work in terminal browsers. I tested it in a few, and yeah, it really didn't work at all. And it really *ought* to. Hm, I can do this. How hard could it be?
Many more hours later than it really deserved, I had [[nagi miniweb|/app/nagi/]].
## Deets
The smart way to do this would be to have a program running on your server. When you hit "next", it would pull a random URL from the list and insert it into the document it serves you. The problem is, I don't have easy access to that ability on my current host. And, just like JavaScript, I'd like to minimize dynamic content in general, so that I can easily pick up and move to a new host, consider hosting it myself without security issues, etc. So I thought, hey, what if I do it the stupid way? I can just pre-generate all those documents that would normally be generated on the fly. Each one can point to the next site in a circular linked list. It would take up an absurd amount of space, but like... how much?
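The stupid way can be sketched in a few lines of shell: page N embeds site N and links to page N+1, wrapping at the end, so the whole thing is a circular linked list made of files. `sites.txt` (one URL per line) and the bare-bones markup are stand-ins for the real thing:

```shell
#!/bin/sh
# gen_pages: pre-generate one static page per site in sites.txt.
# pages/N.html links to pages/N+1.html, and the last page wraps back
# to the first -- a circular linked list, no server logic required.
gen_pages() {
  total=$(wc -l < sites.txt)
  mkdir -p pages
  n=1
  while read -r url; do
    next=$(( n % total + 1 ))   # wrap around after the last entry
    {
      printf '<a href="%s.html">next</a>\n' "$next"
      printf '<iframe src="%s"></iframe>\n' "$url"
    } > "pages/$n.html"
    n=$(( n + 1 ))
  done < sites.txt
}

if [ -f sites.txt ]; then gen_pages; fi
```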
Then I remembered one dynamic feature I *do* have access to: server-side includes. Well, there we go, problem solved! I'll just abuse those to compress each entry down to the bare minimum. It's still dynamic, but it's a very simple, widespread feature, and without it, this stupid project is DOA. A few shell scripts later and I had my answer to how much space it would take up: about 120MB. That's pretty horrible, but [[Web 1.0 Hosting|https://web1.0hosting.net/]] gives me 500MB, and I'm only using like 30. Fuck it, we'll do it live.[^1]
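With SSI doing the heavy lifting, each pre-generated entry shrinks to a couple of lines: the per-site parts stay in the file, and everything shared gets pulled in server-side. Something like this hypothetical entry (the include paths and markup are made up, not my actual files):

```html
<!--#include virtual="/nagi/top.html" -->
<a href="0042.html">next</a>
<iframe src="https://example.site/"></iframe>
<!--#include virtual="/nagi/bottom.html" -->
```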
It worked, but I had three glaring problems:
1. A bunch of the sites would display a security error instead of loading.
2. It still didn't work in any terminal browsers. I turned on iframe support in elinks and that just caused it to crash.
3. Most of the sites required https, so it wasn't very useful in old browsers.
To solve #1, I decided to stick with stupidity since it had served me well enough so far. I looped through the entire list of websites, curled the headers, and grepped them for the anti-framing rules. Then I went to bed.
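That loop is roughly this (the filename and curl flags here are my reconstruction, not the exact script): fetch only the response headers for each site and flag anything carrying an anti-framing rule, meaning an `X-Frame-Options` header or a CSP `frame-ancestors` directive:

```shell
#!/bin/sh
# blocks_framing: read HTTP response headers on stdin; succeed if they
# contain an anti-framing rule (X-Frame-Options, or a Content-Security-
# Policy frame-ancestors directive).
blocks_framing() {
  grep -qiE 'x-frame-options|frame-ancestors'
}

# Walk the site list (one URL per line) and print the sites that
# refuse to be framed, headers only, no bodies fetched.
if [ -f sites.txt ]; then
  while read -r url; do
    if curl -sIL --max-time 10 "$url" | blocks_framing; then
      echo "$url"
    fi
  done < sites.txt
fi
```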
When I woke up, it turned out that about 20% of the websites on the list forbid framing. Eh, that's acceptable, let's just delete them. (I'm not sure how Kagi handles this. I was surprised they had *any* on the list that forbid framing, much less such a large number. I guess I could look at the code, but I don't feel like it.)
For #2, when I fruitlessly enabled iframes in elinks, I noticed that it claimed to fully support *frames* (no i). This is really arcane, crusty stuff.[^ref] You don't put them in your `