Many moons ago I was on a constrained internet connection -- I set up a repeater by hanging an old phone over my curtains so it could catch Wifi from the cafe across and connected to the phone's internet over bluetooth.
I had like 2KB/s.
This made most of the internet unusable, but it turns out the parts I care about are text. So I just browsed it through a text browser.
This didn't really work either, because it turns out web protocols don't work very well over 2KB/s.
So I browsed the internet by connecting to a $1 VPS (very fast internet!) over Mosh (which is like SSH, but more efficient and resilient). So that way, it would only send the actual bytes of text to me.
I mostly browsed HN and the linked articles at that point.
The browser that rendered HN the best in those days was w3m. I remember it had indentation and even mouse / scrolling support. I tried lynx as well and it was good, but I went with w3m in the end.
I see w3m hasn't been updated in 15 years, but it's probably still better for reading HN, whose UI hasn't changed for longer than that! I will have to give them both a spin :)
Yeah mosh works really well in those kinds of scenarios. My provider once had an outage where they dropped 50% of all packets, which rendered most of the internet completely unusable. I was able to connect to my VPS via mosh (it took 6 attempts since it uses SSH for the initial handshake), and then mosh + w3m worked essentially the same as if no packet drops even existed. Feels like magic
> set up a repeater by hanging an old phone over my curtains so it could catch Wifi from the cafe across and connected to the phone's internet over bluetooth.
I had no idea this was possible. Can you explain why this works? Sounds fascinating.
Basically you are making a network connection via Bluetooth. Depending on the OS you run, there's probably a guide such as [1] for setting up Bluetooth via the modem adapter to act as an internet connection.
Of course it relies on both devices being able to create and maintain a good Bluetooth connection.
It would be nicer if phones could run two wifi networks at the same time, allowing a mix of leeching and hotspot use, but I guess in practical terms it's a one-in-a-thousand type of demand.
For HN there's gopher://hngopher.com and for the main web, links preserves the cascaded formatting for threads. Yes, I used mosh with 2.7 KB/s too, against a pubnix, but I used gopher too for tons of services.
Thanks to bitlbee I could talk with my SO and relatives on Telegram, and https://brutaldon.org allowed me to read posts on Mastodon.
I still used Lynx as my default browser while working on ships until 2020. Satellite internet connections at sea were slow and very expensive which made Lynx a good choice. But it turned out that the text-based, distraction-free browsing could be a better experience than the same site in a modern browser. And a few sites still serve text versions, like text.npr.org. I liked Lynx enough that I would still use it back on land until the habit faded.
It is unfortunate that modern web development has led to websites so complex that they either break entirely or look terrible in text-based browsers like Lynx. Take Mastodon, for example:
$ lynx https://mastodon.social/
[…]
To use the Mastodon web application, please enable JavaScript.
Alternatively, try one of the native apps for Mastodon for your
platform.
The C2 Wiki does not load either:
$ lynx https://wiki.c2.com/
[…]
javascript required to view this site
why
To their credit, at least they use the <noscript> tag to display the above notices. Some websites don't even bother with that. But there are many old school websites that still load fine to varying degrees:
lynx https://danluu.com/ # Mostly okay but some needed spaces missing
lynx https://en.wikipedia.org/ # Okay, but a large wall of links on top
lynx https://irreal.org/blog/ # Renders fine
lynx https://libera.chat/ # Mostly fine
lynx https://news.ycombinator.com/ # Of course!
lynx https://sachachua.com/ # Mostly fine
lynx https://shkspr.mobi/ # Renders really well
lynx https://susam.net/ # Disclosure: This is mine
lynx https://norvig.com/ # A classic!
lynx https://nullprogram.com/ # Also pretty good
If you have more examples, please comment, and I'll add them to this list in the two hour edit window I have.
While JavaScript has its place, I believe that websites that focus on delivering primarily text content could prioritise working well in TUI browsers. Sometimes testing it with text-based browsers may even show fundamental issues with your HTML. For example, several times, I've seen that multiple navigation links next to each other have no whitespace between them. The links may appear like this:
HomeBlogRSSAboutCodebergMastodon
Or, in a list of articles, dates and titles may appear jammed together:
14 Mar 2025The Lost Art of Dual Booting
15 Mar 2025Some Forgotten Features of Gopher
16 Mar 2025My Favourite DOS Games
The missing spaces aren't obvious in a graphical browser due to the CSS styling hiding the issue, but in a text-based one, the issue becomes apparent. The number of text-based web users may be shrinking, but there are some of us who still browse the web using tools like lynx, w3m, and M-x eww, at least occasionally.
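A quick way to catch those jammed links before publishing (my own sketch, not something from the comment above): search the HTML for an anchor that closes immediately before the next one opens.

```shell
# Each match below is a pair of adjacent links with no whitespace between
# them -- exactly the markup that renders as "HomeBlogAbout" in lynx or w3m.
html='<nav><a href="/">Home</a><a href="/blog">Blog</a><a href="/about">About</a></nav>'
printf '%s\n' "$html" | grep -o '</a><a' | wc -l
```

That counts two jammed pairs here; putting a space, newline, or list markup between the anchors drops the count to zero and fixes the text-mode rendering.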
Unlike computer interfaces, the web was never text-first. It was graphical from the start. The first browser was in a GUI, not a terminal.
Sites have been hobbled/broken on Lynx since the very beginning. It's neat and can be convenient to have a browser that works in your terminal for simple stuff, but the web was never designed for that. It's natural and to be expected that many sites will break. The burden is really on Lynx to do what it can to support sites as they are, rather than sites to try to build compatibility with Lynx.
It's kind of like, there are programs to "view" a PDF in the terminal, or at least its text content. But PDF authors shouldn't be expected to optimize the order text is presented for those programs. That's not what PDF was ever meant for, even if you can get it to work sometimes.
Given the web’s much wider remit than pdf, it has support for accessibility tools and much better non-visual handling than pdf, so the comparison isn’t entirely fair I think.
If a website doesn’t handle lynx well, there’s a good chance it doesn’t handle accessibility well either.
> It's natural and to be expected that many sites will break.
There is nothing "natural" about software development at all. It was an active choice to hobble the internet as a browser to sell ads via interactive apps.
Images and other media were supported from the start, they just opened in separate windows instead of being inline.
The first browser ran on NeXT, graphically. It did not grow out of the terminal. And the very first publicly formalized definition of HTML in 1993 already included the img tag.
Back in the day my 28.8k modem came bundled with a book on HTML, which is how I learned to make my first personal web site.
Even back then, the book recommended testing your web site in Lynx for two reasons:
1. Web sites are supposed to gracefully degrade when viewed in browsers without support for advanced features.
2. Accessibility matters, and while most of us don't have access to screen readers or know how to operate them, if we can comfortably view and navigate a web site in Lynx there's a pretty decent chance that it'll be usable with a screen reader.
It's been ~30 years since then and those reasons still apply just as well. For the vast majority of web sites which do not have any need to be interactive webapps there's not really any good reason for it not to be perfectly usable in a text-only browser, and if it's both readable and navigable in a text browser it should also be with a screen reader.
By the time it caught on, HTML did allow for very graphical web pages, and that's not even considering how popular Flash Player was when the internet had its initial growth spurt. Technically, very early versions of HTML, especially predating CSS, were closer to epub than modern HTML, but there were so few web pages it's not a meaningful argument against your supposition.
I think what's more important is that a significant portion of web pages have no more complex layout than a newspaper or slideshow, so why not make them easy to parse? Not only would it make browsing in Lynx easier, but it would work well with screen readers, which are the only way some people can browse web pages.
I think a significant difference is that in the early days, the content was predominantly text, with styling/images/multimedia to embellish the content. But today it feels like a large proportion of websites put the embellishments first, the text content is thin and you often have to hunt it out.
Of course the web has evolved and has uses other than reading/absorbing information (some of them great) and multimedia content is valid, but it does seem to have become harder to find substance in amongst all the style.
When I'm surfing the web it's still usually words that I'm looking for. I think that may be going out of fashion.
Wasn't lynx around before the graphical browsers? I remember first using the web through a VAX terminal and lynx. You could download images but had to launch a viewer.
I think you two are talking past one another; you're both correct from my point of view. OP is correct that allowing websites to script their own rendering, by enacting standards that permitted it, was a mistake. You are also correct that the web was visual from the start.
The early web was not terminal based but that doesn't automatically mean it was "graphical". HTML was meant to be processed by any number of different user agents. A lot of early HTML tags were semantic in nature in order to convey the intent of the element. Conveying intent allows non-graphical browser, meaning everything from spiders to screen readers to AI agents, to use a page with the same capability.
People abusing HTML for the purpose of styling is the whole reason for CSS existing. A well written HTML document should have a very clean structure. The CSS has the ability to do all the crazy graphical styling. The old CSS Zen Garden was an amazing demonstration of that, an incredibly well structured and mostly semantic HTML document could have any number of crazy styles only by varying the CSS.
Bullshit HTML loaded with bullshit CSS frameworks generated by megabytes of bullshit JavaScript is a complete failure of web devs to master the medium. Unless a web page is absolutely reliant on a graphical interface (Google Maps, a game, etc) there's no reason that it shouldn't render passably in lynx. Even in those cases it should have noscript guardrails to provide at least an explanation as to why it can't work.
I have to add https://nullprogram.com, just because of the care the author took to have it work better in lynx[1]:
> Just in case you haven’t tried it, the blog also works really well with terminal-based browsers, such as Lynx and ELinks. Go ahead and give it a shot. The header that normally appears at the top of the page is actually at the bottom of the HTML document structure. It’s out of the way for browsers that ignore CSS.
Which works perfectly, including navigation (next/prev/parent). The perfect way to use javascript to enhance a site (collapsing threads etc) but not require it.
HN is hosted on a single machine in a colo somewhere (with a backup elsewhere), yet has far more value than the majority of sites 100 times as complex.
Because HN's value is the value of its comments, and those are a scarce resource. Making a great website (for whatever is your definition of great) doesn't guarantee that it will become valuable.
All this to say, HN shouldn't be an example to blindly follow.
When I started using the WWW in 01992 the majority of Web users were probably using text-based browsers, and specifically Lynx, because that was what the University of Kansas was using for its campuswide information service (CWIS). Mosaic didn't exist yet, and most people accessing the internet were using either dumb terminals like I was (typically in my case a VT-100 or CIT-101 clone of it) or dialup terminal emulators like Procomm+.
I made an e-commerce platform that has zero JavaScript. It is PHP only. Additionally, cgit uses JavaScript for updating idle time, but you do not need it, just refresh the page.
But yeah, I wish people were more hesitant about overusing JavaScript.
> The missing spaces aren't obvious in a graphical browser due to the CSS styling hiding the issue, but in a text-based one, the issue becomes apparent.
Why would this be considered "an issue" or "a problem in your HTML"? TUI browsers are really a fun novelty and not much more; I'd be shocked if even the largest sites in the world receive more than 1000 visits per day from lynx or eww or all the others combined. Unless you have a compelling reason to think that your site will be used by terminal browser fans, there is no reason whatsoever to care about how your HTML renders without CSS. Even screen readers would not have problems properly reading links not separated by spaces.
I agree with your javascript complaints, and I use a graphical browser.
I do prefer to surf with js disabled, and in most cases it actually works pretty well.
But the lack of a non-JS Mastodon has pretty much stopped me from reading posts on the system. I can surf GitHub, and this site, with no JS, but Mastodon is a no-go.
The conversion of the internet into a scam ad distribution system is the primary culprit behind the massive proliferation of JS, along with the use of overly complex "frameworks" for what could often be static HTML. I don't know what Mastodon's excuse is...
Very early in my Linux days in the early 2000s I was bound and determined to learn how to use Lynx as I thought the skill would be a necessity for maintaining servers. Being able to look up issues online and what not.
Little did I realize that 99% of the time I would be SSHed in from a full desktop with a standard browser, and Lynx has just been kind of a fun novelty for me.
Many mobile devices render pages in a virtual window aka viewport, which is wider than the screen, and then shrink the rendered result down so it can all be seen at once.
Mobile browsers could stop doing that any time they want. They do it because pages not optimized for mobile often break on mobile.
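For reference, the standard opt-out is the viewport meta tag (general HTML practice, not something the commenters spelled out):

```html
<!-- Lay out at the real device width instead of rendering into a wide
     virtual viewport and shrinking the result. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```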
This 'shit-sifting' phenomenon is common in open protocols with lots of software and inertia.
1. Bad shit in the other end breaks this end.
2. Fix it with hack in this end.
3. Good shit in the other end is now bad shit with the fix.
4. Add workaround to make good shit good again.
(Microsoft Internet Explorer was born after Bill Gates held a seance and Satan taught him to use this phenomenon to corrupt the internet.)
I maintained it for a while, then delegated the DNS to someone else, but they didn't maintain it either, swapped it back. ~I'll update it when I get a chance.~
edit: Updated with the correct version and some small HTML tweaks
Probably Fantasque, because I've (to the best of my memory) never installed Comic Sans (and I made the screenshot, obvs.) but I do occasionally use Fantasque for terminals.
But since the screenshot needs updating, I'm open to suggestions for what font to use this time.
Also works with the Android "Stoutner Privacy Browser" with JavaScript, cookies, and DOM storage turned OFF.
Went straight to the resources page, where there are versions for a lot of different OSes.
Nothing for Android, but I wasn't really expecting to find that anyway.
Should we blame an old timey basic webpage for its lack of complexity or should we blame a modern browser for not accommodating the web in its most simple form?
There is a specification that says how it should be rendered. I definitely don't want every browser to decide how to render my webpage, that would just make development so much harder and complex.
I'm old enough to remember showing up to a new HS on a college campus in 1996 and our computer lab being on VAX.
We had official school pages on gopher (!!!) and the www browser was lynx.
To this day I install it on every new machine I get, especially laptops. Just in case I have to find some information on almost zero bandwidth. I don't recall having to use it, maybe once or twice in 25 years max.
Years ago (like 2013), I had an actual use case for lynx, which was that I was staying at a hotel long-term and I couldn't access the Wi-Fi landing page from my browser for some reason. But I could hit it from lynx, so I'd just log in from there every day.
Never had to do that since, but it sure saved my ass back then...
A browser which you can run inside your Vim! Use it all the time in a separate tab for reading compatible websites, including HN and some documentation, very handy!
Always dreamed about a JavaScript engine that could render to the CLI, though...
Oh, this brings back memories of my first steps with Gentoo Linux. When I failed at setting up the display (XFree86 back then) or configuring it properly, I remember browsing Gentoo wiki pages with Lynx to bring it back.
Just tried accessing my personal website[1] with Lynx and it works pretty well. Granted, Lynx is pretty cumbersome compared to other browsers, but like other commenters have said, it's a good tool to have in events where you need to access a particular website and have next to no internet connection.
EDIT: One thing I am curious about, is there documentation on how to make one's website "lynx friendly". Going through my website it's pretty clean but there are a few areas (like my recipes) that could use adjustment.
I wish there was a Lynx-like (or, even better, Edbrowse-like) web browser, but powered by something like headless Chromium underneath.
This way, you could have an extremely low-resource user terminal and/or a laptop on an extremely constrained connection, and still be able to use a modern web by connecting to a more powerful server.
You could even share such servers between users. Because people aren't all using the web at the same time, you could actually utilize that server capacity a lot more than you can do with laptops.
It would be even better integrated with an LLM (especially with extremely slow / unreliable / high latency connections).
As a web developer, everything is in the editor. A super-fast way to preview my web page changes is:
lynx --dump localhost:8000/mypage.html
Put that in a loop. My server or frontend updates appear immediately and I don't have to mess with triggering an external browser or faffing about. Wonderful tool!
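One concrete shape for that loop (a sketch only; it assumes lynx is on PATH, the dev server is on localhost:8000, and `mypage.html` is just the example name from above):

```shell
# One-shot text preview of the page being edited.
preview() { lynx --dump "http://localhost:8000/mypage.html"; }

# "Put that in a loop": re-render every 2 seconds until interrupted.
# Roughly equivalent to: watch -n 2 'lynx --dump http://localhost:8000/mypage.html'
preview_loop() { while sleep 2; do clear; preview; done; }
```

Run `preview_loop` in a spare terminal while editing.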
Like most people my age, my first Internet experience at home was with lynx on a Unix shell. We shared credentials for an account at the school board that we scored from a teacher's aide.
It was sweet to dial into the school board (same numbers all the teachers used!) and surf around in text only lynx.
For a while I was using https://habilis.net/lynxlet/ in MacOS to get Lynx as a 'native' browser. It was a nifty wrapper. Sadly lost to time... and 32-bit only. :(
Been using a text-only browser daily for over thirty years now. In the early to mid 90's I used Lynx. For me, it is the worst of all the text-only browsers I have tried. I would never go back to using it.
When I was first exposed to Lynx, I was also working on a project using the LynxOS real-time POSIX OS. To my knowledge, the two aren't related other than by name. I checked a couple of years ago; LynxOS still exists, but under a different name.
Proud user of noscript/basic (x)html browsers here.
Lynx and links (and I wanted to _code_ my own using netsurf libraries).
Restoring noscript/basic (x)html will only happen with hardcore regulation (or "tariffs"/"gigantic fines"... same thing...).
This is critical for the web, since that is what makes developing real-life alternative browsers a reasonable task from many pertinent perspectives.
The current technical landscape of the web is a disaster: a cartel of 2.5 absurdly, grotesquely gigantic web engines, written in the most complex computer language out there, which itself requires a compiler of the same complexity level... and there are only 2 of those, from roughly the same cartel/mob.
It seems that technical interop of the web via a very simple standard, stable in time and good enough to do the job, is a 'competitive' issue of the small vs the big, and should be handled by regulating administrations.
Remember, tons of web sites were noscript/basic (x)html compatible and already doing a more than good enough job... without insane technical dependencies...
Due to zoom's resource-intensity and my non-work laptop's dire lack of resources, I find myself using Lynx with some regularity today. I absolutely love it. Thanks, Lynx devs!
I cannot wait until vision LLMs are cheap and fast enough (run locally, of course) to just browse everything with; then I can return to browsing with lynx, or rather, emacs.
Well-written websites look great in it; poorly-written ones look terrible or are blank pages. That’s the same as with any other text-mode browser.
A lot of sites require Javascript to function, even when their actual functionality does not (pro tip: if you use Javascript to display text, show images or create links, you are doing it wrong). That’s a real shame.
As a general rule, the best sites work great, and the worst ones don’t work at all. This is not entirely a bad thing! It can make web browsing less distracting. I often use a text-mode browser as a primary, and fall back to Firefox when using some Javascript-laden monstrosity.
This response was written from Lynx within a tmux session: Hacker News remains a notable bastion of simplicity, usability, and accessibility. Lynx is my primary browser, precisely because I desire a degree of separation from the "modern" internet. This requires curation. Many sites simply do not work. But this practice gives me space to reconsider what I actually want from the web and what the web wants from me.
Lynx is still valuable even if one doesn't use it to browse the web.
It's been a while, but when large sites would every so often alter their link landscape, via pruning or renaming, Lynx was a useful tool for finding similar pages that might not be displayed by search engines with the given query parameters (non-matching names). More often I'd use Lynx because it was quick to dump most of the links to search through later if I needed to, rather than deploy wget's spider mode.
The problem, for the sites I'm interested in, typically isn't the lack of JavaScript, but how the sites are laid out. Take Wikipedia, the content is perfectly fine in Lynx and Links, but the menus are defined at the top of the page, meaning that you'll need to scroll multiple pages to get to the actual article.
If you have a website with actual content (text), it's pretty easy to ensure that it looks good and works in Lynx, with minimal effort.
> Take Wikipedia, the content is perfectly fine in Lynx and Links, but the menus are defined at the top of the page, meaning that you'll need to scroll multiple pages to get to the actual article.
Wikipedia has a link at the top of the page that is labeled 'jump to content' that skips over all of the menu gunk at the top. Several other sites also do this
Site layout can definitely be an annoyance when browsing in text. More accessible sites will offer a Jump to Content link near the top, which is the case with Wikipedia. In Lynx, hitting ">" to move down a row of links, then the right arrow (or the corresponding vim motion, if configured), brings you adjacent to the article TOC and significantly closer to the article text.
I don't get the appeal of using gopher. Is it just a nostalgic way of looking at the early days of the web or do you really think gopher was better than web? Might wanna read this if it's the latter: https://ils.unc.edu/callee/gopherpaper.htm
Once you use these services (and more) on an Atom N270 netbook or a lesser machine, you'll understand why you would love Gopher, Usenet, IRC (and bridges like Bitlbee), and email.
Tons of JS made lots of serviceable machines obsolete. Even an i3, or an i5, with 2GB of RAM is not enough today to browse the web without a heavy ad blocker like uBlock.
Old machines should not be used for anything serious anyway. It will lack firmware and OS updates, modern exploit mitigations and probably has a high power usage. The main reason gopher died was due to lack of openness in the protocol development. They corrected it in the early 2000's but it was too late by then
Good luck with modern Intel exploits there; I'm running OpenBSD 7.6 on an Atom N270 netbook like a champion. Power? Maybe a bit high for a machine from the late 2000s, but it's not so bad compared to a current powerhouse desktop.
I can play 720p video with no JS needed at all (mpv+yt-dlp), chat with Jabber/IRC/telegram, some of them over Bitlbee. I have locally curated playlists and music from both Jamendo and Internet Archive.
I can program in AWK, C and Go. I can plot stuff faster with AWK+GNUplot than most people battling with CUDA, Python and months of GPU usage drawing more power for a mundane task than a quality beef restaurant in a month.
I have several services routed from the web to Gopher, even Internet Archive.
I can read the news using laughable bandwidth and CPU on the remote servers.
The average desktop today uses far more power with local and remote services in a month than my machine and the Gopher servers over a year.
[1] https://www.lifewire.com/internet-on-laptop-with-a-bluetooth...
These days, I wonder if it'd be better to just sic an LLM API on it and have it stream the text summary back to you?
And, well, gopher://magical.fish is unbeatable.
You would love gopher with gopher://magical.fish and gopher://sdf.org among others such as gopher://gopher.icu
Oh, and reddit: gopher://gopherddit.com
HN: gopher://gopherddit.com
But, as they stated, connecting to a public Unix (pubnix) over mosh was and is magic.
You can render JS only websites using chromium headless like this:
chromium --headless example.com --disable-gpu --run-all-compositor-stages-before-draw --dump-dom --virtual-time-budget=10000 --window-size=800,600 | sed "s|<head>|<head><base href=example.com>|g" | lynx -stdin
Good idea, though Firefox headless would be my choice (I think all Chromium-based browsers should be boycotted).
Check browsh out then.
Is it really so bad?
No it wasn't. The img element was not added to the standard until HTML 2.0.
Images and other media were supported from the start, they just opened in separate windows instead of being inline.
The first browser ran on NeXT, graphically. It did not grow out of the terminal. And the very first publicly formalized definition of HTML in 1993 did already include the img tag:
https://www.w3.org/MarkUp/draft-ietf-iiir-html-01.txt
OP never said that the web was terminal based. He said it was text-based. Text should render fine either in a terminal or on a graphical canvas.
That doesn't necessarily hold true with rich text, though...
Heading formats can be well represented on a VT320. HTML didn't have colored text at the start.
Back in the day my 28.8k modem came bundled with a book on HTML, which is how I learned to make my first personal web site.
Even back then, the book recommended testing your web site in Lynx for two reasons:
1. Web sites are supposed to gracefully degrade when viewed in browsers without support for advanced features.
2. Accessibility matters, and while most of us don't have access to or know how to operate screen readers if we can comfortably view and navigate a web site in Lynx there's a pretty decent chance that it'll be usable with a screen reader.
It's been ~30 years since then and those reasons still apply just as well. For the vast majority of web sites which do not have any need to be interactive webapps there's not really any good reason for it not to be perfectly usable in a text-only browser, and if it's both readable and navigable in a text browser it should also be with a screen reader.
By the time it caught on, HTML did allow for very graphical web pages, and that's not even considering how popular Flash Player was when the internet had its initial growth spurt. Technically, very early versions of HTML, especially predating CSS, were closer to epub than modern HTML, but there were so few web pages it's not a meaningful argument against your supposition.
I think what's more important is that a significant portion of web pages have no more complex layout than a newspaper or slideshow, so why not make them easy to parse? Not only would it make browsing in Lynx easier, but it would work well with screen readers, which are the only way some people can browse web pages.
I think a significant difference is that in the early days, the content was predominantly text, with styling/images/multimedia to embellish the content. But today it feels like a large proportion of websites put the embellishments first, the text content is thin and you often have to hunt it out.
Of course the web has evolved and has uses other than reading/absorbing information (some of them great) and multimedia content is valid, but it does seem to have become harder to find substance in amongst all the style.
When I'm surfing the web it's still usually words that I'm looking for. I think that may be going out of fashion.
Wasn't Lynx around before the graphical browsers? I remember first using the web through a VAX terminal and lynx. You could download images but had to launch a viewer.
Nexus predates Lynx.
I think you guys are talking past one another. You are both correct, from my point of view. OP is correct that allowing websites to script their rendering, by enacting standards that allowed such a thing, was a mistake. You are also correct that the web was visual from the start.
The early web was not terminal based but that doesn't automatically mean it was "graphical". HTML was meant to be processed by any number of different user agents. A lot of early HTML tags were semantic in nature in order to convey the intent of the element. Conveying intent allows non-graphical browser, meaning everything from spiders to screen readers to AI agents, to use a page with the same capability.
People abusing HTML for the purpose of styling is the whole reason for CSS existing. A well written HTML document should have a very clean structure. The CSS has the ability to do all the crazy graphical styling. The old CSS Zen Garden was an amazing demonstration of that, an incredibly well structured and mostly semantic HTML document could have any number of crazy styles only by varying the CSS.
Bullshit HTML loaded with bullshit CSS frameworks generated by megabytes of bullshit JavaScript is a complete failure of web devs to master the medium. Unless a web page is absolutely reliant on a graphical interface (Google Maps, a game, etc) there's no reason that it shouldn't render passably in lynx. Even in those cases it should have noscript guardrails to provide at least an explanation as to why it can't work.
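A minimal sketch of that separation (hypothetical markup, not taken from Zen Garden itself): the document stays semantic, so any user agent, from lynx to a screen reader, can process it, while all styling lives in a swappable stylesheet.

```html
<!-- Clean, semantic structure: renders fine in lynx as-is -->
<article>
  <h1>Why semantics matter</h1>
  <p>Content first; presentation is layered on separately.</p>
  <nav>
    <ul>
      <li><a href="/essays">Essays</a></li>
      <li><a href="/about">About</a></li>
    </ul>
  </nav>
</article>
<!-- All the "crazy graphical styling" comes from CSS, swappable
     without touching the markup:
     <link rel="stylesheet" href="any-theme.css"> -->
```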
I have to add https://nullprogram.com, just because of the care the author took to have it work better in lynx[1]:
[1] https://nullprogram.com/blog/2017/09/01/

And of course https://news.ycombinator.com/
Which works perfectly, including navigation (next/prev/parent). The perfect way to use javascript to enhance a site (collapsing threads etc) but not require it.
HN is hosted on a single machine in a colo somewhere (with a backup elsewhere), yet has far more value than the majority of sites 100 times as complex.
Because HN value is the value of the comments, and those are a scarce resource. Making a great website (for whatever is your definition of great) doesn't guarantee that it will become valuable.
All this to say, HN shouldn't be an example to blindly follow.
For Mastodon, https://brutaldon.org works with the 'old login form'.
For the c2wiki, there are clones of it which work with plain text, but I can't remember the alternative domain. You can DDG/GG it, tho.
Brutaldon[0] is a Mastodon UI that (allegedly) works with Lynx. Which is something.
[0] https://brutaldon.org/about
> The number of text-based web users may be shrinking
I wouldn't be surprised if it's growing in absolute numbers, in relative numbers it stays at essentially 0% where it always was.
When I started using the WWW in 01992 the majority of Web users were probably using text-based browsers, and specifically Lynx, because that was what the University of Kansas was using for its campuswide information service (CWIS). Mosaic didn't exist yet, and most people accessing the internet were using either dumb terminals like I was (typically in my case a VT-100 or CIT-101 clone of it) or dialup terminal emulators like Procomm+.
I made an e-commerce platform that has zero JavaScript. It is PHP only. Additionally, cgit uses JavaScript for updating idle time, but you do not need it, just refresh the page.
But yeah, I wish people were more hesitant about overusing JavaScript.
> The missing spaces aren't obvious in a graphical browser due to the CSS styling hiding the issue, but in a text-based one, the issue becomes apparent.
Why would this be considered "an issue" or "a problem in your HTML"? TUI browsers are really a fun novelty and not much more; I'd be shocked if even the largest sites in the world receive more than 1000 visits per day from lynx, eww, and all the others combined. Unless you have a compelling reason to think that your site will be used by terminal-browser fans, there is no reason whatsoever to care about how your HTML renders without CSS. Even screen readers would have no problem correctly reading links not separated by spaces.
I agree with your javascript complaints, and I use a graphical browser.
I do prefer to surf with js disabled, and in most cases it actually works pretty well.
But the lack of a non-JS Mastodon has pretty much stopped me from reading posts on the system. I can surf GitHub, and this site, with no JS, but Mastodon is a no-go.
The conversion of the internet into a scam-ad distribution system is the primary culprit behind the massive proliferation of JS, along with the use of overly complex "frameworks" for what could often be static HTML. I don't know what Mastodon's excuse is...
Very early in my Linux days in the early 2000s I was bound and determined to learn how to use Lynx as I thought the skill would be a necessity for maintaining servers. Being able to look up issues online and what not.
Little did I realize that 99% of the time I would be SSHed in from a full desktop with a standard browser, and Lynx has just been kind of a fun novelty for me.
I miss websites that look like lynx's: https://lynx.browser.org/
Missing one line to look good in mobile.
Added. Makes the text better but now the screenshot is too large. Bloody HTML.
I prefer to add
What does that do, and why is it not the default?
Many mobile devices render pages in a virtual window aka viewport, which is wider than the screen, and then shrink the rendered result down so it can all be seen at once.
Mobile browsers could stop doing that any time they want. They do it because pages not optimized for mobile often break otherwise.
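The "one line" under discussion is presumably the standard viewport meta tag (an assumption; the author doesn't quote it), which opts a page out of that virtual-viewport scaling:

```html
<!-- Lay the page out at the device width instead of a ~980px virtual
     viewport, so text is readable without pinch-zooming -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```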
This 'shit-sifting' phenomenon is common in open protocols with lots of software and inertia.
1. Bad shit in the other end breaks this end.
2. Fix it with hack in this end.
3. Good shit in the other end is now bad shit with the fix.
4. Add workaround to make good shit good again.
(Microsoft Internet Explorer was born after Bill Gates held a séance and Satan taught him to use this phenomenon to corrupt the internet.)
That landing page seems unmaintained, I think this is the main home page: https://lynx.invisible-island.net/
> That landing page seems unmaintained.
I maintained it for a while, then delegated the DNS to someone else, but they didn't maintain it either, swapped it back. ~I'll update it when I get a chance.~
edit: Updated with the correct version and some small HTML tweaks
This says:
> Access Denied - Sucuri Website Firewall
...
> Block reason: Access from your Country was disabled by the administrator.
For that reason I don't think it's a good page to recommend.
The font in the screenshot is Cosmic Sans [sic],
https://github.com/gregkh/cosmic-sans-neue
which was later renamed Fantasque due to hate mail.
> The font in the screenshot is Cosmic Sans
Probably Fantasque because I've (to the best of my memory) never installed Cosmic Sans (and I made the screenshot, obvs.) but I do occasionally use Fantasque for terminals.
But since the screenshot needs updating, I'm open to suggestions for what font to use this time.
Opted for Atkinson Hyperlegible Mono, Bold, at 20pt.
Do you mean the naming similarity to Comic Sans?
Unreadable on a phone without zooming and panning. A little CSS wouldn’t be a bad thing here.
Perfectly readable in Firefox Mobile. Chrome too. What mobile browser are you using?
Also works with the Android "Stoutner Privacy Browser" with JavaScript, cookies, and DOM storage turned off. Went straight to the resources page, where there are versions for a lot of different OSes. Nothing for Android, but I wasn't really expecting to find that anyway.
You mean unreadable on mobile due to tiny text?
Should we blame an old timey basic webpage for its lack of complexity or should we blame a modern browser for not accommodating the web in its most simple form?
We can walk and chew gum? Also, how much complexity do you think is needed to fix this by the webpage?
It would be up to the browser to render HTML sensibly. Use a browser that can.
There is a specification that says how it should be rendered. I definitely don't want every browser to decide how to render my webpage, that would just make development so much harder and complex.
It's perfectly readable on Brave for Android. The text even wraps to the screen size so you don't have to scroll.
Which phone browser renders it in an unreadable manner?
Any phone browser that rendered the page before they added a simple fix
I'm sorry to hear that pinch-to-zoom is hard on your fingers
Looked fine for me on mobile.
I use it for two things:
* saving webpages as text with the links nicely organized at the bottom, and
* calling it from mutt (MUA) to display HTML parts of mail messages.
It works great and it's consistent.
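Both uses can be sketched concretely (hypothetical config; the commenter's exact setup isn't shown). The first is just `lynx -dump`; the second is a mailcap entry that mutt picks up:

```
# Use 1: save a page as text, links collected at the bottom:
#   lynx -dump https://example.com/ > page.txt

# Use 2: ~/.mailcap entry so mutt renders HTML mail parts through lynx
text/html; lynx -dump -force_html %s; copiousoutput

# ...paired with this in ~/.muttrc so it happens automatically:
#   auto_view text/html
```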
I'm old enough to remember showing up to a new HS on a college campus in 1996 and our computer lab being on VAX.
We had official school pages on gopher (!!!) and the www browser was lynx.
To this day I install it on every new machine I get, especially laptops. Just in case I have to find some information on almost zero bandwidth. I don't recall having to use it, maybe once or twice in 25 years max.
But it's there if I need it.
Years ago (like 2013), I had an actual use case for lynx, which was that I was staying at a hotel long-term and I couldn't access the Wi-Fi landing page from my browser for some reason. But I could hit it from lynx, so I'd just log in from there every day.
Never had to do that since, but it sure saved my ass back then...
My first browser. There was another one called links which would display graphics inside the terminal.
https://en.m.wikipedia.org/wiki/Links_(web_browser)
I use http://links.twibright.com for surfing the local folders and lynx to retrieve pages. A little bit of sanity in this crazy world.
A browser which you can run inside your Vim! Use it all the time in a separate tab for reading compatible websites, including HN and some documentation, very handy!
Always dreamed about a JavaScript engine that could render to the CLI, though...
Oh, this brings back memories of my first steps with Gentoo Linux. When I failed at setting up the display (XFree86 back then) or configuring it properly, I remember browsing Gentoo wiki pages with Lynx to bring it back.
Yeah, it was "links" in my case I believe. Not sure.
That and `elinks` were both viable and slightly more layout-focused.
Lynx was great, but w3m + gpm for mouse input + fb for graphics was a revelation.
Just tried accessing my personal website[1] with Lynx and it works pretty well. Granted, Lynx is pretty cumbersome compared to other browsers, but as other commenters have said, it's a good tool to have when you need to access a particular website and have next to no internet connection.
EDIT: One thing I am curious about, is there documentation on how to make one's website "lynx friendly". Going through my website it's pretty clean but there are a few areas (like my recipes) that could use adjustment.
[1]: https://sunny.gg
I wish there was a Lynx-like (or, even better, Edbrowse-like) web browser, but powered by something like headless Chromium underneath.
This way, you could have an extremely low-resource user terminal and/or a laptop on an extremely constrained connection, and still be able to use a modern web by connecting to a more powerful server.
You could even share such servers between users. Because people aren't all using the web at the same time, you could actually utilize that server capacity a lot more than you can do with laptops.
It would be even better integrated with an LLM (especially with extremely slow / unreliable / high latency connections).
There is browsh, which uses Firefox underneath: https://github.com/browsh-org/browsh
And looks like Carbonyl uses Chrome: https://github.com/fathyb/carbonyl
As a web developer, everything is in the editor. A super-fast way to preview my web page changes is:
Put that in a loop. My server or frontend updates appear immediately and I don't have to mess with triggering an external browser or faffing about. Wonderful tool!

Like most people my age, my first Internet experience at home was with lynx on a Unix shell. We shared credentials for an account at the school board that we scored from a teacher's aide. It was sweet to dial into the school board (same numbers all the teachers used!) and surf around in text-only lynx.
I remember a friend at UCLA showed me this fancy-schmancy new Mosaic browser circa 1994 or so.
I said sure that looks nice, but why would I need this, when I have Lynx...plus Pine for newsgroups and e-mail.
For a while I was using https://habilis.net/lynxlet/ in MacOS to get Lynx as a 'native' browser. It was a nifty wrapper. Sadly lost to time... and 32-bit only. :(
Been using a text-only browser daily for over thirty years now. In the early to mid 90's I used Lynx. For me, it is the worst of all the text-only browsers I have tried. I would never go back to using it.
So, what's your favourite text-only browser?
When I first was exposed to Lynx, I was also working on a project using the Lynx realtime Posix OS. To my knowledge, the two aren't related other than by name. I checked a couple of years ago, Lynx OS still exists but under a different name.
Proud user of noscript/basic (x)html browsers here.
Lynx and links (and I wanted to _code_ my own using netsurf libraries).
Restoring noscript/basic (x)html will only happen with hardcore regulation (or "tarif"/"gigantic fines"... same same...).
This is critical for the web, since that makes developing real-life alternative browsers a reasonable task from many pertinent perspectives.
The current technical landscape of the web is a disaster: a cartel of 2.5 absurdly and grotesquely gigantic web engines, written in the most complex computer language out there, which requires a compiler on the same complexity level... and there are only 2 of them, from roughly the same cartel/mob.
It seems that technical interop of the web via a very simple standard, stable over time and good enough to do the job, is a 'competitive' issue of the small vs the big and should be handled by regulating administrations.
Remember, tons of web sites were noscript/basic (x)html compatible and doing a more than good enough job already... without insane technical dependencies...
Due to Zoom's resource intensity and my non-work laptop's dire lack of resources, I find myself using Lynx with some regularity today. I absolutely love it. Thanks, Lynx devs!
Still use it in the terminal when debugging cloud applications. KUTGW!
I cannot wait until vision LLMs are cheap and fast enough (run locally, of course) to just browse everything with; then I can return to browsing with lynx, or rather, emacs.
How usable is Lynx for modern Internet?
I would love a browser I can operate within tmux, but how does it stack against the modern javascript-laden ecosystem?
Well-written websites look great in it; poorly-written ones look terrible or are blank pages. That's the same as with any other text-mode browser.
A lot of sites require Javascript to function, even when their actual functionality does not (pro tip: if you use Javascript to display text, show images or create links, you are doing it wrong). That’s a real shame.
As a general rule, the best sites work great, and the worst ones don't work at all. This is not entirely a bad thing! It can make web browsing less distracting. I often use a text-mode browser as my primary, and fall back to Firefox when using some Javascript-laden monstrosity.
This response was written from Lynx within a tmux session: Hacker News remains a notable bastion of simplicity, usability, and accessibility. Lynx is my primary browser, precisely because I desire a degree of separation from the "modern" internet. This requires curation. Many sites simply do not work. But this practice gives me space to reconsider what I actually want from the web and what the web wants from me.
Lynx is still valuable even if one doesn't use it to browse the web.
It's been a while, but when large sites would every so often alter their link landscape via pruning or renaming, Lynx was a useful tool for finding similar pages that might not be displayed by search engines with the given query parameters (non-matching name). More often I'd use Lynx because it was quick to dump most of the links to search through later if I needed to, rather than deploy wget's spider mode:
lynx -dump -listonly -nonumbers http://foobar.com/ >foobar.txt
The problem, for the sites I'm interested in, typically isn't the lack of JavaScript, but how the sites are laid out. Take Wikipedia: the content is perfectly fine in Lynx and Links, but the menus are defined at the top of the page, meaning that you'll need to scroll multiple pages to get to the actual article.
If you have a website with actual content (text), it's pretty easy to ensure that it looks good and works in Lynx, with minimal effort.
> Take Wikipedia: the content is perfectly fine in Lynx and Links, but the menus are defined at the top of the page, meaning that you'll need to scroll multiple pages to get to the actual article.
Wikipedia has a link at the top of the page that is labeled 'jump to content' that skips over all of the menu gunk at the top. Several other sites also do this
Site layout can definitely be an annoyance when browsing in text. More accessible sites will offer a Jump to Content link near the top, which is the case with Wikipedia. In Lynx, hitting ">" to move down a row of links, then the right arrow (or the corresponding vim motion, if configured), brings you adjacent to the article TOC and significantly closer to the article text.
The Wiki has the answer:
Well, yes. But what's the browsing experience like then?
I appreciate I can just install it and try, but tips for any long time users would be more insightful I think.
elinks is another textmode web browser that supports some javascript. https://github.com/rkd77/elinks/
I use the shell alias "?" for web search with lynx. So: "$ ? search term"
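A hypothetical reconstruction of such a helper (the commenter's actual alias isn't shown; the DuckDuckGo HTML endpoint and the function names are my assumptions). A plain alias can't interpolate arguments, so a function does the work and the alias just names it:

```shell
#!/bin/sh
# Build a search URL from the arguments, then open it in lynx.
search_url() {
    # Join all arguments with '+' (naive encoding; fine for simple terms)
    printf 'https://duckduckgo.com/html/?q=%s' "$(IFS='+'; printf '%s' "$*")"
}

websearch() {
    lynx "$(search_url "$@")"
}

# In ~/.bashrc or similar:
#   alias '?'=websearch
# Then:  ? lynx text browser
```

Note the encoding is deliberately naive; terms containing `&`, `=`, or spaces inside a single argument would need real percent-encoding.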
Can we run it in Chrome?
Well apparently you can compile it to wasm so probably yes?
https://gist.github.com/Potherca/866dfc72de9bfe0fc5627945446...
Recent discussion:
Lynx Browser: The Land That Time Revived (2022)
https://news.ycombinator.com/item?id=43119238
I still use it with:
gopher://hngopher.com (for actual HN I use Links)
gopher://magical.fish (Huge portal)
gopher://tilde.pink/1/~bencollver/ia (Internet Archive)
gopher://sdf.org (Tech blogs and code mainly)
https://neuters.de (news)
https://m.xkcd.com and an external viewer
I don't get the appeal of using gopher. Is it just a nostalgic way of looking at the early days of the web or do you really think gopher was better than web? Might wanna read this if it's the latter: https://ils.unc.edu/callee/gopherpaper.htm
Once you use these services (and more) on an Atom N270 netbook or a lesser machine, you'll understand why you'd love Gopher, Usenet, IRC (and bridges like Bitlbee), and email.
Tons of JS made lots of serviceable machines obsolete. Even an i3 (or i5) is not enough today to browse the web with 2GB of RAM without a heavy ad blocker like uBlock.
Old machines should not be used for anything serious anyway. They lack firmware and OS updates and modern exploit mitigations, and probably have high power usage. The main reason Gopher died was the lack of openness in the protocol's development. They corrected it in the early 2000s, but it was too late by then.
> Modern exploit mitigations
> OS updates
Good luck with modern Intel exploits there; I'm running OpenBSD 7.6 on an Atom N270 netbook like a champion. Power? Maybe a bit high for a machine from the late 2000s, but it's not so bad compared to a current powerhouse desktop.
I can play 720p video with no JS needed at all (mpv+yt-dlp), chat with Jabber/IRC/telegram, some of them over Bitlbee. I have locally curated playlists and music from both Jamendo and Internet Archive.
I can comment on both HN and Mastodon with https://brutaldon.org
I can program in AWK, C, and Go. I can plot stuff faster with AWK+GNUplot than most people battling with CUDA, Python, and months of GPU usage, drawing more power for a mundane task than a quality beef restaurant uses in a month.
I have several services routed from the web to Gopher, even the Internet Archive. I can read the news using laughable bandwidth and CPU on the remote servers.
The average desktop today uses far more power with local and remote services in a month than my machine and the Gopher servers over a year.
Or you can use Dillo browser with the gopher plugin
https://github.com/dillo-browser/dillo-plugin-gopher