#Hypertext

Vassil Nikolov (@vnikolov@ieji.de)
2025-05-25

[Explicit and implicit hyperlinks]

@kentpitman @amoroso

> links not just being navigational for the purpose of visible reading order, but hidden sometimes, and needing to be teased out. Deep structure, as it were

Some paper books offer a form of that, where end notes (even nontrivial ones) are not marked in the text.

"The world will end when all words in Wikipedia turn blue."
(Attribution unknown)

#HyperText

2025-05-25

These late 1980s and early 1990s papers reviewed the state of hypertext research and applications, covering systems such as NoteCards, GNU Info, Intermedia, CREF by @kentpitman, HyperCard, and more. They capture the intense activity and exploration around a still young and rapidly evolving field.

A Survey of Hypertext
csis.pace.edu/~marchese/CS835/

State of the Art Review on Hypermedia Issues And Applications
academia.edu/download/11333046

#HyperText #retrocomputing

Kind of an update on my game "All lost souls are human".

Started working on it again.

#blog #blogger #hypertext

m1a1-thesockmonkey.blogspot.co

Jens Oliver Meiert (@j9t@mas.to)
2025-05-12

From the archives:

When to Open Links in a New Tab:

meiert.com/blog/links-in-new-t

#links #usability #hypertext

James Walters :linux: :python: (@jameswalters@fosstodon.org)
2025-05-07

Always use a proofreader.

#html #htmx #hypertext #react

[Image description: Two similar article previews stacked vertically. The top one, titled "Introducing Hyper — A simple React alternative (Developer Preview) (10 minute read)", describes Hyper as a simple markup language for building UIs with clean syntax. The bottom one has the same format but replaces "Hyper" with "HTML" throughout, suggesting a parody or comparison of the two technologies.]

Vassil Nikolov (@vnikolov@ieji.de)
2025-05-04

@kentpitman @screwtape

CREF appears to be related (maybe just a little) to semantic networks.
(With regard to SUMMARIZES and SUMMARIZED-BY.)

#SemanticNetworks
#CREF #LispM #hypertext #history

@screwtape

That's distinguished from the CREF editor that I wrote in 1984, while on leave from my work on the Programmer's Apprentice to do a summer's work at the Open University.

CREF (the Cross-Referenced Editing Facility) was basically made out of spare parts from the Zwei/Zmacs substrate but did not use the editor buffer structure of Zmacs per se. If you were in Zmacs you could not see any of CREF's structure, for example. And the structure that CREF used was not arranged linearly, but existed as a bunch of disconnected text fragments that were dynamically assembled into something that looked like an editor buffer and could be operated on using the same kinds of command sets as Zmacs for things like cursor motion, but not for arbitrary actions.

It was, in sum, a hypertext editor though I did not know that when I made it. The term hypertext was something I ran into as I tried to write up my work upon return to MIT from that summer. I researched similar efforts and it seemed to describe what I had made, so I wrote it up that way.

In the context of the summer, it was just "that editor substrate Kent cobbled together that seemed to do something useful for the work we were doing". So hypertext captured its spirit in a way that was properly descriptive.

This was easy to throw together quickly in a summer because other applications already existed that did this same thing. I drew a lot from Converse ("CON-verse"), which was the conversational tool that offered a back-and-forth of linearly chunked segments like you'd get in any chat program (even including MOO), where you type at the bottom and the text above that is a record of prior actions, but where within the part where you type you had a set of Emacs-like operations that could edit the not-yet-sent text.

In CREF, you could edit any of the already-sent texts, so it was different in that way, and in CREF the text was only instantaneously linear as you were editing a series of chunks, but some commands would rearrange the chunks, giving a new linearization that could again be edited. While no tool on the LispM did that specific kind of trick, it was close enough to what other tools did that I was able to bend things without rewriting the Zwei substrate. I just had to be careful about communicating the bounds of the region that could be edited, and about maintaining the markers that separated the chunks as un-editable, so that I could at any moment turn the seamed-together text back into chunks.
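
A rough sketch of that "momentary illusion", with invented names and in Python rather than CREF's actual Lisp Machine code: chunks are joined into one editable text by reserved separator markers that the editor treats as off-limits, and the text is split back into chunks afterwards.

    # Illustrative only: chunks are presented as one linear text, edited,
    # then recovered, relying on the separator markers staying intact.
    SEPARATOR = "\n<<chunk-boundary>>\n"   # hypothetical un-editable marker

    def assemble(chunks):
        """Present a list of text chunks as a single 'buffer'."""
        return SEPARATOR.join(chunks)

    def disassemble(buffer_text):
        """Turn the seamed-together text back into chunks."""
        return buffer_text.split(SEPARATOR)

    chunks = ["First fragment.", "Second fragment.", "Third fragment."]
    text = assemble(chunks)
    # an edit that leaves the separators alone:
    text = text.replace("Second fragment.", "Second fragment, revised.")
    assert disassemble(text) == ["First fragment.",
                                 "Second fragment, revised.",
                                 "Third fragment."]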

Inside CREF, the fundamental pieces were segments, not whole editor buffers. Their appearance as a buffer was a momentary illusion. A segment consisted of a block of text represented in a way that was natural to Zwei, and a set of other annotations, which included specifically a set of keywords (to make the segments easier to find than just searching them all for text matches) and some typed links that allowed them to be connected together.

Regarding links: For example, you could have a SUMMARIZES link from one segment to a list of 3 other segments, and then a SUMMARIZED-BY link back from each of those segments to the summary segment. Or if the segments contained code, you could have a link that established a requirement that one segment be executed before another in some for-execution/evaluation ordering that might need to be conjured out of such partial-order information. And that linkage could be distinct from any of several possible reading orders that might be represented as links or might just be called up dynamically for editing.
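
As a rough sketch of that linking model, again in Python with invented names (only SUMMARIZES and SUMMARIZED-BY come from the description above; MUST-PRECEDE and everything else is hypothetical, not CREF's actual API): each segment carries text, keywords, and a table of typed links; inverse links are maintained automatically, and one execution order is conjured from the precedence links by a topological sort.

    # Illustrative only; not CREF's actual code or API.
    from collections import defaultdict, deque

    INVERSE = {"SUMMARIZES": "SUMMARIZED-BY", "SUMMARIZED-BY": "SUMMARIZES"}

    class Segment:
        def __init__(self, text, keywords=()):
            self.text = text
            self.keywords = set(keywords)    # for finding segments without full-text search
            self.links = defaultdict(set)    # link type -> set of target segments

    def link(src, link_type, dst):
        """Add a typed link and, where one is defined, its inverse."""
        src.links[link_type].add(dst)
        if link_type in INVERSE:
            dst.links[INVERSE[link_type]].add(src)

    def execution_order(segments):
        """One linearization consistent with the MUST-PRECEDE partial order."""
        indegree = {s: 0 for s in segments}
        for s in segments:
            for t in s.links["MUST-PRECEDE"]:
                indegree[t] += 1
        ready = deque(s for s in segments if indegree[s] == 0)
        order = []
        while ready:
            s = ready.popleft()
            order.append(s)
            for t in s.links["MUST-PRECEDE"]:
                indegree[t] -= 1
                if indegree[t] == 0:
                    ready.append(t)
        return order

    summary = Segment("Overview of parts 1-3.", keywords={"overview"})
    parts = [Segment(f"Part {i}") for i in (1, 2, 3)]
    for p in parts:
        link(summary, "SUMMARIZES", p)          # each part gets SUMMARIZED-BY back
    link(parts[1], "MUST-PRECEDE", parts[0])    # a fragment of a partial order
    print([s.text for s in execution_order(parts)])   # e.g. ['Part 2', 'Part 3', 'Part 1']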

In both cases, the code I developed continued to be used by the research teams I developed it for after I left the respective teams. So I can't speak to that in detail other than to say it happened. In neither case did the tool end up being used more broadly.

I probably still have the code for CREF from the time I worked on it, though it's been a long time since I tried to boot my MacIvory so who knows if it still loads. Such magnetic media was never expected to have this kind of lifetime, I think.

But I also have a demo of CREF where I took screenshots at intervals and hardcopied them and saved the hardcopy, and then much later scanned the hardcopy. That is not yet publicly available, though I have it in Google Slides. I'll hopefully make a video of that sometime, just for the historical record.

3/n

#CREF #LispM #hypertext #history

Open Risk (@openrisk)
2025-05-01

The focus of the early web was text (HTTP is the HyperText Transfer Protocol, after all). Text is powerful. You can share poetry with text. Searching text made Google a giant. Sharing snippets of text made Twitter a minor giant.

But once money chased eyeballs, the focus changed to visual media. Less cognitive load. Text requires literacy; images do not.

The early "lack of vision" was corrected with acquisitions. Google bought Youtube. Facebook bought Instagram. Finally TikTok came along.

Pulp fiction trumped poetry.

Digital Mark λ ☕️ 🕹 🙄 (@mdhughes@appdot.net)
2025-04-28

I found the note I left myself from just waking up:

Dreamed I was back in the '80s, writing a Gopher/Markdown-like system on the Atari 8-bit, with the line-oriented editor I wrote on TRS-80, not MEDIT. I was explaining heading levels to someone, and drawing ATASCII banner art for h1, inverse for h2, etc.

This wouldn't be too hard to really do (but really, use MEDIT). Atari DOS is fast enough to load pages on demand.
#dream #atari #retrocomputing #hypertext

Digital Mark λ ☕️ 🕹 🙄 (@mdhughes@appdot.net)
2025-04-27
The Medley Interlisp Project (@interlisp@fosstodon.org)
2025-04-13

In this videotape, recorded in 1986, Frank Halasz demonstrates the hypertext browsers and other card types of NoteCards, the hypermedia system he co-developed in Interlisp at Xerox PARC.

archive.org/details/Notecards_

#interlisp #hypertext #retrocomputing

Claus Atzenbeck 🇪🇺 (@clausatz@hci.social)
2025-04-07

Attention graduate students with a passion for #hypertext! We invite you to join the INTR/HT #SummerSchool at the ACM Hypertext Conference 2025 @ht in Chicago. This event is co-organized with my esteemed colleague Dene Grigar and supported by @sigweb. It's a great opportunity to network, learn, and grow in the field alongside leading experts. Apply by June 1. For more details, visit: ht.acm.org/ht2025/summer-schoo

HUMAN’25 Workshop (@HUMAN@hci.social)
2025-04-07

Exciting news! The Call for Papers #CfP for the #HUMAN25 Workshop on Human Factors in #Hypertext is now open. We invite researchers to submit their work to this event, part of the ACM Hypertext Conference #HT2025 @ht in Chicago. Looking forward to your contributions. Spread the word!
human.iisys.de/human25/call-fo

2025-03-31

The History of editing and publishing in web browsers - Techno Barje

Only WorldWideWeb, some versions of Netscape, and SeaMonkey ever supported editing. WWW was the only non-modal one; I think it was a worthy design pattern. SeaMonkey still maintains Netscape's Composer.

#browser, #text_editor, #hypertext

I have a Hack Club meeting after school today! The first meeting went okay, but I think this meeting will go a bit better. It's just an introduction to #HTML and #CSS, maybe some #javascript :)

#webdesign #web #hypertext

2025-03-20

👉 "Real-Time History: Engaging with #LivingArchives and Temporal Multiplicities." After a great start yesterday, the conference resumes with a workshop on web archives and the Archives Research Compute Hub by Karl Blumenthal.
Thrilled that I will be able to present my paper "Weaving Time: Hypertextual Historiography in the Age of Living Archives" later in the afternoon.

ℹ️ Check out the conference program: ghi-dc.org/events/event/date/r
#DigitalHistory #DH #Hypertext

2025-03-15

Having fun making some #animation for a video. Could/should I be doing this with vectors for automagic tweening -- yeah, maybe. But working traditionally is more fun.

This one is a compilation of clips for explaining the xanalogical #hypertext model, by the way.

Jukka Niiranen (@jukkan@mstdn.social)
2025-03-07

Someone on the algorithmic social media platforms was saying it's a "brave move" to post #zeroclick content that has no links.

I had to reply with a picture of the #Hypertext Wikipedia article, to demonstrate how these walled gardens have destroyed the fundamental idea behind the web - while still using HTML to render the content of their own service.

There's nothing "social" about a service that downranks user posts with links. We should have a different name for it than "social media".
