The Internet Archive's last-ditch effort to save itself
A lost lawsuit, a flimsy appeal, and misleading public statements... things aren't looking good for the Internet's archivist.
April 24, 2024

On April 19th, The Internet Archive filed the final brief in their appeal of the "Hachette v. Internet Archive" lawsuit (in which judgment was handed down against The Internet Archive last year).

What is curious is that this final brief fails -- almost completely -- to reasonably address the core issues of the lawsuit.  What's more, the public statements that followed from The Internet Archive appear to have been crafted to drum up public sympathy by misrepresenting the core of the case itself.

Which suggests that The Internet Archive is very much aware that they are likely to lose this appeal.

After a careful reading of the existing public documents relating to this case... it truly is difficult to come to any other conclusion.

The Internet Archive does some critically important work by archiving, and indexing, a wide variety of culturally significant material (from webpages to decades-old magazine articles).  In this work, they help to preserve history.  An extremely noble, and valuable, endeavor.  Which makes the likelihood of this legal defeat all the more unfortunate.

What is "Hachette v. Internet Archive"? 

Here's the short-short version of this lawsuit:

The Internet Archive created a program they called "Controlled Digital Lending" (CDL) -- where a physical book is scanned, turned into a digital file, and that digital file is then "loaned" out to people on the Internet.  In 2020, The Internet Archive removed what few restrictions existed with this Digital Lending program, allowing an unlimited number of people to download the digital copy of a book.
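To make that distinction concrete, here is a minimal, purely illustrative sketch (hypothetical Python, not anything drawn from the Archive's actual systems) of the difference between lending capped at the number of physical copies owned and the unrestricted lending introduced in 2020:

# A hypothetical sketch (not the Internet Archive's actual software) of the
# core policy difference: whether simultaneous digital loans are capped at
# the number of physical copies the lender owns.

class DigitalLendingLibrary:
    def __init__(self, physical_copies_owned):
        self.physical_copies_owned = physical_copies_owned
        self.active_loans = 0
        self.unrestricted = False  # the March 2020 "National Emergency Library" change

    def checkout(self):
        # Under "controlled" lending, a loan is granted only while fewer
        # copies are lent out than are physically owned.
        if self.unrestricted or self.active_loans < self.physical_copies_owned:
            self.active_loans += 1
            return True
        return False  # otherwise the borrower waits for a copy to be returned

    def checkin(self):
        if self.active_loans > 0:
            self.active_loans -= 1

book = DigitalLendingLibrary(physical_copies_owned=1)
print(book.checkout())  # True  -- the one owned copy is now on loan
print(book.checkout())  # False -- a second borrower must wait

book.unrestricted = True   # the cap removed, as happened in 2020
print(book.checkout())     # True -- any number of simultaneous downloads

Note, though, that the publishers' complaint (and the eventual ruling) concerns the making and distribution of the digital copy itself, not merely how many people may borrow it at once.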

The result was a group of publishers filing the "Hachette v. Internet Archive" lawsuit.  That lawsuit focused on two key complaints:

  1. The books were "digitized" (converted from physical to digital form) -- and distributed -- without the permission of the copyright holders (publishers, authors, etc.).
  2. The Internet Archive received monetary donations (and other monetary rewards) as a result of freely distributing said copyrighted material.  Again, without permission of the copyright holders.  Effectively making the Internet Archive's CDL a commercial enterprise for the distribution of what is best described as "pirated material".

That lawsuit was decided, against The Internet Archive, in 2023 -- with the judge declaring that "no case or legal principle supports" their defense of "Fair Use".

That judgment was appealed by The Internet Archive.  Which brings us to today, and their final defense (in theory).

What is the final defense of The Internet Archive?

Let's take a look at the final brief in The Internet Archive's bid to appeal this ruling.

In true Internet Archive form, a PDF of the final brief in their appeal has been posted to Archive.org.

The general defense of The Internet Archive is fairly simple: The Internet Archive's "Controlled Digital Lending" falls under "Fair Use".  And, therefore, is legal.

Let's look at two of the key arguments within the brief... and the issues with them.

Not "For Anyone to Read"

"Controlled digital lending is not equivalent to posting an ebook online for anyone to read"

This argument -- part of the brief's Introduction -- is quite a strange defense to make.

The "Controlled Digital Lending" program, starting in March of 2020, literally posted a massive book archive "online for anyone to read".  This was branded the "National Emergency Library".

Good intentions aside, the Internet Archive is now attempting to claim that they did not do... the exact thing that they proudly did (they even issued press releases about how they did it).

As such, I don't see a judge being swayed by this (poorly thought out) argument.

"Because of the Huge Investment"

"... because of the huge investment required to operate a legally compliant controlled lending system and the controls defining the practice, finding fair use here would not trigger any of the doomsday consequences for rightsholders that Publishers and their amici claim to fear."

Did you follow that?

The argument here is roughly as follows:

"It costs a lot of money to make, and distribute, digital copies of books without the permission of the copyright holder...  therefore it should be legal for The Internet Archive to do it."

An absolutely fascinating defense.  "Someone else might not be able to commit this crime, so we should be allowed to do it" is one of the weirdest defenses I have ever heard.

Again, I doubt the judge in this case is likely to be convinced by this logic.

There are many other arguments made within this final brief -- in total, 32 pages worth of arguments.  But none were any more convincing -- from a logical perspective -- than the two presented here.  In fact, most of the arguments tended to be entirely unrelated to the core lawsuit and judgment.

The Court of Public Opinion

Let's be honest: The Internet Archive looks destined to lose this court battle.  They lost once, and their appeal is, to put it mildly, weak.

Maybe you and I are on the side of The Internet Archive.  Maybe we are such big fans of Archive.org that we want to come to their defense.

But feelings don't matter here.  Only facts.  And the facts are simple.  The Archive's actions and statements (and questionable legal defense) have all but ensured a loss in this case.

So... what happens next?

What do you do when you have a profitable enterprise (bringing in between $20 and $30 million per year) that is on the verge of a potentially devastating legal ruling which could put you out of business?

Why, you turn to the court of public opinion, of course!

And you spin.  Spin, spin, spin.  Spin like the wind!

Here is a statement from Brewster Kahle, founder of The Internet Archive, who is working to frame this as a fight for the rights of Libraries:

"Resolving this should be easy—just sell ebooks to libraries so we can own, preserve and lend them to one person at a time. This is a battle for the soul of libraries in the digital age."

A battle for the soul of libraries!  Woah!  The soul?!

That's an intense statement -- clearly crafted to elicit an emotional response.  To whip people up.

But take another look at the rest of that statement.  The Internet Archive founder says that resolving this case "should be easy".  And he provides a simple, easy-to-follow solution:

"just sell ebooks to libraries so we can own, preserve and lend them to one person at a time"

Go ahead.  Read that again.  At first it makes total sense... until you realize that it has almost nothing to do with this specific case.

Let's ignore the "one person at a time" statement, which is a well-established lie (the Internet Archive proudly distributed digital copies of physical books to anyone who wanted them, not "one at a time").

But take a look at this proposed resolution... note that it has very little to do with the actual case.  The case is about the digitizing of physical books, and distributing those digital copies without permission of the copyright holder.  This proposed resolution is about... selling eBooks to lenders.

Yes.  Both have to do with eBooks.  And, yes, both have to do with lending eBooks.

But that is where the similarities end.  And the differences, in this case, are absolutely critical.

Let's take a look at the actual ruling -- which The Internet Archive is attempting to appeal:

"At bottom, [the Internet Archive’s] fair use defense rests on the notion that lawfully acquiring a copyrighted print book entitles the recipient to make an unauthorized copy and distribute it in place of the print book, so long as it does not simultaneously lend the print book.  But no case or legal principle supports that notion. Every authority points the other direction."

The Internet Archive's publicly proposed resolution does not address this ruling at all.  Which means that, when talking to the public, The Internet Archive is being dishonest about this case.

But they are using flowery language -- "battle for the soul of libraries" -- so they'll likely manage to convince many people that they're telling the truth and representing the facts of the case fairly and honestly.  Even if they are not.

There Are Important Disagreements Here

None of which is to say that the points which The Internet Archive is making... are necessarily wrong.

From the announcement of their appeal, the Archive states the following:

"By restricting libraries’ ability to lend the books they own digitally, the publishers’ license-only business model and litigation strategies perpetuate inequality in access to knowledge."

While this statement is designed to evoke specific feelings and responses -- among specific political demographics (see: "perpetuate inequality") -- there is an underlying set of issues here that are worth thinking about.

  • Is it important that libraries be able to lend official digital editions of books?
  • Should publishers, authors, and other copyright holders be forced to supply digital versions of their written works to libraries?
  • If digital works, borrowed from a library, are then copied and distributed more than the rights allow... who is ultimately responsible for that?  The library?  The creator of the software system which facilitated the lending?  Nobody at all?
  • Should Libraries or Publishers be able to censor or modify digital works... or should a published digital work be maintained as it is at time of publication?  (This issue comes up a lot when talking about censorship and revisions of works.)

These are legitimate questions.  And, while the answers may appear obvious, there truly are distinct disagreements among publishers, authors, and libraries.

Some of these issues are raised by The Internet Archive, BattleForLibraries.com, and others.

The "Battle for Libraries" campaign

But none of these questions -- not one -- are part of the ruling in "Hachette v. Internet Archive".

The question that has been answered in this case is simply:

  • If you buy physical media (such as a book), can that media be digitized and distributed on the Internet (without authorization or notification of the copyright owner)?

And the answer is, thus far, a resounding... "No".

The Can of Worms

What happens if the judge chooses to uphold the existing judgment against The Internet Archive?

A number of things seem possible (with some seeming like a downright certainty).

  • Publishers, authors, and copyright holders of works distributed by The Internet Archive may choose to seek damages.  Which could put The Internet Archive in a precarious financial position (to say the least).
  • The Internet Archive may be forced to remove other content of questionable copyright.  Including software, video, and audio archives.
  • Other archival projects may now come under increased scrutiny... thus making it riskier to archive and distribute various types of material.
  • And, of course, The Internet Archive could attempt to appeal the case ever higher.  Which may be tricky.

Then again... The Internet Archive could win this appeal.

Unlikely.  But, hey, weirder things have happened.
