Self Wright


On the “unix philosophy”

There’s this talk: https://www.youtube.com/watch?v=3Ea3pkTCYx4

There’s this blogpost about an aspect of the talk: https://blog.deref.io/unix-and-microservice-platforms/

There’s this discussion on the blogpost about the talk: https://lobste.rs/s/mjo19d/unix_microservice_platforms#c_mleswe

And then there’s this post, about a comment (by @andyc, or as I think about him, “the oil-shell person”) about the blogpost about the talk, which, by being the third derivative, should be that much smaller.

So I’ll just highlight the reframing of the “unix philosophy” into this one-liner:

Semi-structured text streams, complemented by regexes/grammars to recover structure

And the contrast between

data-centric rather than code-centric

and

protocol-centric, not service-centric
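
To make that one-liner concrete, here’s a tiny sketch of my own (not from the talk or the comment): `ls -l`-style output is a semi-structured text stream, and a small regex downstream recovers the structure it needs.

    import re

    # A line of `ls -l`-style output: semi-structured text, no schema attached.
    line = "-rw-r--r--  1 alice  staff   4096 Nov 30 10:12 notes.txt"

    # The regex "recovers" the structure: permissions, owner, size, name.
    pattern = re.compile(
        r"^(?P<mode>\S+)\s+\d+\s+(?P<owner>\S+)\s+\S+\s+"
        r"(?P<size>\d+)\s+\w+\s+\d+\s+[\d:]+\s+(?P<name>.+)$"
    )

    m = pattern.match(line)
    if m:
        print(m.group("owner"), m.group("size"), m.group("name"))
        # -> alice 4096 notes.txt

The data (the text stream) is the stable interface; the structure lives in whatever grammar each consumer brings to it.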

My history with computers, Part 4: The early internet

Context

In my previous post I talked about how the realm of what was possible expanded when we got a better, faster computer … but it took a whole other leap with the first “on-line” experiences.

Quaint rumors

I think the first way of knowing anything about this was Internet for Dummies (probably this). Having literally no other point of reference, I read and re-read this.

It was wild.

http://thoughtcatalog.com/wp-content/uploads/2013/09/aol_v4-1.png

Part of it was about the various “walled gardens” that were the most popular options: CompuServe, America Online (or so the dummies book told me; we had to start with what was available at the time, a “text” connection with a national telecom provider).

Part of it was gobbledygook about setting up PPPoE settings with an ISP (as an aside, folks who actually ventured into all this without a technical background in those days must’ve been effing brave; there was a lot of stuff to configure back then, none of this “oh, is the WiFi on?”).

But all of it was about how cool it was to interact with people online.

First contact

I have a vague memory of this, but there was some sort of an “internet course” I signed up for (or rather, that my dad signed me up for). It was supposed to be a few days, an hour each, and was a bit dry, but in the end there was, yes, some time with an actual browser.

I’m sure there are millions who experienced it the first time this way: Netscape Navigator, the coarse-grained meteor logo with its brilliant flash … and then the page loads … what is this thing?!

If this sounds lame, well, I was lame, but this was also a genuinely rare experience at the time.

On-ramp to the information superhighway


The way magazines talked about this new thing was pretty funny too, in retrospect (and given where we’ve ended up, painfully idyllic). The world-wide web, the information superhighway, all kinds of phrases trying to describe what people thought about it, all of it optimistic.

Well, nearly all: I watched a talk by Neil Postman towards the end of the 90s; it was a devastating critique of the impact of television, and it sounds like an early warning today (if you like that sort of thing, byte-sized versions: 1, 2).

“Cyber cafes” sprang up like weeds, offering 30-minute slots to be online. Just imagine that: your entire web presence — not just your laptop or desktop, not just your smartphone or smart TV, everything! — limited to this tiny slot of time, not just per day, but per week! This was, for many people, the only chance to catch up on emails, chat, whatever.

https://images.techhive.com/images/article/2014/10/slide-4-100522664-orig.png

I used to go to a sort of computer class … think of it as an after-school activity … and there was time at the end, while I waited, when I would open the browser (the browser wars were swiftly over by now; Internet Explorer had already won, though I kept trying out new releases of Netscape Navigator (later Communicator) on our home computer) and just randomly go places.

My early web

I wish I could remember more, but I don’t, so here are a few initial forays that come to mind.

WWF (now the WWE)

(Where would I be without the Wayback machine, to remind me how things used to look?)

I wouldn’t watch a minute of this today, but back then I was about-to-stop-being-a-fan. So I printed out a t-shirt design and my mom (yep, I had a great family) actually copied it onto a real t-shirt with fabric dyes. Totally lame, but it felt epic.

X-files

I was a legit fan (maybe I still am, at some level … more on that later). So the official X-files website was the first one I devoured in depth-first fashion.

Then discovered fansites. Then discovered the shipping sites. And now you know more about me than you want to. Okay.

Geocities

Ooh, Geocities, such a 90s thing. A utopian take on having different tribes and communities (there was already such a diverse bunch) form clusters of home pages.

Obviously, I hung out at “Area 51”.

Someone tried re-inventing this recently with “Neocities”, but … you know, you can’t repeat the past.

(Update: someone made a mirror of the old site)

Hotmail

My first email account. Probably one of the first “email-as-a-service” offerings. At this point it might be possible to guess my first password.

https://api.time.com/wp-content/uploads/2014/03/hotmail.png

Yes, for a long time, this was the only password I had: there were no other places to “log in” to, and the computer at home was single-user!

(It was a whole six years before I switched to Gmail, but I shouldn’t jump ahead)

Web-rings

There weren’t blogs as yet, just stand-alone websites (this was when people actually wrote HTML! Think about that! People are capable of so much more, and yet …)

Non-web bits

Not every online activity was directly related to “surfing the web”; there was a bunch of things, like downloading themes and desktop backgrounds, that happened simply because being online made them easy to do.

Messaging

Yahoo Messenger!

MSN Messenger!

AOL Instant Messenger (AIM)! (yes, the one part of America Online that survived longest)

ICQ! (what a weird name, now that I think about it … and having to memorize numerical userIDs … sheesh)

I forget which came first, but I ended up using all of these extensively, and (naturally) all of these chats and contacts are now lost.

There is definitely some tradeoff between keeping records and throwing them away. While I’m apprehensive about having everything I do recorded these days, I also like coming across at least a few key images or emails from the past.

Gopher

I didn’t really use a lot of this (in fact, I only knew about it from the Dummies book), and it was really one of those “alternative routes” that never got taken, because “the internet” and “the web” became synonymous, and that was before “the web” became “2.0”.

Napster

Hoo boy, Napster. The way people got music — and also the way music really became globally available and accessible, in the days before Youtube/iTunes/Spotify/whatever.

Run software. Search. See results. Download. Wait.

https://upload.wikimedia.org/wikipedia/it/6/62/Napster_2.0_Beta_7_screenshot.png

I didn’t even have a fast internet connection in the beginning (it was measured in kbps), so I once waited a whole day to download one measly MP3.

I’ll tell you, it felt wonderful to listen to that one little MP3 over and over.

Yes, I know. It sounds … pathetic, now. No real takeaway, except perhaps that we value whatever we put in effort for, heh.

Transition

Time to stop, and hit publish, or this’ll never be done.

Next time? Dunno, maybe my first (and now that I think about it, also the last!) “personal desktop”.

My history with computers, Part 3

Old computers sometimes had a “Turbo” or “Boost” button to manually switch to a higher clock speed. Toggling this on and off could count as a valid game-playing strategy, if you needed to “speed past” obstacles, etc. Yes, it’s just as bizarre as it sounds.

Context

In the last decade (and a half, roughly), people have gotten used to a lot of niceties in our operating systems: smooth integration between different devices, nifty apps, wonderful cameras, and more — but not increases in speed.

It is hard to convey how different this aspect was in the 90s. Every year, sometimes twice a year, there were glossy magazine advertisements about faster computers.

A new computer

So around 1998, it was possible to buy a new computer with a CPU rated at 233 MHz. Two hundred thirty-three megahertz. It also had a fancy new operating system, the just-released Microsoft Windows 98 (ooh 😐).

There was an actual sound card (something that isn’t thought of as a “pluggable thing” any more), which meant it was possible to get speakers to play actual sound (today if you buy speakers, it’s as a part of your room, not as a part of your computer).

The display (or “monitor”; heh, no one uses that word any more) had color, there was a mouse that could be plugged in, and this computer didn’t just have a floppy drive, but a new optical medium: the CD-ROM.

Aside: relative speed evolution

The first computer at our home, mentioned in the earlier post (late 1994), had a CPU with a clock speed of 33 MHz. Thirty-three megahertz (btw, this seemed huge to me at the time: “so many calculations in a single second!”)

My first ever personal desktop (mid-2002, more on this later) had a single-core CPU with a clock speed of 1 GHz. One thousand megahertz, or a 30x increase.

My first MacBook (mid-2008) had a dual-core CPU rated at 2.0 GHz. Two thousand megahertz, or a 2x increase.

My current MacBook Pro (mid-2019) has an 8-core CPU rated at 2.3 GHz. Two thousand three hundred megahertz.

My iPhone (early 2018) uses the “A11 Bionic”, with a maximum clock rate of 2.39 GHz. Two thousand three hundred ninety megahertz.

You can imagine the graph in your head.

Programming

QBasic was gone, to be replaced with … Visual Basic. This allowed a lot of experimentation with simple forms, but I didn’t really have any ideas on what to do with it, so I let it lapse.

There was also Turbo C++, which, despite the name, was a reasonably popular language environment (from Borland, which is not a name most would recognize today, but at the time, it was … like JetBrains plus Visual Studio, and more). There weren’t a lot of materials to learn from, though I remember at least being able to copy in a few examples, and so on.

Still later, around 2000-ish, I got some game programming books, and really liked learning from them, since it was very straightforward to build something with DirectX (never mind) in C++.

Nothing comparable to the vast tools and materials available to kids these days, but … good times.

Apps

There were a whole bunch of computer magazines that came with CDs, containing free trials of all sorts of stuff, and it was something to look forward to every month — to try out whatever was new that month: install it, fool around with it, then delete.

I wish I had pictures or notes or anything, but I don’t, so this vagueness will have to do.

I do remember the first time I used Microsoft Flight Simulator (which, btw, is making a big comeback). Even with that relatively poor graphical resolution, a 3-d experience of this sort was magical.

Something else that stands out: Microsoft Encarta. It was one of the first digital encyclopedias, and they did a really good job of it. There were audio and video clips, and lots of articles to read and switch between.

The pros and cons versus paper should’ve been apparent already: the content was beautiful, though I can’t imagine someone spending hours and hours interacting with Encarta the way I can imagine someone spending that time with a paper version (but maybe that’s just me).

Aside: the time of Microsoft

In case it isn’t obvious: yes, this was the decade of Microsoft domination, something that people have no gut feeling for anymore — but twenty years ago, before big-Google, big-Facebook, big-Amazon, big-Twitter, big-Netflix, big-Apple, there was only big-Microsoft.

Games

This was the highlight of my time with the machine 🙂

FPS

First of all, I finally had something to play Quake on (the minimum RAM requirement was 8MB; our earlier computer had 4MB, while this one had 64MB. As a fun exercise, try to find out how much a single tab in your browser is using right now).

Quake was made by the same company (id Software) that made Wolfenstein (which we had played so much of on our earlier computer) and Doom (which I missed out on for whatever reason). Again, this is something hard to convey now, but these were iconic first-person shooter games, among the very first ones, in fact … which is probably why they were so popular, even though they seem quite boring by today’s standards.

Anyway, Quake was just the beginning; this machine was in a sweet spot to play most of whatever came out, and the free apps on the CDs in the monthly computer magazines were usually games.

Aside: single-player gaming

Although much in games remains the same over the decades (apart from the massive improvement in their visual appearance), something that is very different is the experience of the “un-connected”, solitary game.

Most games today involve other players, either directly or indirectly (through comparison on a leaderboard, etc). I think the only equivalents of “playing something alone, immersed in the world” are certain mobile games, like Monument Valley: you own the game, you play the game, and no one else really knows how you played; the experience is yours alone.

Early on, everything was like this (although it was quite common for friends to sit alongside you as you played, so there’s that).

RPG

Just as id Software dominated the first-person shooter genre in the first part of the decade, another company, Blizzard Entertainment (of the two, still going strong!), dominated role-playing and strategy games.

All that’s needed to convey this are a few names: Diablo, Starcraft, Warcraft, each of which I probably spent hundreds and hundreds of hours on.

It’s worth mentioning that there was a lot of competition early on, and the only reason these stand out is that they balanced a lot of factors very well, designing the details very, very well.

Note 1: If I had to pick a favorite, it would be Starcraft.

Note 2: But, more on all this some other time, especially an account of one game that was insignificant but that I liked: Microsoft Urban Assault.

Aside: storage media

Going from a floppy disk that stored 1.44MB to a CD-ROM that stored 650MB was a big change, one that really opened up a whole variety of new, rich content.

DVDs and Blu-Rays went an order of magnitude higher each, but have been used for richer and more detailed versions of existing content and not newer kinds of content (in my opinion).

There were other stops along the way, and not just alternatives like the HD DVD that no one remembers. For a while it was quite common to have a “Zip drive”, awkwardly in between a floppy and a CD.

(Of course, a new laptop today has neither of these)

Aside: man and machine

I should point out something: I had a certain sort of … affection … for the first computer we had (I remember being upset and crying once (embarrassing, right?) when it didn’t start and appeared to be broken), in a way that I didn’t have for the second one (which was “just a machine”), or any of the countless ones (laptops, desktops, tablets, phones, watches, appliances) since.

It might be a pets vs cattle thing, dunno.

Transition

I haven’t really thought through the episodic nature of this series, which means there isn’t any plan of having “equal chunks”. But yes, we’ll plod along steadily. Next time: the internet (!)

My history with computers, Part 2: “Mid 90s”

Context

Picking up where I left off last time: we’d got our first new computer, a first set of simple games, and a first operating system (ye olde DOS).

QBasic

As I’ve mentioned, the only programming environment, programming interface, programming tool, programming editor I knew about or used, was the version of QBasic that came bundled with MS-DOS.

This might sound pathetic now, but felt very cool to me back then. I hadn’t experienced “programmable calculators”, so this was also the only “programmable thing”.

This beige box was the only thing around that could compute at all. All this sounds redundant now that little computers are ubiquitous, but it’s hard to give an idea of how unique one of these was.

(Like this, except in black and white)

Everything was one giant file, with jumps all over the place. Incredibly messy, and IMO a terrible way to learn how to write anything (so much to unlearn later, sigh). But still, a great way to get started MAKING stuff.

Using it

Just to give an idea, here’s how a sample interaction might go (say I wanted to make some random patterns or drawing, in my “computer time” that day):

  • The computer is off; I turn it on (the idea of leaving it on all the time would have been crazy!)

  • It boots into the C:\> prompt, pretty quickly (no logins, single-user!)

  • I run QBASIC, see the screen above

  • I write some small fragment like

    SCREEN 1                        ' switch to CGA graphics mode (320x200)
    LINE (35, 50)-(100, 150), , B   ' draw a box (the B flag) with these opposite corners

  • I hit Run, and see a small rectangle (in the beginning, coming from LOGO, this is most of what I did)

  • I press a key to get back into the editor, make some changes, repeat.

Game programming books

The installation of QBasic came bundled with an impressive game that gave the impression that a lot was possible (the code was very spaghetti-ized, but, I suppose, relatively okay).

(Like this, except in black and white)

At the time there were also a lot of books with games — by which I mean they had programs that you could type out and run (remember how I said everything was “just one giant file”?)

I was fortunate my mother could bring these from the library of the school she worked at, and I learnt a lot (okay, questionable, but it definitely felt good) from reading through them.

I was also fortunate that my younger brother (with great patience!) would read aloud each line for me to type in, so we could play the game later.

One of these was a Star Trek game, the longest of the bunch, that we painstakingly typed in over several days, slowly making progress page by page (sounds ridiculous as I type this, but … yeah, you had to be there), and inevitably I must have made some small typo somewhere, that was then … impossible to track down, so we were quite dejected when it didn’t work.

However, the opening theme did play, with its signature tune, and that felt good. I should note that there were no speakers, so there was no real sound, just the PC speaker beeping at different frequencies.

I did try to read the source code, and it was essentially a “first RPG”, with an energy level that got depleted by actions, movement in a grid in space, ability to fire phasers and photon torpedoes, all that good stuff.

(I googled it, and … of course there’s a Youtube video for this: https://www.youtube.com/watch?v=gLKw4AU4KHU)

Windows

I had seen Windows machines at school, when I finally got a chance to play with one of the two computers that had a mouse. All I did was use MS Paint, because moving the cursor, pointing and clicking, and seeing dots of color appear was such a novel experience!

Finally, one day, my dad brought a stack of floppies (because that’s how anything got installed) for Windows 3.11. It required baby-sitting the whole install experience, removing and inserting each floppy in the precise order in which they were labelled.

GUI

Now, after starting up the computer, at the C:\> prompt, it was possible to run win, and then see a brief splash screen.

(Like this, except in black and white)

After which there would be, well, windows on the screen.

(Again: this, except in black and white)

Things were getting exciting.

Remember though, still no mouse (that would come a few months later). So we got really good at keyboard shortcuts for selecting windows, moving, resizing, whatever.

Apps

A big boost came from getting (in another long bunch of floppies) Microsoft Office (!)

Each app took about a minute to load up, but we could now use (still one at a time) Word, Excel, Powerpoint, and Access!

I remember Access had a dialect called Access Basic that I read the manual for, tried to use, and failed with. I wanted to make a “home library system”, but spent all my time prettifying the frontend and never quite got the flow of entering books and looking them up to work properly.

I vaguely remember repeating this painful install process a couple times, and using Norton Disk Doctor and a bunch of other tools that simply have no analogue today.

Games

Bundled games included Minesweeper and Solitaire, though I never quite liked them all that much.

At this time, Windows was very much (until Windows 95, I think) a “shell within DOS”, so it was quite normal to play some games within Windows, then exit Windows and play some games within DOS.

As far as I can remember (again, I wish I had written something down), there were better games in DOS, especially the ones my brother got from his friends.

One game stands out: Wolfenstein. Again, this was black-and-white, without sound, but … it was the first-ever FPS. Let me repeat that: the first-ever first-person shooter (for me, at least). All I had seen were flat, 2-D games, maybe a few platformers, and … here was something totally different.

(Again, like this, except black-and-white, and no sound)

In today’s world of bloated “built on Electron” apps, it’s nearly impossible to appreciate the skill that went into creating an experience like this on a limited computing platform such as we had.

I do remember a visual Chess game on Windows, perhaps Chessmaster?

Transition

Time to stop again, so I can come back and write again later. Next time: an upgrade.

My history with computers, Part 1: “Early 90s”

An early computer using the Intel Pentium

Context

As hinted at in a previous post, I thought I’d go over a history of my interaction with computers. This was … harder than I thought, mostly because I barely remember anything. If only I had pictures, notes or journals, sigh (so, I picked a generically representative image here above). Still, it’s a useful exercise to try to recount all this, so I will do my bit.

Chronologically, this post is set between roughly 1994 and 1997.

Images

Before I physically saw a computer or used one, I knew of its existence through magazines and reference books [1].

I do remember one occasion where someone I knew had bought a big and expensive computer for their home, and I got to see it, and was very impressed by the (at the time, very, very novel) color graphics display and mouse.

School

I was fortunate to have a “computer lab” at my school. It was populated by what would today be utter relics, not notable enough to feature even in a museum [2].

Yet with no context, and not having ever physically touched anything else, they were, of course, quite marvelous to me. They were oddballs even then, one-offs — and they had to be, because hardware was incredibly expensive then! — but I do recall a good number of them being BBC Micros [3] (and there might have been a solitary Sinclair [4]).

A few features to note here, common to each:

  • a floppy drive, the sole mode of connection to the outer world (no network of any sort), and also the sole means of storage (yes, no hard drives either!)
  • Basic [5] as the sole programming language (okay, it was Logo before that for a while)
  • black-and-white raster graphics on a roughly 14-inch screen. Yep.

Thinking back, I can accept the floppy drive (noisy and slow, and this was the older 5-1/4” jumbo drive, btw), but thinking of how anything to do with “real programming” was limited to Basic makes me tear my hair out. It made it so hard to imagine how anything else was made.

Home

It was a big deal, then, when we got a computer of our own at home. This was very expensive at the time, and I’m fortunate to have parents who spent money on this as opposed to almost anything else for themselves.

So one day we had a shiny [6] new 386 [7].

It had 4MB of RAM [8], and a 256MB hard drive [9], along with a 14-inch black-and-white raster display.

It ran MS-DOS 6.22 [10] and came pre-installed with … yes, Basic [11].

Over time, we got “office suite” applications, and I felt very accomplished as I learnt Lotus 1-2-3 [12], dBase IV [13], and WordStar [14].

There were early games at this point (e.g. Arkanoid [15], Dave [16]), which my brother and I played in isolation, though later on he brought cooler games (e.g. Commander Keen [17]) from his friends.

Transition

I’m going to stop here, because I could go on and on otherwise, but also because this was fun to write and if I actually “get this out”, I can actually write “the rest” too.


  1. Which were pretty good for the time, btw; remember, this was before the widespread advent of “the web”, e.g. this Time-Life series ↩︎
  2. I’m thinking, for example, of the Computer History Museum ↩︎
  3. I think of them as the big hulking Raspberry Pis of that era ↩︎
  4. maybe this one ↩︎
  5. Or rather, GW-BASIC ↩︎
  6. I used to keep taking off and putting on the dust cover on it. Seriously. ↩︎
  7. At the time, I remember the “range” of computers being roughly defined by Intel chip generations. So: 186, 286, 386, 486, and 586 a.k.a. the “Pentium”. This sounds silly now, the equivalent of people deciding whether to buy a Kaby, Coffee, Comet or Cooper Lake today ↩︎
  8. I remember thinking, wow, 4 million sounds so big! ↩︎
  9. A big upgrade from the floppy-disk-only machines I had seen earlier ↩︎
  10. I remember the version because I read the manual (cringe) front-to-back a couple of times ↩︎
  11. Or rather, QBASIC ↩︎
  12. It was an early instance of what is today called a “killer app” ↩︎
  13. I found an old manual (!) that shows what it looked like … and I was fascinated/horrified to see that its newsgroups are still active ↩︎
  14. Hey, George R. R. Martin still uses it ↩︎
  15. I always felt this was a crazy name for a glorified pinball machine ↩︎
  16. You can play this in your browser today! ↩︎
  17. I don’t know which version, maybe this one ↩︎

Monthly Curations: November 2019

What the Apollo Guidance Computer looked like

The “trinity of computation”

I never thought of it this way:

Logic tells us what propositions exist (what sorts of thoughts we wish to express) and what constitutes a proof (how we can communicate our thoughts to others). Languages (in the sense of programming) tells us what types exist (what computational phenomena we wish to express) and what constitutes a program (how we can give rise to that phenomenon). Categories tell us what structures exist (what mathematical models we have to work with) and what constitutes a mapping between them (how they relate to one another). In this sense all three have ontological force; they codify what is, not how to describe what is already given to us.

In this sense they are foundational; if we suppose that they are merely descriptive, we would be left with the question of where these previously given concepts arise, leading us back again to foundations.

Memristance is futile. Not.

I came across this Wired article recently, and what I read sounded too science-fiction-y to be true, so I decided to go to the source, and found this video (see below) by a researcher at HP; it turns out to be both true and “science-fiction-y”.

We are used to thinking in terms of standard circuit elements — resistors, capacitors, inductors. One establishes a relationship between voltage and current, the second between voltage and charge, and the third between magnetic flux and current.

Now it never occurred to me to really think about it this way (it’s one of those things that’s only obvious in hindsight), but there is a missing piece of symmetry here.

Look at that list again, and it might jump out at you that, among current, voltage, charge, and magnetic flux, the variables are related in pairs, with the exception of charge and magnetic flux. Seeing this, it seems reasonable to speculate on another circuit element that should relate precisely those two. And indeed someone did, about forty years ago, and named the missing piece the memristor.
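
Spelled out (this is just the textbook form of that symmetry argument, in my own notation, with R, C, L, M as the element constants):

    dv = R · di    (resistor:  voltage ↔ current)
    dq = C · dv    (capacitor: charge  ↔ voltage)
    dφ = L · di    (inductor:  flux    ↔ current)
    dφ = M · dq    (memristor: flux    ↔ charge, the missing pairing)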

Now, I should acknowledge that there is a bit of controversy over whether what HP Labs claims to have discovered really matches up with this idea, so we’ll just have to wait a few years to test these claims, since the first commercial applications of this technology won’t be out for another five years at least.

But let’s continue. One of the observations made in the video linked above is that memristance obeys an inverse-square law: the tinier the dimensions, the greater the observed effect. Which also means this is something that belongs purely in a chip, and not something you’d be putting on a breadboard any time soon.

The most exciting property, though, is that its behavior in the future depends on its past. So it is both a logic component and a storage component. You could build a dense cluster of these things and determine which parts perform which function, in a configurable sense, much like an FPGA on steroids.

I used to think (again, only because this is what I was taught) that the fundamental logic component was the NAND gate — but this turns out not to be the whole story. Instead, if we treat the interaction between input A and input/output B, expressed using memristors, as an IMP (material implication) gate, then we can construct a NAND gate out of these.
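
The Boolean identity underneath is easy to check for yourself; here’s a tiny sketch (plain Python, just the logic; nothing memristor-specific, and not HP’s actual circuit):

    # IMP(p, q) is material implication: (not p) or q.
    # NAND falls out of two IMPs plus a constant FALSE, since
    # NAND(p, q) = IMP(p, IMP(q, False)):
    #   q -> False is just NOT q, and then p -> (not q) is NAND(p, q).

    def imp(p: bool, q: bool) -> bool:
        """Material implication: p -> q."""
        return (not p) or q

    def nand(p: bool, q: bool) -> bool:
        return imp(p, imp(q, False))

    # Verify against the truth table.
    for p in (False, True):
        for q in (False, True):
            assert nand(p, q) == (not (p and q))

And since NAND alone is functionally complete, IMP plus a constant FALSE gets you every Boolean function.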

Further, multiple layers of these memristors can be stacked above a conventional CMOS layout, and densely packed together, leading to unprecedented on-chip memory, perhaps on the order of petabits!

So, how would this change things? It would certainly deprecate the SRAM → DRAM → hard drive pyramid of caches we have right now; we would not only have an ocean of universal memory, but our processing elements would be floating on this ocean, entirely commingled with it!

We certainly won’t need to deal with the Von Neumann bottleneck any more …

Comparative Latencies

It is usually hard to get an idea of how much the time taken for various fundamental operations varies. It does matter, but it’s hard to feel it viscerally (time intervals in nanoseconds, microseconds, and milliseconds aren’t really felt in the same way).

I came across this idea of representing the smallest number as a single second and everything else in terms of it, so that the relationship between the numbers is represented in more of a human scale, which results in the following table:

[table: operation latencies, rescaled so the fastest operation reads as one second]
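
As a sketch of that rescaling (a minimal Python version; the latency figures below are the commonly cited ballpark numbers, not my measurements, and the exact values in the original table may differ):

    # Rescale approximate operation latencies so the fastest one reads as 1 second.
    # The figures are the usual ballpark numbers people quote; treat them as
    # illustrative, not measured.

    latencies_ns = {
        "L1 cache reference":       0.5,
        "L2 cache reference":       7,
        "Main memory reference":    100,
        "SSD random read":          150_000,
        "Disk seek":                10_000_000,
        "Packet round trip, US-EU": 150_000_000,
    }

    fastest = min(latencies_ns.values())
    for op, ns in latencies_ns.items():
        seconds = ns / fastest  # human scale: fastest op == 1 second
        print(f"{op:28s} {seconds:>12,.0f} s")

On that scale, main memory is a few minutes, an SSD read is a few days, and a transatlantic packet is nearly a decade.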

I wanted to show this in a chart, but it never shows more than the last two values, so I had to break it down into a series of smaller charts (I could use a log scale to represent them too, but that would’ve again lessened the impact you feel when seeing these numbers side by side)

Mostly similar, as long as it’s on the chip

This is a big jump!

Tremendous jump! Main memory latency is like 0.1% of the SSD access time!

… so imagine how much slower disk is, compared to RAM.

And if that was slow, you should check out the internet …

… exceeded only by an operation on the “machine”; this is finally when we can feel seconds ticking by.


Obviously, the worst thing you can do is try to restart the “real” system.

Lispium?

What if you did the following:

  • Take a Chromebook
  • Modify the Chromium build to run SBCL within it
  • Create Lisp bindings to the internal surface, so that all UI elements can be created and manipulated from within the Lisp image
  • Allow downloading, compiling, and running arbitrary Lisp code
  • Make one of the tabs always a REPL
  • Add a caching filesystem that would persist part or all of the image

… might this create a modern-day Lisp machine? Maybe.

Did I miss anything obvious here? If not, this sounds doable in a few years.

I’m lazy, so if you like this idea (I’m sure there’s a guaranteed niche market for these machines), go ahead and throw it onto Kickstarter. Or something.

