
20 Years

Posted on March 16, 2021


Previously on Locally Sourced: It’s been quite a while. I’ve been very focused on finishing the big update to the draft of the book. Which I have done, giving me a bit of a breather until the technical reviews come in. I’m hoping this means I’ll be getting this newsletter out more frequently.

I was looking back over the past newsletters, which I do from time to time to prevent repeating myself, and I realized that I had promised two posts from talks that I felt never got a wide audience and only delivered one of them.

Here’s the only video of the talk, from a Chicago Ruby meetup in March 2019. It’s about the history of web development.

It’s been 20 years, anything change much?

My first web project that was larger than one developer kicked off in March 2000. I think about it all the time.

In current terms, we’d likely call it a CMS. The project was an initial collaboration between six overlapping subsidiary companies of a larger company. The goal was to market their products directly to their customers, organized by product function rather than by subsidiary.

The site was mostly product data, which included text descriptions but also manuals and video. There was something of a newsfeed with industry-related articles. Product data arrived via batch import; I have no recollection of how the newsfeed was managed, but I assume we provided admin forms of some kind. It was intended to eventually be international and also to integrate with a sales site that was being developed in parallel.

We probably couldn’t quite do it as a WordPress site today; I think the client would have insisted on a custom tool, and the data requirements were a little idiosyncratic, but still… that’s the ballpark.

In 2000, this site took five engineers, one full-time designer, a part-time DBA, and a most-time project manager three months. And not three easy months: it ended with a few weeks of 70-80 hours each. It was not fun. I do have some good stories, maybe another time. (I’m not identifying the client because even 20 years later, their lawyers still scare me…)

My point here is that this took about 20-25 person-months, and I’m pretty sure that a reasonably competent Rails team could do something similar right now in no more than a tenth of the time. It’s hard to compare directly, of course, and we can quibble all day, but it feels about right to me that similar projects that were possible in 2000 take about an order of magnitude less time.

Really, an order of magnitude?

If you believe my handwaving, that’s an impressive improvement in software delivery, and it raises a lot of questions. Not least of which is “do you believe my handwaving?”

I can argue that the actual speed improvement is more impressive than 10x. I can also argue the speed improvement is less impressive than 10x. I contain multitudes.

Here’s the argument for more impressive. The modern site, even built in a fraction of the time, would do many things better than the old site with the same basic functionality, thanks to things that would come along almost for free because the tooling has gotten better.

The modern site would have access to more fonts and prettier text options, and it would look better on different-sized screens. A modern site would be way easier to deploy and also easier to collaborate on as a team (no git in 2000…). It’s almost certainly going to be easier to modify the modern site going forward (we eventually spent months internationalizing it in 2000). So even the order of magnitude improvement could understate the actual improvement.

Plus, there are a whole set of things you can do in a browser in 2021 that you couldn’t do in 2000 without inventing your own browser, which would have been prohibitive.

I have three arguments that this improvement is less impressive than it seems, and that I’m just slinging numbers around.

  • The improvement is potentially a statistical illusion caused by the time frame; if I picked a different time frame, the improvement might look less impressive. In particular, you could argue that tooling improvements mostly happened by, say, 2007, after which the low-hanging fruit was gone and things haven’t gotten much better since.
  • While it’s possible to build sites faster now, it’s irrelevant, because expectations for a web app have increased at the same time: you couldn’t actually ship the 2021 2.5-month version, because it’d be missing features users expect.

I’ll get to the third argument in a second.

I about half-believe both of these, and I think they both come down to what kind of value our apps are supposed to provide. In 2000, the customer information app took 25 person-months; in 2020 it takes maybe 2 or 3; in 1980 it would have taken forever, because you’d have had to build out the infrastructure first. In 1980, though, businesses still provided information to customers; it just would have involved print catalogues and probably would have been more expensive.

And yes, in some areas expectations have changed, but in some areas they haven’t, and you can clearly build useful, impressive things very fast. (I look at Hey, World as a recent existence proof…) Some things are clearly more expensive now, though, security being the one that comes to mind; our concerns in 2000 were much more modest.

What’s gotten better?

The next question is what has changed over the 20 years. And the interesting thing to me is what hasn’t changed. The 2000 app used HTML with code interleaved with it (we used ColdFusion, but ERB is still basically the same structure). It used a relational database. It even used some JavaScript. We’re not, generally, writing web apps that are shipped as binaries or which are structured like windowed desktop apps. (Mobile apps are a different story).
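To make the “same structure” point concrete, here’s a minimal ERB sketch. The @products variable and its fields are hypothetical, not from the actual 2000 app, but a ColdFusion template of that era interleaved logic and markup in exactly this way, just with different delimiters:

    <h1>Products</h1>
    <ul>
      <%# Loop over a (hypothetical) collection passed in by the controller %>
      <% @products.each do |product| %>
        <li>
          <strong><%= product.name %></strong>:
          <%= product.description %>
        </li>
      <% end %>
    </ul>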

What’s clear to me is that this is very much a story of compound interest from gradually improving tools. Once upon a time, deployment was a huge source of friction in Rails apps; Pragmatic had a whole book about it in 2008. Bit by bit the tooling improved, and now you can, if you want, have an entirely no-friction deploy. (If you want, you can also have a deploy process that is complex beyond 2008’s wildest imagination.)

Multiply that by everything, and I think that’s basically the answer. It’s a huge triumph for the concept of open source tooling; I don’t think we could have gotten close to this without what open source is at its best. (Granting that open source at its worst can be pretty awful.)

And yet…

And yet, while I do think that we’ve gotten a more-or-less order of magnitude improvement in web tooling over 20 years, let’s compare to hardware.

21 years after my first web project, I’m typing this on an Apple M1 MacBook Pro. The computer I used for the 2000 project… I don’t have the exact specs, but it was a single-core 250 MHz processor, more or less. My new computer has about 100 times the hard drive and 250 times the RAM; the actual speed is hard to compare, but a factor of 50 seems easily plausible, and you could talk me into a factor of 100 for some tasks.

All that is to say that, while we’ve gotten faster at delivering software in the last 20 years, the hardware has gotten faster at a much more extreme rate. (You can see this in other ways: when I started as a developer, developer tools were among the most processor-intensive things I ran; now they are decidedly not.)

The Mother of All Analogies

Here are two stories.

Over my entire adult lifetime, cars in the US have gotten safer, dramatically so. And car injuries have gone down, but not as much as you might expect. What’s also happened is that speed limits have gone up, so some of the safety potential has gone into increasing speed rather than into decreasing danger.

What might all that hardware speed be going toward, if not to software delivery speed? Well, clearly, some of it is going to graphics and higher-level languages, and is being sort of imperfectly converted into software speed.

But there’s another way to look at it.

In 1968, Doug Engelbart gave what is sometimes called the Mother of All Demos. It’s online; you can watch it.

This demo showed the use of a mouse to manipulate text, clicking on hypertext links, video conferencing, remote collaborative editing, and version control. All of which was so far beyond what was thought possible in 1968 that he got a standing ovation at the end. And all of which would be ubiquitous 40-some-odd years later.

Engelbart’s demo was on an SDS-940, the specs of which don’t matter except for one: in 1966, they cost $125,000, which is somewhere on the order of $1,000,000 in today’s dollars. About 60 of those computers were sold. On top of which, his team spent about 5 years writing software, so the total cost was about $5,000,000 for a tool that could at most be used on 60 devices worldwide, and probably way fewer than that in practice.

Today, there are literally billions of devices capable of all of these tasks and way more, and you can buy one for well under 100 dollars if you want. Even 20 years ago, that wasn’t true; the number of people who had internet access capable of video conferencing was still pretty low.

That’s where a lot of our hardware speed has gone: into increasing access. A lot of our tooling and hardware gains have also made web development easier for more people.

I have no idea what’s going to happen in the next 20 years, or even the next 20 minutes, but I hope we use both the efficiency gains and the access gains to build great and useful things.





