Java is in the top 3, with Scala and Clojure not even in the top 10. Given the near power-law distribution of measured language usage, Java must still be above a 90% share of JVM language use (and that's probably a gross underestimate).
Not even close. Depending on how you measure this, Clojure might be in the top 20 (it is if you believe the Redmonk rankings), but it's hard to see it making the top 10 in this decade. As with the previous prediction, there's just way too much inertia here. Breaking into the top 10 means joining the ranks of Java, JS, PHP, Python, Ruby, C, C++, and C#. Clojure just isn't boring enough. C# was able to sneak in by pretending to be boring, but Clojure's got no hope of doing that, and there isn't really another Dylan on the horizon.
This seems basically right, at least for most values of “you”.
Wikis, newsgroups, mailing lists, bulletin boards, forums, commentable blogs — they're all bullshit. Home pages are bullshit. People want to socialize, and create content, and compete lightly with each other at different things, and learn things, and be entertained: all in the same place, all from their couch. Whoever solves this — i.e. whoever creates AOL for real people, or whatever the heck this thing turns out to be — is going to be really, really rich.
Facebook was founded the year that was written. Zuckerberg is indeed really, really rich.
Five years from Steve's prediction would have been 2009. Although the iPhone was released in 2007, it was a while before sales really took off. In 2009, the majority of phones were feature phones, and Android was barely off the ground.
Note that this graph only runs until 2013; if you graph things up to 2015 on a linear scale, sales are so low in 2009 that you basically can't even see what's going on.
It's hard to tell if this is correct (Steve, feel free to let me know), but it seems true in spirit. Google has more and more services that they charge for, and they're even experimenting with letting people pay to avoid seeing ads.
If you include tablets, Apple hit #1 in the market by 2010, but I don't think they do better than all of the old workhorses combined. Again, this seems to underestimate the effect of dark matter, in this case people buying laptops for boring reasons (e.g., corporate buyers and normal folks who want something below Apple's price range).
More of a throwaway witticism than a prediction, but sure.
That's a pretty good set of predictions for 2004. With the exception of the bit about Lisp, all of the predictions seem directionally correct; the misses mostly come from underestimating the sheer amount of inertia a young/new solution has to overcome in order to take over.
Steve also has a number of posts that aren't explicitly about predictions but nevertheless, written way back in 2004, make pretty solid predictions about how things are today. There's It's Not Software, which was years ahead of its time on how people write “software”: how writing server apps is really different from writing shrinkwrap software in a way that obsoletes a lot of previously solid advice (like Joel's dictum against rewrites), and what service-oriented architectures look like. The Google at Delphi (again from 2004) correctly predicts the importance of ML and AI, as well as Google's very heavy investment in ML. And there's an old interview where he predicts that "web application programming is gradually going to become the most important client-side programming out there. I think it will mostly obsolete all other client-side toolkits: GTK, Java Swing/SWT, Qt, and of course all the platform-specific ones like Cocoa and Win32/MFC". A number of Steve's internal Google blog posts also make interesting predictions, but AFAIK those are confidential. Of course all these things seem obvious in retrospect, but that's just part of Steve's plan to pass as a normal human being.
In a relatively recent post, Steve throws Jeff Bezos under the bus, exposing him as one of a number of “hyper-intelligent aliens with a tangential interest in human affairs”. While the crowd focuses on Jeff, Steve is able to sneak out the back. But we're onto you, Steve.
Thanks to Leah Hanson, Chris Ball, Mindy Preston, and Paul Gross for comments/corrections.
When asked about a past prediction of his, Peter Thiel commented that writing is dangerous and mentioned that a professor once told him that writing a book is more dangerous than having a child -- you can always disown a child, but there's nothing you can do to disown a book.
The only prediction I can recall publicly making is that, despite the hype, ARM isn't going to completely crush Intel in the near future; I've been on the record saying that for at least five years, but it seems so obvious that it's not even worth calling it a prediction. Then again, this was a minority opinion up until pretty recently, so maybe it's not that obvious.
I've also correctly predicted the failure of a number of chip startups, but since the vast majority of startups fail, that's expected. Predicting successes is much more interesting, and my record there is decidedly mixed. Based purely on who was involved, I thought that SiByte, Alchemy, and PA Semi were good bets. Of those, SiByte was a solid success, Alchemy didn't work out, and PA Semi was maybe break-even.