Drewloid's Blog

Author Archive

I love following some of the blogs, even the Apple bloggers.  I’ve discovered a few who are semi-conscious.  Marco Arment has impressed me more than once, and now he’s popped out a gem.


For someone to basically admit that he is a recovering fanboy is pretty amazing if you ask me.  When I was in my early 20s I suffered no such delusions that my way was the only way, at least when using computers!  But then, I was studying programming and I had the crazy idea that a personal computer should work the way its person wanted it to work.  The fanboys who wanted every other option to disappear were, and still are, endlessly irritating to me, although now it is simply an annoyance, and I have to accept that that is the way some people are.  Some people are so immature and so low on self-esteem that they cannot accept that our world of amazing diversity, which allows people to make highly individualized choices, in no way belittles their fanboy choices.

So how far have we come from the invention of the personal computer to now, and how personal are they?  Pretty far, but we really have a very very long way to go.  And I’ll suggest that how you measure our progress might be very much colored by whether your idea of the first “personal” computer is an Altair or an Apple ][.

Consider the hammer.  An elegant and simple tool, it has quite a few uses, and anyone who makes things pretty much can’t live without at least one.  It mashes things.  That’s what it does.  It is the source of the adage “if all you have is a hammer, then every problem looks like a nail.”  Of course, a hammer is useful for oh so much more than pounding nails.  I just watched the movie “Thor” a couple nights ago, and apparently Thor’s hammer is useful for kicking up ice shards and shooting them at Frost Giants, or for flying through the air into a monster’s mouth and out the back of its head to kill it.  That is pretty darn versatile for a hunk of metal with a handle.  The hammer is an extension of the hand, which is simply the most versatile physical tool there is.

The computer, as has been said before, is an extension of the brain.  So it has to be able to do a lot of stuff.  And it does.  And new applications are being discovered all the time to amaze and delight us, even a cynic such as myself.  I mean, really, what’s up with this internet thing?  Once I have email, netnews, and ftp, what more could I need?  Apparently quite a lot.

I love Apple, and I love Microsoft.  I used to hate Apple, because I could not stand how constraining the Mac was and also that I could not afford one.  But even when I could afford one I wasn’t too excited about it.  Partly because I did not want to associate with the fanboys, and partly because I had no interest in buying a computer that was going to tell me how I could use it.  It’s a computer, and I’ll use it any damn way I like!  And I am sure as hell not going to buy a computer where I have to *ask* to get my floppy disc back!

By comparison, Microsoft was freedom, Microsoft was flexibility.  DOS and Windows enabled a world in which anybody could build any computer any way they liked.  This is an absolutely monumental engineering challenge, by the way, and the robustness and flexibility of Windows is one of the greatest engineering successes ever.  The vast ecosystem of capabilities enabled by Windows is a testament to diversity that Apple simply cannot match in any way.  Which is perfectly fine.

Microsoft makes software for people who care about getting things done at a reasonable cost.  A vast diverse variety of things.  Arment is calling this out in his latest blog entry.  Finally one of the fanboys is actually explicitly acknowledging that what he uses his computer for and the way he uses it, the way it is an extension of *his* brain, is not going to work for someone else.  He has learned the lesson of the IT department.  The IT department does not offer general computing services.  The IT department offers business support services and provides a computer infrastructure that delivers those services.  So if you want to do something outside the boundaries of what they have narrowly defined, too bad.  The IT department doesn’t do that.  If you want something extra, take it up with the CIO.

The irony, to me, of the complaint of the Apple fanboy is that they don’t like Microsoft because it leads to IT department thinking.  Except that it is Apple which has the IT department thinking, and Microsoft enables whatever kind of computer infrastructure freedom you want.  Apple delivers appliances.  They work the way they work, and that is that.  And like an IT department, they promise to make the end-user experiences they enable actually work.  Those are called Service Level Commitments, by the way.

Microsoft enables quite a variety of SLCs, and Apple, being much more like an IT provider, offers one.  Apple is finding an amazing level of profitability by hitting a sweet spot of SLC, which they make just a little bigger every year as they grow into appliance markets.  Apple will probably never have more than 25% of any general market in the long run, but since they take the 25% most profitable part of the market they don’t particularly care.  Microsoft goes for the rest of the market, so they derive profitability from volume.  They are both part of a much larger picture and they each have their niche, regardless of what anyone wants to say about monopoly power.

This is how it has to be.  Because you simply can’t take a one size fits all approach with computers.  No single specific computing system is going to serve everyone’s needs equally.  That is simply impossible.  If you have any trouble accepting this, go to Home Depot or Lowe’s or Menard’s and have a look at the selection of hammers, which is but a small fraction of all the many and various kinds of hammers that are out there in the world.  It’s a hard thing at the end of a lever for mashing and there is an astonishing degree of diversity.

If anyone ever suggested that there needs to be only one kind of hammer … well, I can’t even imagine anyone being that silly.  So anyone who suggests there needs to be only one kind of computer, well, that is even more silly.  Because if there needs to be that much diversity for an extension of the hand, then how much diversity do we need for an extension of our brain?

Like the Macalope, I seem to be endlessly fascinated by the way the various tech industry blogger/pundits are aggressive about being asleep. And by asleep, I mean the opposite of awake as explained by Pema Chodron when she says: “We are one blink of an eye away from being fully awake.” I’m just a poor engineer who has gone over to the dark side of marketing, trying to understand the world, and who only recently decided to accept that a large number of people are simply pretending to be asleep.

But the pundits definitely test me and my patience. Like this guy:

In the second to last paragraph Mr. Thurrott seems to be contradicting a quote from Larry Tesler. Really? Were you *there*, Paul? No, you weren’t, so STFU.

I don’t know much about this guy and I don’t read his site, I just came across this article when I was reading one of the Macalope’s posts. It is truly mind-bogglingly ballsy to suggest a different reading of history than someone who was actually on the ground at the time.

And then he caps off his little thought with the observation that it apparently took Apple 10 years to switch to Intel from PowerPC, and the planning for this change must have started in 1996. He somehow forgot that the Mac started out on the Motorola 68K line, and a quick look at the Wikipedia article on the Mac says that the first PowerPC Macs appeared in 1994. Somehow I don’t think Apple was seriously wanting to bolt to Intel processors only 2 years into the PowerPC architecture, a design that had Motorola as a partner.

Much more likely is that Apple’s engineers, Tesler among them, simply wanted to have an operating system that was more portable between processor architectures. To whatever extent it is true that Apple considered using Windows NT (I am amazed that Thurrott is actually suggesting this; NT was nowhere near capable of satisfying Apple’s requirements back in those days), they were probably just looking at an operating system that was running on x86, PowerPC, and MIPS architectures and thinking that would be a nice thing to have. As near as I can tell, by 1996 Solaris was running on PowerPC, x86, and SPARC. And according to the Wiki article on NeXTSTEP, it was running on the 68K, x86, SPARC, and PA-RISC. The portability of NeXTSTEP is completely unsurprising to anyone who knows jack about operating systems, since it was derived from Mach. Finally, (according to the Wiki article) BeOS was running only on the PowerPC in 1996, with x86 support not shipping until 1998.

So as Thurrott is contradicting Tesler, he doesn’t appear to let a little thing like facts get in the way. He says straight up “every single one of them – all four of them – ran on Intel chips.” I’ll admit I wasn’t really paying attention too much to Be back in those days, so I have to trust the Wiki article, and I’ll accept that BeOS wasn’t running on Intel platforms in 1996.

I just don’t get these guys, any of them. The Macalope seems to have quite a bit of fun reaming them, so it’s fun to read those rants.

I see this as part of a bigger picture of a bunch of people who want to collect ad money by attempting to appear as if they know something and have something of value to add to the discussion about the tech industry.

But when the pundidiots start trying to rewrite history in contradiction to the statements of people who were actually present and might know at least a little of what they are talking about, then I’m thinking there is a problem. They sure as hell can’t have anything of interest to write in predicting the future if they can’t even be bothered to have straight facts about the past on hand. I suppose it is easier to look like you can predict the future if you can change the past to support your expectations.

My best guess is that with this post Mr. Thurrott is attempting to suggest that Apple took ten years of planning and technical work to transition from PowerPC chips to Intel chips. I can only guess his motivation. Perhaps he is trying to suggest some kind of weakness in Apple? I can’t imagine. What I can say with some confidence as an engineer who has actually ported a major operating system is that porting MacOS (again, since it is based on Mach) to a new processor is a 1 year job on the outside. Maybe another year to work out all the bugs of integrating the hardware and the software. The vastly bigger job is the migration of your ecosystem, which would take Apple on the order of 1-3 years.

Now if you happen to be talking about a migration from Intel to ARM, that’s a whole lot easier since Microsoft is already driving the migration of a major operating system from the one to the other.

I wonder how Mr. Thurrott will rewrite that history 10 years from now? I wonder if anyone will still be reading his blather so he can make money from ad placements? I’m kind of hoping the answer to the latter is “nobody”.

It’s a tried and true tradition in software development (as in many areas of endeavor) to have “war stories”, those tales that people who do things tell of their most challenging and difficult times on the job.

I had lunch with a friend yesterday who is doing hardware enabling, and I was chatting with a coworker today who does a similar job. It brought me back to those days when I was doing hardware bring-up.

As much as I dislike having to deal with other people’s bugs when I am writing code on some platform, the difficulties often pale in comparison to when the other guy’s code is the hardware. You can’t run the debugger down into the hardware a lot of the time. You need special hardware to debug the hardware. Sometimes it is a probe, or a bus or protocol analyzer, or some variety of special thing that costs a lot and that you can’t get budget for, if the ability to probe the hardware is even available.

These are the kinds of bugs that take the longest, are the hardest to find, and exercise your problem solving skills well beyond what happens with software. With software you can usually do something with software to figure it out. With hardware … well, you need more hardware – if you can get it. And the most annoying part is that the other software guys don’t grasp just how difficult it is. They haven’t done it so they don’t get it and they think you are taking too long. It’s just another bug in the bug list to them. They don’t understand it is an order of magnitude more challenging.

So war stories about fixing hardware bugs, for those of us, the special few who dare to play with making hardware do what we want, are the best war stories. Except only people who have hardware war stories can appreciate them. And then when a purely software guy floats his best war story, all we can do is smirk. Because my best software war stories are just so much less impressive (to me) than my hardware war stories.

Kind of makes me wonder what kind of war stories the silicon process guys have when they are dealing with the problems of semiconductor physics and things that happen in a nanometer and smaller. That’s gotta be fun 😉

I’m thinking the hardware on this little Transformer is pretty decent, especially with the add-on keyboard.  But the typing experience has a bunch of key lag and it’s totally harshing my love.

It’s a nice try, but I have to suspect that while it could be an Android problem, it is more likely an OEM integration problem.

I’m not in love with the iPad either, and I doubt its Bluetooth keyboard has this problem.  Maybe it’s a browser problem.  Regardless, I don’t know why there needs to be a 1-second key lag every few keystrokes, or why the cursor keeps jumping to the title box.  I must be brushing the mousepad.  Once again, PC OEMs miss the little things.

Seriously, people who do stuff that has a major impact in the world aren’t all that common.  And in a week, 3 of them have passed.  Wow.

First, Steve Jobs trundles off.  Not completely unexpected I suppose, but it’s a pretty big deal.  It’s also kind of a big deal that there are a bunch of boneheads out there who don’t grasp what he did.  He invented the personal computer.  Sure, Ed Roberts at MITS built the Altair 8800, and that was darned important, but don’t kid yourself, the Altair looked a lot like a Data General Nova and wasn’t all that friendly.  Jobs and Woz built the Apple ][, and it was the full meal deal in a nice box.  It wasn’t scary looking.  Personal, get it?  Of course, Jobs was especially great because he wasn’t a one hit wonder.  His work will be influential for at least the next century.  Apple ][, Mac (ok, Jef Raskin gets a lot of credit for that one), iPod, iPhone, iPad.  The computing paradigm of the next 10 years is all right there with the release of iCloud, and in this business, 10 years is a really really really long time.

And I just read two more big important people left today.

Robert W. Galvin died this morning.  Ok, I’ll admit, Motorola is kind of a joke today, but when Bob Galvin was in charge this was possibly the most innovative company in America and the world.  Most of what we take for granted today in the form of radio and wireless came from Motorola.  Let’s get real.  This guy did not shy away from risk, and his company’s greatest achievement is probably the invention of the cellular phone.  Innovation in the form of the Iridium satellite phone system is what ultimately took them down.  That kind of risk takes balls.  But that was just part of it.  He had crazy ideas like profit sharing plans for employees during the McCarthy era.  His company didn’t invent the microprocessor like Intel did, but they made some pretty great ones.  The Motorola 6800 processor begat the MOS Technology 6502, which landed in the Apple ][, and the original Macintosh had the Motorola 68000 in it.  And Motorola under Galvin was one of the very first American companies to adopt the Total Quality Movement principles that other Americans had come up with and Japan used in the 70’s to redefine our notion of quality.  The quote at the end of the New York Times obit is “The absolutely distinguishing quality of a leader is that a leader takes us elsewhere.”

And as if that is not enough, Dennis Ritchie died this past weekend.  I guess when you are an uber-programmer then that isn’t quite as big news.  He only invented the Unix operating system and the C programming language.  These are the technologies behind a very significant portion of all computer cycles executed today.  At heart, everything Apple ships is running Unix and uses a variant of the C programming language.  The original Apple ][ used the 6502 processor, which traces lineage back to the PDP-11 (via the 6800), where Unix and C got a lot of their early development.  When DEC withered away, the Moto 68000 family was the place for Unix.  One of the major models of writing computer software we use today came directly from this guy’s work, and tons of newer technologies descend from his thinking.  Open Source can trace its roots back to Dennis Ritchie, as can Linux.

These three people led the invention of a really vast and huge amount of the stuff we take for granted today, and their separate innovations are surprisingly related.  Galvin provided the hardware technologies behind Steve Jobs’s successes.  Ritchie provided the software (open up a terminal window on your Mac and that’s Unix).  Jobs provided the creativity and vision to focus on the experience of the technology rather than on technology for the sake of technology.

Innovation and leadership on the scale these three people provided doesn’t happen all the time; it is quite rare.  And I’ll note one more thing.  Bob Galvin was 89, Dennis Ritchie was 70.  Steve Jobs was only 56.  Steve was nowhere near done.

Perhaps I am simply weak, but I simply cannot stop enjoying and laughing at the iPhone competitors and the many pundits who keep second guessing Apple’s moves.  You know, the strategy that is leading them to make all the profits in the phone business, the “tablet” (actually iPad) business, and in the PC business.

So when I see Andy Lees’s comments in today’s Seattle Times, I laugh a bit more.


I think his comments are generally reasonable and make sense for someone in his position.  It certainly would not make sense for him to lend any credibility to Apple’s strategy since that option is not available to him, and besides, it would not be possible for him to win trying to play Apple’s game.

But in the answer to the first question – “some people are making comparisons of pace.”  Well, that’s just bust a gut funny.


Soviet Russia made a lot of progress on a lot of fronts real fast in the 50’s – easy to do when you are way behind and there is someone showing the way to help you get caught up.  But don’t imagine that you can possibly continue the “pace” once you get caught up.

And of course, Lees is smart enough to say it is someone else saying this stuff – “I’m not saying it, so-and-so is saying it.”  That’s a standard language pattern for deflecting challenges and criticism.  So I suppose it is the same pundits and critics who continue to entertain me who are saying it.

Microsoft product launches were never this entertaining.  Although watching the stock go up by leaps and bounds when I had a lot of it (and it was doing that) was pretty good.

I’m very sad that Steve Jobs has passed, and I send my love and condolences to his family.  And I think that at this time, rather than focus on being sad, I will focus on my gratitude for the things that Steve did and how we have benefitted from the things he did while he was here.

Thank you
