Drewloid's Blog

Archive for November 2011

I love following some of the blogs, even the Apple bloggers.  I've discovered a few who are semi-conscious.  Marco Arment has definitely impressed me more than once, and now he's popped out a gem.


For someone to basically admit that he is a recovering fanboy is pretty amazing, if you ask me.  When I was in my early 20s I suffered no such delusion that my way was the only way, at least when using computers!  But then, I was studying programming and I had the crazy idea that a personal computer should work the way its person wanted it to work.  The fanboys who wanted every other option to disappear were, and still are, endlessly irritating to me.  Although by now it is simply an annoyance, and I have to accept that that is the way some people are.  Some people are so immature and so low on self-esteem that they cannot accept that our world of amazing diversity, which allows people to make highly individualized choices, in no way belittles their fanboy choices.

So how far have we come from the invention of the personal computer to now, and how personal are they?  Pretty far, but we really have a very very long way to go.  And I’ll suggest that how you measure our progress might be very much colored by whether your idea of the first “personal” computer is an Altair or an Apple ][.

Consider the hammer.  An elegant and simple tool, it has quite a few uses, and anyone who makes things pretty much can't live without at least one.  It mashes things.  That's what it does.  It is the source of the adage "if all you have is a hammer, then every problem looks like a nail."  Of course, a hammer is useful for oh so much more than pounding nails.  I just watched the movie "Thor" a couple nights ago, and apparently Thor's hammer is useful for kicking up ice shards and shooting them at Frost Giants, or for flying through the air into a monster's mouth and out the back of its head to kill it.  That is pretty darn versatile for a hunk of metal with a handle.  The hammer is an extension of the hand, which is simply the most versatile physical tool there is.

The computer, as has been said before, is an extension of the brain.  So it has to be able to do a lot of stuff.  And it does.  And new applications are being discovered all the time to amaze and delight us, even a cynic such as myself.  I mean, really, what’s up with this internet thing?  Once I have email, netnews, and ftp, what more could I need?  Apparently quite a lot.

I love Apple, and I love Microsoft.  I used to hate Apple, because I could not stand how constraining the Mac was and also that I could not afford one.  But even when I could afford one I wasn’t too excited about it.  Partly because I did not want to associate with the fanboys, and partly because I had no interest in buying a computer that was going to tell me how I could use it.  It’s a computer, and I’ll use it any damn way I like!  And I am sure as hell not going to buy a computer where I have to *ask* to get my floppy disc back!

By comparison, Microsoft was freedom, Microsoft was flexibility.  DOS and Windows enabled a world in which anybody could build any computer any way they liked.  This is an absolutely monumental engineering challenge, by the way, and the robustness and flexibility of Windows is absolutely one of the greatest engineering successes ever.  The vast ecosystem of capabilities enabled by Windows is a testament to a diversity that Apple simply cannot match in any way.  Which is perfectly fine.

Microsoft makes software for people who care about getting things done at a reasonable cost.  A vast, diverse variety of things.  Arment is calling this out in his latest blog entry.  Finally one of the fanboys is explicitly acknowledging that what he uses his computer for and the way he uses it, the way it is an extension of *his* brain, is not going to work for someone else.  He has learned the lesson of the IT department.  The IT department does not offer general computing services.  The IT department offers business support services and provides a computer infrastructure that delivers those services.  So if you want to do something outside the boundaries of what they have narrowly defined, too bad.  The IT department doesn't do that.  If you want something extra, take it up with the CIO.

The irony, to me, of the Apple fanboy's complaint is that they don't like Microsoft because it leads to IT department thinking.  Except that it is Apple which has the IT department thinking, while Microsoft enables whatever kind of computer infrastructure freedom you want.  Apple delivers appliances.  They work the way they work, and that is that.  And like an IT department, they promise that the end-user experiences they enable will actually work.  Those are called Service Level Commitments, by the way.

Microsoft enables quite a variety of SLCs, and Apple, being much more like an IT provider, offers one.  Apple is finding an amazing level of profitability by hitting a sweet spot of SLC, which they make just a little bigger every year while growing into appliance markets.  Apple will probably never have more than 25% of any general market in the long run, but since they take the 25% most profitable part of the market they don't particularly care.  Microsoft goes for the rest of the market, so they derive profitability from volume.  They are both part of a much larger picture and they each have their niche, regardless of what anyone wants to say about monopoly power.

This is how it has to be.  Because you simply can’t take a one size fits all approach with computers.  No single specific computing system is going to serve everyone’s needs equally.  That is simply impossible.  If you have any trouble accepting this, go to Home Depot or Lowe’s or Menard’s and have a look at the selection of hammers, which is but a small fraction of all the many and various kinds of hammers that are out there in the world.  It’s a hard thing at the end of a lever for mashing and there is an astonishing degree of diversity.

If anyone ever suggested that there needs to be only one kind of hammer … well, I can't even imagine anyone being that silly.  So anyone who suggests there needs to be only one kind of computer, well, that is even more silly.  Because if there needs to be that much diversity for an extension of the hand, then how much diversity do we need for an extension of our brain?

Like the Macalope, I seem to be endlessly fascinated by the way the various tech industry blogger/pundits are aggressive about being asleep. And by asleep, I mean the opposite of awake as explained by Pema Chodron when she says: "We are one blink of an eye away from being fully awake." I'm just a poor engineer who has gone over to the dark side of marketing, trying to understand the world, and who only recently decided to accept that a large number of people are simply pretending to be asleep.

But the pundits definitely test me and my patience. Like this guy:

In the second-to-last paragraph Mr. Thurrott seems to be contradicting a quote from Larry Tesler. Really? Were you *there*, Paul? No, you weren't, so STFU.

I don't know much about this guy and I don't read his site; I just came across this article while reading one of the Macalope's posts. It is truly mind-bogglingly ballsy to suggest a different reading of history than someone who was actually on the ground at the time.

And then he caps off his little thought with the observation that it apparently took Apple 10 years to switch from PowerPC to Intel, so the planning for this change must have started in 1996. He somehow forgot that the Mac started out on the Motorola 68K line, and a quick look at the Wikipedia article on the Mac says that the first PowerPC Macs appeared in 1994. Somehow I don't think Apple was seriously wanting to bolt to Intel processors only 2 years into the PowerPC architecture, a design that had Motorola as a partner.

Much more likely is that Apple's engineers, Tesler among them, simply wanted an operating system that was more portable between processor architectures. To whatever extent it is true that Apple considered using Windows NT (I am amazed that Thurrott is actually suggesting this; NT was nowhere near capable of satisfying Apple's requirements back in those days), they were probably just looking at an operating system that ran on x86, PowerPC, and MIPS architectures and thinking that would be a nice thing to have. As near as I can tell, by 1996 Solaris was running on PowerPC, x86, and SPARC. And according to the Wiki article on NeXTSTEP, it was running on the 68K, x86, SPARC, and PA-RISC. The portability of NeXTSTEP is completely unsurprising to anyone who knows jack about operating systems, since it was derived from Mach. Finally, (according to the Wiki article) BeOS was running only on the PowerPC in 1996, with x86 support not shipping until 1998.

So as Thurrott contradicts Tesler, he doesn't appear to let a little thing like facts get in the way. He says straight up "every single one of them – all four of them – ran on Intel chips."  I'll admit I wasn't paying much attention to Be back in those days, so I have to trust the Wiki article, but I'll accept that BeOS wasn't running on Intel platforms in 1996.

I just don’t get these guys, any of them. The Macalope seems to have quite a bit of fun reaming them, so it’s fun to read those rants.

I see this as part of a bigger picture of a bunch of people who want to collect ad money by attempting to appear that they know something and have something of value to add to the discussion about the tech industry.

But when the pundidiots start trying to rewrite history in contradiction to the statements of people who were actually present and might know at least a little of what they are talking about, then I'm thinking there is a problem. They sure as hell can't have anything of interest to write in predicting the future if they can't even be bothered to have straight facts about the past on hand. I suppose it is easier to look like you can predict the future if you can change the past to support your expectations.

My best guess is that with this post Mr. Thurrott is attempting to suggest that Apple took ten years of planning and technical work to transition from PowerPC chips to Intel chips. I can only guess his motivation. Perhaps he is trying to suggest some kind of weakness in Apple? I can't imagine. What I can say with some confidence, as an engineer who has actually ported a major operating system, is that porting Mac OS X (again, since it is based on Mach) to a new processor is a one-year job at the outside. Maybe another year to work out all the bugs of integrating the hardware and the software. The vastly bigger job is the migration of your ecosystem, which would take Apple on the order of 1-3 years.

Now if you happen to be talking about a migration from Intel to ARM, that’s a whole lot easier since Microsoft is already driving the migration of a major operating system from the one to the other.

I wonder how Mr. Thurrott will rewrite that history 10 years from now? I wonder if anyone will still be reading his blather so he can make money from ad placements? I’m kind of hoping the answer to the latter is “nobody”.

It’s a tried and true tradition in software development (as in many areas of endeavor) to have “war stories”, those tales that people who do things tell of their most challenging and difficult times on the job.

I had lunch with a friend yesterday who is doing hardware enabling, and I was chatting with a co-worker today who does a similar job. It brought me back to those days when I was doing hardware bring-up.

As much as I dislike having to deal with other people's bugs when I am writing code on some platform, the difficulties often pale in comparison to when the other guy's code is the hardware. You can't run the debugger down into the hardware a lot of the time. You need special hardware to debug the hardware. Sometimes it is a probe, or a bus or protocol analyzer, or some variety of special thing that costs a lot and that you can't get budget for, assuming the ability to probe the hardware is even available.

These are the kinds of bugs that take the longest, are the hardest to find, and exercise your problem solving skills well beyond what happens with software. With software you can usually do something with software to figure it out. With hardware … well, you need more hardware – if you can get it. And the most annoying part is that the other software guys don’t grasp just how difficult it is. They haven’t done it so they don’t get it and they think you are taking too long. It’s just another bug in the bug list to them. They don’t understand it is an order of magnitude more challenging.

So war stories about fixing hardware bugs, for those of us, the special few who dare to make hardware do what we want, are the best war stories. Except only people who have hardware war stories can appreciate them. And then when a purely software guy floats his best war story, all we can do is smirk. Because my best software war stories are just so much less impressive (to me) than my hardware war stories.

Kind of makes me wonder what kind of war stories the silicon process guys have when they are dealing with the problems of semiconductor physics and things that happen in a nanometer and smaller. That’s gotta be fun 😉
