The Mac at 30: The last laugh

I bought my first Macintosh thirty years ago this month, within days of the original Mac going on sale. It was my first personal computer purchase. I had used Apple IIs and even Ataris where I worked, but I had never owned a computer. Until January 1984.

I had been deliberating for several months prior to January, trying to decide what computer to get. Should it be an Apple IIe, an IBM PC, or this Macintosh thing that was coming? I decided to wait and check out the Mac. I was glad I did. After playing with one at a computer store for all of about two minutes, I was sold. I bought a Mac that day.

The only mystery to me at the time was why everyone in the market for a personal computer did not instantly come to the same conclusion. Why didn’t IBM and Microsoft simply go out of business by the end of the year? For me, the comparison was as if you were back in the days of the earliest mobile phones (the ones that were about the size of a shoebox, which were coincidentally the models available around 1984) and suddenly a company came out with the equivalent of a fully functional iPhone 5S — and the reaction of most consumers was something like “Eh! That iPhone is just a toy. You can’t make real phone calls with it. I’ll stick with my Motorola DynaTAC.”

Yes, the original Mac was that much of a leap forward. Unless you were there back in 1984, it’s hard to imagine how startlingly radical it all was. Heck, I was there and it’s still hard for me to imagine. The Mac featured a graphical display navigated by a mouse; it had an intuitive Finder desktop metaphor, WYSIWYG fonts and the ability to create pictures with the likes of MacPaint. No other computer had anything close to this (except perhaps for Apple’s own Lisa computer, which cost 4X as much). The rest of the bunch were all still stuck using a command line.

The rational part of my brain understood that the original Mac had significant limitations. With only 128K of memory, a single floppy drive and virtually no application software, it clearly wasn’t ready to replace existing PCs. But the emotional part of me would have none of that. At the very least, you had to be rooting for the Mac to succeed, even if you weren’t ready to buy one. Or so I thought.

As you likely know, my expectations did not translate into reality. Many people were not rooting for the Mac. At least not at first. The Mac did not become an overnight blockbuster — despite generally rave reviews. Numerous obstacles, some quite huge, blocked Apple’s (and Mac’s) path to a successful future.

But the story has a happy ending. Apple’s vision for the Mac — and my initial reaction to it — were eventually vindicated. Within less than a decade, with Microsoft “translating” the Mac OS into its Windows software, every personal computer in existence was using a mouse and a Mac-like graphical user interface. Apple had won the war, even if they almost died in doing so.

But Apple didn’t die. Indeed, of all the companies making personal computers back in 1984, Apple alone is still in the business today. And of all the original computer model names, only the Mac is still in use.

It may not be a tactful, socially correct thing to do, but I can’t resist a bit of gloating here on Apple’s behalf. To all those who laughed at the Mac back in 1984: the last laugh’s on you. The Mac has survived, continues to thrive, and stands alone. Apple itself has become the biggest company on earth!

That’s certainly something worth celebrating. So Happy Anniversary, Mac! Congratulations. Well done. It’s been a fantastic 30 years.

On a personal note, as is true of many others, I owe my career to Apple. Without the tools that the Mac provided and the culture that surrounded the Mac, I would not be where I am today. Thank you, Apple.

And now…let the next thirty years of Mac begin…

[For Apple’s take on the Mac’s 30th anniversary, including a look back at highlights from the past three decades, check out Apple’s website.]

Setting the record straight re my article on Adobe Reader

I’d like to set the record straight regarding my recent Bugs & Fixes article for Macworld — the one on problems opening PDF files in the latest version of Safari when the Adobe Reader plug-in is installed. To put it as kindly as possible, I could have done a better job on this one.

In a nutshell, the problem cited in the article is that, after installing Adobe Reader, you get continually warned by Safari every time you try to load a PDF file—requiring a couple of clicks before the PDF will load. I found this so annoying that I decided I’d rather bypass any security advantage and eliminate the warnings altogether. I then detailed how I had some trouble figuring out where and how to do this.

I placed the onus of the blame for this on Adobe and its Reader app. In retrospect, I should have placed the lion’s share of the “blame” (if the term is even appropriate here) on Safari itself. And on myself. Adobe was only making use of a new feature in Safari, one that Apple touts as providing enhanced security for Internet plug-ins. At some level, this was obvious to me: The fix I cited (which I should have recalled without needing to search for it) required going to a Safari Preferences setting (as shown in a figure in the Bugs & Fixes article). Further, the Preferences screen lists numerous plug-ins besides the one for Adobe Reader. This is a clear tip-off that the situation extends beyond the Reader plug-in. Still, partly because of my prior bias against Reader, partly because Apple’s own PDF software does not trigger the same warnings, and partly because I had not had similar problems with any other plug-in, I viewed it as an Adobe failing. Not so.

I compounded my error by adding a comment that said Adobe “could have handled [the matter] much better.” What could Adobe have done to ameliorate this? Actually, not much. Still, I would have suggested two things. First, on the assumption that many Adobe Reader users may not be clear as to what is going on here, the Adobe Reader app (perhaps in its Preferences screens) should have included the pertinent information, bypassing the need to search Adobe’s or Apple’s support sites. Second (although I understand Adobe would be reluctant to do this), Adobe could provide a simple way to disable or uninstall the plug-in altogether, short of having to manually drag the plug-in files out of the /Library/Internet Plug-Ins folder, for the benefit of those who would like to keep Reader around but not use the plug-in (if there is a setting that does this, I couldn’t find it). Still, as I said, this is minor stuff. And to its credit, Adobe Reader’s Preferences > Internet screen does link to a relevant article on how to manually disable the plug-in.
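For those curious, the manual approach boils down to moving the plug-in bundle out of the folder Safari scans at launch. Here is a minimal shell sketch of the idea, run against a temporary stand-in directory so it is safe to try; the bundle name “AdobePDFViewer.plugin” is an assumption for illustration, and on a real Mac you would act on /Library/Internet Plug-Ins (which typically requires administrator rights):

```shell
#!/bin/sh
# Hypothetical sketch: "disable" a Safari plug-in by moving its bundle
# out of the folder Safari scans. A temporary stand-in directory is used
# here instead of the real /Library/Internet Plug-Ins folder.

PLUGINS_DIR="$(mktemp -d)"                       # stand-in for /Library/Internet Plug-Ins
DISABLED_DIR="$PLUGINS_DIR/Plug-Ins (Disabled)"  # parking spot Safari does not scan
mkdir -p "$DISABLED_DIR"

# Simulate the installed plug-in bundle (bundles are really folders).
mkdir "$PLUGINS_DIR/AdobePDFViewer.plugin"

# The actual "disable" step: move the bundle out of the scanned folder.
# Moving it back re-enables the plug-in.
mv "$PLUGINS_DIR/AdobePDFViewer.plugin" "$DISABLED_DIR/"
```

Because the bundle is merely relocated, not deleted, reversing the change is a matter of moving it back.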

On a related note, my Bugs & Fixes article also raised the question of what the difference is between the “Allow” and “Always Allow” options in the Safari settings. I found none. This, as it turns out, is explained in an Apple support document. The answer: with “Always Allow,” “Safari loads and displays the content without prompting, even if the Internet plug-in is blocked by OS X File Quarantine.” With “Allow,” the quarantined content remains blocked.

Overall, I fell down on the job with this article. More than once. For that, I apologize. Thankfully, it’s quite rare for this to happen. And I’ll do my best to make sure that it never happens again.

Smaller is the new larger

Back in the old days (of less than a decade ago), for any device that had a screen, users lusted after the largest size they could get. Cost and space considerations might prohibit such a purchase, but “bigger is better” remained the mantra.

Old: Bigger is better

Interested in buying a laptop? It was the 17-inch model that turned the most heads. Deciding on a new iMac? If you had the dough, you got the one with the largest display.

Over the years, Apple’s Cinema/Thunderbolt Displays have increased in size from 22” to 27”. You can’t buy a smaller one today even if you wanted to do so.

Getting a new flat-panel television? The preferred choice has been the largest one that fits within your room and your budget.

New: Smaller is better

In the last few years, however, the apple of a tech user’s eye has often taken a U-turn.

Apple has eliminated any 17-inch laptop from its lineup. Even the 15-inch MacBook Pro seems to be falling out of favor, as the 13-inch size emerges as the new standard. For the MacBook Air, many reviewers cite the 11-inch model as preferable to the 13-inch. A similar situation exists with the iPad Air vs. iPad mini, with a majority of reviews (at least among the ones that I’ve read) leaning towards a preference for the mini.

That’s right…even techno-savvy “power users” who write reviews, and for whom price is usually not a prime consideration, often say they want the smallest model they can get. What a turn-around!

Apple’s new Mac Pro is being lauded for its small size, compared to the old Mac Pro behemoth…even though the size reduction comes at the expense of internal expandability.

Among flat-panel TVs, larger sizes still rule. But even here, I’ve noticed a trend towards downsizing from a few years ago. Buyers now seem more averse to getting a television that overpowers their room — unless they are setting up the room as a dedicated home theater. Users also seem increasingly content to watch movies on their iPads, skipping the larger television entirely.

What’s going on?

What exactly is going on here? What’s behind this seismic shift, with smaller now the new larger?

To some extent, it’s been developing for quite some time. Sales of laptops long ago eclipsed those of desktop machines, despite laptops’ much smaller displays. In fact, many analysts predict that consumer desktop computers will disappear from the landscape altogether in the not-too-distant future. For most people, the convenience of being able to move around with the computer outweighs all other considerations.

However, I trace the origins of the more recent surge to the arrival of the iPhone. The iPhone was the first device to offer most of the functionality of a laptop computer, but with a size that could fit in your pocket. Suddenly, mobility and portability were the buzzwords of the day.

With the advent of the iPad, even laptops are now viewed as heavy and clunky in comparison. Tablets are well on their way to replacing laptops as the primary computing device of the masses.

Helping to fuel this transition are improvements in display quality. In the past, buying a small-sized device often required accepting a trade-off of an impractically tiny display. But with today’s higher resolution screens, including Apple’s Retina display, a small display can show an incredible amount of real estate and still have legible text.

The result is that smaller is not only more acceptable these days, but preferred — especially for those who place a high value on mobility. Such users increasingly extol the advantages of smaller. They complain that you can’t use a full-sized iPad with one hand or that a 15-inch laptop is too big to use within the confines of an airline seat. Smaller sizes are deemed more convenient for any sort of travel.

There remains one major exception to this shift: smartphones. Although Apple has so far largely resisted the trend, Android users have shown a preference for the largest phone that can fit in their pocket (or, in some cases, one too large to fit). There are now 6-inch Android smartphones! Perhaps conceding to this trend, there are rumors that Apple will release a larger iPhone later this year.

Of course, even a 6-inch phone is still quite small — compared to a tablet, a laptop or a desktop computer.

Overall, I view the shift towards “smaller is better” as part of a larger technological and cultural shift away from viewing computers as a “destination” (something you go to an office to sit down and use) towards becoming a transparent and ever-present part of our lives. I suppose the inevitable end-point will be when computers are implanted in our brains and have no external size at all. Not to worry. That’s still at least a few years away.

My Mac Pro column: A reply to comments

Last week, I posted a column detailing why I thought the high cost and, to a lesser extent, the limited internal expandability of the forthcoming Mac Pro would mean that many current Mac Pro owners, including myself, would not upgrade to the new model.

Based on the large number of retweets and Facebook “likes” the article generated, it apparently resonated positively with a significant segment of readers. That’s always nice to see. However, you wouldn’t have guessed this by just reading the posted comments, which were mainly critical. I guess that’s to be expected; people are generally more motivated to write when they disagree. [Update, Dec. 4: Interestingly, most of the comments posted in the last two days have been positive. Go figure.]

I’ve read and considered all the comments. Rather than separately respond to each one, I decided to offer this more general reply. As many of the comments repeated the same basic points, this seemed a more reasonable and effective way to go.

Paradigm shift? Maybe, maybe not

Numerous comments indicated that the new Mac Pro represents a “paradigm shift.” In contrast, I was accused of being stuck in “old school” mode, unable to “get it.”

Most especially, rather than viewing the shift in emphasis from internal to external storage as a negative, many viewed it as preferable. They noted that, with the superfast Thunderbolt 2 connectors, users would be unlikely to see any speed deficit with external drives as compared to internal storage. Further, external drives give you more flexibility, allowing you to add or swap drives with ease. One commenter even questioned why anyone would need more than 256GB of internal storage anymore.

I am generally as enthusiastic about embracing paradigm shifts as anyone. I am not typically one to reject any change that represents progress. So I have to admit that it’s possible I’ve missed the mark here.

However, let me be clear: This is not an either-or situation. It’s not as if one has to choose to keep all storage internal or all external (beyond some minimal 256GB). I have a combination of internal and external drives now with my current Mac Pro, and I would expect that arrangement to continue with any new model I might purchase. So, you can have your cake and eat it too.

Still, there is a certain minimum amount of content (applications, documents, media, etc.) that I prefer to keep on my startup drive. For one thing, I like to know that, in the event that the Thunderbolt connection fails for any reason, I still have access to these essentials. And 256GB is not sufficient for me to do this.

In this regard, it’s worth noting that a 27-inch iMac with a 1TB Fusion drive can be had for under $2000. A 1TB Fusion drive Mac mini costs under $1000. You can get a MacBook Pro with 512GB for as little as $1800. These are common configurations. Certainly, I would expect desktop Pro users to want at least as much storage. At the very least, I can’t see viewing a smaller drive as an advantage.

Still, some commenters compared the situation here to previous Apple-initiated “paradigm shifts” involving getting rid of floppy drives or, more recently, optical drives.

“Again Apple has seen the future much sooner than most. Remember when everyone was up in arms because Apple stopped using the floppy disk? Seems rather silly now, huh?”

One problem I see with such comparisons is that these other shifts began on Apple’s lowest cost machines—the iMac or the MacBook Air. They eventually spread throughout the Mac’s entire line-up. In contrast, this supposed shift in emphasis from internal to external storage is making its first appearance with Apple’s top-of-the-line machine. I will be surprised if it trickles down to iMacs and laptops. If this is a paradigm shift, it is one that will be restricted to pro desktops.

More generally, there is the Mac Pro’s relative lack of internal expansion options of any kind, not just storage. Here, I do see a more typical paradigm shift in play. There is virtually no internal expansion for today’s MacBooks and iMacs. This approach has now spread to the Mac Pro as well. This is clearly the direction Apple wants to go, for better or worse. Either way, I don’t view it as a “deal-breaker” for the Pro. So I don’t want to make too much of this.

For professionals only? Yes

A related criticism was that I didn’t grasp that the Mac Pro wasn’t meant for users such as myself. Rather, it was meant for “high-end professionals”—users who will come out financially ahead by buying a Mac Pro because its “tech spec” advantages will save money in the long-run, outweighing its initial high cost.

In some of these cases, I have to wonder whether the readers actually read my column. The comments seem to be attacking a “straw man”: someone claiming that the new Mac Pros are inferior machines, with problems so telling that they are doomed to fail. When the Mac Pro goes on sale and proves to be a success, this straw man will “eat his words.”

The problem with these arguments is that I never said or implied any such thing. On the contrary, I recognized that the Mac Pro is most assuredly a “professionals-only” machine—designed for people working in video production, graphics layout, publishing, science labs and such. I specifically stated that “the new Mac Pro will appeal to this small but profitable professional market.” Indeed, I expect this market will enthusiastically embrace the new Mac Pro. I also acknowledged the Mac Pro is attractive even to a “not high-end user” such as myself: “The promise of lightning-fast speed combined with the allure of its futuristic cylindrical design seemed irresistible.”

My key assertion was a limited one: Given the high cost of a “fully configured” Mac Pro setup—especially when compared to the improved relative performance of Apple’s latest less expensive Macs—I expect most “pro-sumers” (not high-end professionals) who previously opted for a Mac Pro will not do so this time around.

There was a time when the Mac Pro line suited the needs of more than just the highest end of the market. This no longer appears to be the case. This doesn’t mean the Mac Pro is doomed. But it does mean that the Mac Pro will have a narrower appeal. At least, that’s my assessment. If I’m wrong, we’ll know soon enough.

Expensive? Yes, but

Speaking of cost, a few commenters challenged my basic assertion that a new Mac Pro setup is “expensive.” To buttress their argument, some cited examples of the much higher costs of previous-generation computing devices, going back decades. Such comparisons don’t make much sense to me. Yes, computers back in the 1960s could cost hundreds of thousands, if not millions, of dollars and yet have less computing power than today’s iPhone. But so what? That’s not the choice facing today’s users.

It may also be true that the new Mac Pro is cheaper than some workstation solutions that exist today. Again, this is largely beside the point.

The simple point is this: The new Mac Pro, even adjusted for inflation, will cost significantly more than a comparable prior-generation Mac Pro. Unless you truly need a workstation-like machine, it’s going to be very hard to justify this cost.

Whining? Sigh

Not surprisingly, some comments amounted to name-calling attacks—using phrases such as “whiner,” “strained hit piece,” “yellow journalism” and more. Such is life on the Internet.

Some of these commenters clearly have no idea of my background. If they did, they would know that I have owned nothing but Apple computing devices since buying a Macintosh in 1984. I have made a career writing about Apple products, primarily lauding their advantages over the competition. The idea that I would be motivated to write some sort of “hit piece” is almost funny.

Even if this were not the case, such comments shed no light on the discussion. That’s why, beyond what I’ve written already, I see little point in directly responding to these comments. It only gives the commenters more attention than they deserve.

Happily, most of the comments did not fall into this “attack” category. Rather, they were respectful disagreements. As such, they pushed me to rethink my positions, in an effort either to better defend them or to change them. I always welcome that opportunity.