Putting it together: MacBook, USB-C and the iPad Pro

Apple’s new MacBook made an impressive debut at yesterday’s Apple Media Event. With features such as a 2304 x 1440 Retina display, Force Touch trackpad, and fanless design, it lives up to Apple’s billing as an innovative “reinvention” of a state-of-the-art laptop computer.

Still, despite dropping the Air suffix from its name, the new 12-inch laptop is a very close relative of the Air — both in appearance and target audience. On the other hand, the MacBook is so light (just two pounds) and so thin (24% thinner than an 11-inch MacBook Air) that its truest competitor may turn out to be the iPad Air rather than the MacBook Air.

Reinforcing this iPad matchup, the new MacBook comes in the same assortment of three colors (silver, space gray and gold) as do Apple’s iPads. And (as with all iOS devices, and unlike Apple’s other laptops), the new MacBook has no custom configuration options.

USB-C

There’s one more iPad similarity. And it’s a big one: The MacBook has only one port for wired connections (not counting an audio-out jack)! Really. Just one. That’s down from four (2 USB, 1 Thunderbolt, and a power port) in the MacBook Air. The new port even looks like an iOS Lightning port. But it’s not. It’s an entirely new, never-before-seen-on-an-Apple-device port called USB-C. This USB-C connection supports charging, USB 3.1 Gen 1 and DisplayPort 1.2. It does it all, as they say.

My first reaction to this news was: “What? Only one? Even if Apple wanted just one type of port, couldn’t they at least have included two of them?” That way you could charge a MacBook and have an external drive attached at the same time. As it now stands, unless you buy one of the inevitable third-party USB-C hubs, you can only do one of these things at a time.

And no Thunderbolt? This means you can’t connect a MacBook to Apple’s Thunderbolt Display — an option that Apple strongly promoted just a couple of years ago.

I was ready to conclude this whole USB-C thing was a serious misstep on Apple’s part. And it may yet prove to be so. But, more likely, it is Apple once again staying ahead of the curve, pushing the envelope, or whatever similar analogy you prefer. Remember when the iMac first came out, without any floppy drive? People said it was a huge mistake. But it turned out to be prescient. This is Apple doing the same thing.

First, given the target audience for this Mac, which is the low end of the market, the limitations of a lone USB-C port are likely to be less significant than they may appear. For example, prospective MacBook owners are not the sort to purchase a Thunderbolt Display. That’s more for the MacBook Pro crowd.

More importantly, with the new MacBook, Apple is pushing us towards a world where all connections will be wireless — either to other local wireless devices or over the Internet to the cloud. Want to back up your MacBook? Connect it wirelessly to a Time Capsule. Want a larger display? Use AirPlay to mirror your display to a television. Want to store your super-large music and photo libraries? Use iCloud.

iPad Pro?

Let’s return to the iPad/MacBook similarity. Rumors continue to circulate that Apple will be releasing a 12-inch iPad Pro later this year. Does such a device still make sense, given the arrival of this new MacBook?

Personally, I much prefer an iPad to a laptop for many tasks. There are many times when I find iOS apps and a touchscreen more convenient and more practical than Mac app alternatives and an intrusive physical keyboard. Want to read the New York Times, check the weather, read a Kindle book, play a game, listen to a podcast? The iPad is the better choice. When I am home, I use my iPad Air almost exclusively, while my MacBook Pro gathers dust (I have a desktop Mac for tasks that the Air doesn’t handle well).

The iPad Air also beats even this latest MacBook in terms of weight and size — by a wide margin: the iPad is half the weight and almost half the thickness of the MacBook.

Overall, I don’t see the new MacBook significantly affecting sales of the iPad Air or mini.

A supposed iPad Pro is a different story. An iPad Pro will presumably be targeted at “productivity” tasks that are the traditional domain of laptops — tasks where you typically prefer a physical keyboard. The new MacBook will give an iPad Pro a run for its money here. Even if you could “get by” with just an iPad Pro, a MacBook (with the more powerful and flexible OS X) will be the better choice for getting work done.

Bottom line: Many people will still prefer to own some combination of iOS device(s) and Mac(s). I certainly will. But it’s hard to imagine users opting for both a new MacBook and an iPad Pro. It will be one or the other. And the new MacBook is more likely to be the winner. That’s why I am beginning to have serious doubts about the viability of an iPad Pro. The new MacBook may kill the device before it’s even born.

One final thought: If an iPad Pro is coming…might it come with a USB-C port instead of (or more likely in addition to) a Lightning port? If so, this would allow the Pro to offer an assortment of productivity options not currently possible with existing iPads.

The Oscars vs. the public

Although it’s still one of the highest rated shows on television, last week’s Academy Awards garnered its lowest ratings in six years, down about 16% from last year alone. Some (including myself) see this as a reflection of how terrible the show is — with monotonous acceptance speeches for little-understood categories (like Best Sound Mixing), tacky musical numbers that seem to have arrived via a time warp from another century and an emcee whose attempts at humor fall embarrassingly flat.

But hey, the Oscars (with few exceptions) have always been like that — even when they got much higher ratings. We watched anyway — because we wanted to gawk at the stars, learn the winners of the “big” awards as soon as they were announced and perhaps get to savor an unexpected memorable moment or two.

Lastly, we hoped to see our favorite films get rewarded with a statuette. And herein lies a critical “problem” with the Oscars, at least as cited by many pundits. As noted by Michael Cieply and Brooks Barnes in a New York Times article, the Best Picture winner, Birdman, collected only about $11 million in ticket sales between the time it was nominated and the day of the awards. In contrast, American Sniper, another Best Picture nominee, took in $317 million over the same period — almost as much as the other seven nominees combined. Yet Sniper went home with only one award — a minor one for sound editing.

The disparity between the Oscars and movie-goers is actually worse than the Sniper example suggests. If you look at the eight highest-grossing movies of 2014, American Sniper is the only film on the list to have received a Best Picture nomination. This is not a new phenomenon; it is a trend that has been developing for years. Indeed, it is precisely what led the Academy to expand the potential list of nominees beyond the prior limit of five movies. The goal was to increase the number of popular films that got nominated. It hasn’t quite worked out that way. Chances are most viewers won’t see their favorite films win any awards — because their favorite films aren’t even nominated.

Cieply and Barnes’ explanation is that the Oscars “have become hopelessly detached from movie viewers…Both the Academy and the echo chamber of Hollywood’s awards-system machinery have nearly broken their connection with the movies that many millions of people buy tickets to watch.” The Academy voters have become “elitist” and “not in step with anything that is actually popular. No one really believes anymore that the films they chose are the ones that are going to last over time.”

I beg to differ.

If the Academy is “elitist,” then so is almost every other institution that gives out film awards. According to IMDB, Birdman received 170 wins and 152 nominations, including several Best Picture wins and a spot on almost every critic’s Top Ten list. The only other movie with comparable credentials was Boyhood, another low-grossing Best Picture nominee. In contrast, The Hunger Games: Mockingjay – Part 1, the highest-grossing film of the year, received only 1 win and 9 nominations — none of which were for Best Picture. Even American Sniper had only 8 wins and 25 nominations overall.

As I see it, the reason for this is that the Oscars, as well as most of the other awards cited by IMDB, are not based on popularity, at least not as measured by box office success. Ideally, they are determined primarily by artistic merit. These two criteria show very little overlap, especially these days. This disconnect is by no means unique to movies. You see the same thing in literature. The National Book Award, the Pulitzer Prize and the Nobel Prize are rarely given to books that top the New York Times Best Sellers list.

I’m not naive. I know that politics, among other non-quality factors, contributes to an Oscar win. But it’s still fair to say that sheer popularity is not the determining factor — probably less so now than ever before. And this is as it should be. I view the trend of recent years as a positive one. Would we rather return to a time when movies like Oliver! or Driving Miss Daisy win Best Picture? I hope not.

Birdman may or may not be your choice for Best Picture. But it is undeniably a great film. It had terrific acting by the entire cast, creative cinematography and an inventive percussive soundtrack, topped off by an original, thought-provoking screenplay. I contend that this is a film that will “last over time” — certainly much more so than most of the films that make up the top box-office winners, populated with big-budget, special-effects-laden sequels and franchises like The Hunger Games, Captain America, The Hobbit and Transformers. Which film do you consider more likely to join Citizen Kane, The Godfather, or Casablanca on the AFI’s list of 100 greatest films of all time: Birdman or Transformers?

Despite all of this, I do concede there is an uncomfortable disconnect between the awards and the public — one that has grown larger over the years.

The disconnect is partly attributable to the rise of the blockbuster movie — which has divided the year into summer action movies vs. fall “serious” movies. The result is that the most popular films come out in the summer and the most Oscar-nominated films in the fall.

This is reinforced by the fact that, as has always been true, money talks loudest in Hollywood. Ask a studio head whether he would rather produce a crappy film that makes huge amounts of money or a great film that barely ekes out a profit. Almost always (maybe always), the answer will be the former. So action movie crap too often gets a green light.

The disconnect is also partly attributable to a change in viewing habits. More and more, people are content to view movies at home on their large-screen televisions (or even their small mobile devices) rather than in theaters. This especially hurts theater ticket sales of smaller independent character-driven films — ones that skew toward an older, less theater-going audience and do not benefit much from being seen in a theater anyway.

There was a time when some of the best, most memorable movies of the year were also among the most popular. The peak for this was probably the 1970’s — when movies like The Godfather, The French Connection, The Sting and Annie Hall won Best Picture. No more. We live in a time when, with few exceptions, the most popular and most profitable films are the ones that most appeal to teenagers seeking the film equivalent of a comic book or young-adult novel. This is not the best criterion for a great film. And these films rarely get many award nominations.

So, yes, the Academy Awards are detached from the mainstream of movie-goers these days. While there is still the potential to make movies (such as American Sniper) that achieve both box office and critical success, it has become increasingly difficult to do so. But the solution is not to turn the Oscars into the People’s Choice Awards. Hollywood should continue to strive to give Oscars to what it perceives as the best films, regardless of box office receipts. If that means a decline in the popularity of the Oscar telecast, so be it.

However, I believe Hollywood should be able to figure out how to make the Oscar ceremony a much more entertaining event. That could go a long way to improving the ratings. I have ideas about this, starting with focusing on movies rather than dumb musical numbers…but that’s a subject for another column.

Addendum: I just finished reading Richard Corliss’ Time magazine article [subscription required] covering this same territory. I agree with his contention that a couple of the top-grossing movies were well received and could have qualified for a Best Picture nomination (especially “The Lego Movie” and “Guardians of the Galaxy”) — although I can’t see either of them winning. However, I would point out that equal Rotten Tomatoes ratings are not equivalent in every respect; two movies could both get a 90% approval rating from critics, yet those same critics could agree that only one of the movies deserves consideration for Best Picture. Corliss also makes a good point that the subject matter of most of the nominated movies appeals more to an older audience — which likely reflects the 60-ish average age of Academy members. This could use some fixing. Beyond that, the Time column did not lead me to modify the views I expressed here.

The Ku Klux Klan vs. Muslim extremists

In a recent column for Time (These Terrorist Attacks Are Not About Religion), Kareem Abdul-Jabbar put it bluntly:

“When the Ku Klux Klan burns a cross in a black family’s yard, Christians aren’t required to explain how these aren’t really Christian acts.

Most people already realize that the KKK doesn’t represent Christian teachings. That’s what I and other Muslims long for—the day when these terrorists praising Mohammed or Allah’s name as they debase their actual teachings are instantly recognized as thugs disguising themselves as Muslims.”

At first glance, I find Abdul-Jabbar’s analogy to be compelling. Comparing extremist Muslims to the Ku Klux Klan makes a lot of sense. They are both hate-filled, violence-prone minorities. However, on closer examination, the analogy begins to fall apart.

For one thing, by whose authority does Abdul-Jabbar assert that the terrorists are “disguising themselves as Muslims” — as opposed to being true Muslims? I assume that members of Al-Qaeda would make the same accusation about Abdul-Jabbar. As I have previously asserted, there are minority segments of all religions. Being a minority, even a violent minority, does not mean you cannot also be a legitimate member of a religion. There are certainly those who would claim that advocating violence is as much a part of religious teachings, both Muslim and Christian, as advocating peace.

As for the terrorists who gunned down the staff of Charlie Hebdo — it is true that they are small in number. However, these terrorists were not just a bunch of thugs acting in isolation. They are not, as Abdul-Jabbar suggests, the equivalent of “bank robbers wearing masks of presidents.”

Rather, the terrorists were trained and backed by Al-Qaeda in Yemen. And Al-Qaeda does not exist in a vacuum. It survives in part because of support from the population and authorities in the countries where it operates. Many Muslims in these countries offer tacit approval of such acts, even if they assert that they would never carry out such acts themselves.

Here is where I believe that Abdul-Jabbar’s Ku Klux Klan analogy is at its most accurate, although not in the way he intended. We shouldn’t look at the analogy from the point of view of a comfortable American living in 2015. Rather, look at it from the perspective of an African-American living in the deep South in the 1950’s.

Here you are, a black person at the time when the Ku Klux Klan’s power and influence were at their height. The Klan may represent only a tiny minority of the Christian population around you. They may represent a distorted view of Christianity, one that Christ himself would reject. Indeed, as a black person, you likely attend a Christian church that holds very different views.

Regardless, you know that none of this really matters. The larger truth is that the Klan survives because it is tolerated by the rest of the community. More than that, much of the community quietly approves of what the Klan is doing, even if they would never participate in its actions.

Indeed, the majority population of the Southern states is overtly racist. As a black person in the South in the 1950’s, you see this every time you are humiliated by the institutionalized racism that surrounds you. You have to go to the back of the bus. You can’t use the “whites only” water fountain. Schools are completely segregated. You can’t buy a house in most neighborhoods of a city. You can’t even vote. And you risk getting beaten by the police for challenging any of these restrictions. This racism is sanctioned by the government, all the way from the local councilman to the governor of the state.

This is the full picture of the time of the Ku Klux Klan. With this full picture in mind, we see that the analogy to the Muslim situation today is apt, but in a different way than Abdul-Jabbar asserts.

Today, we see a Muslim world in the Middle East where, like the deep South decades ago, the population is unwilling to speak out against the actions of the extremists. Too often, the silence masks a disturbing approval of these actions. The supporters may not represent the majority — but they are far from a trivial component. In many instances, discrimination is institutionalized — even towards other members of the Muslim faith* — as seen in the gross inequality toward women and harsh penalties (including death) for those who rebel against the faith. And, of course, anti-Semitism is rampant everywhere.

The Ku Klux Klan was an extreme manifestation of racism in the South, but not the exclusive or even primary proponent of it. I believe the same is true today for the Muslim extremists in the Middle East.

If and when the day ever comes that the views of Abdul-Jabbar are representative of all parts of the Muslim world, I will happily join Abdul-Jabbar in what he “longs for.” Until then, I contend that these terrorist attacks are about religion — not the religion as Abdul-Jabbar practices it, but religion nonetheless.

*Just saw this today: Egypt student gets 3-year jail term for atheism.

Smart device overkill

I own a smart TV. Among other things, I can use it to connect to Netflix, with no other device needed.

I also have a smart Blu-ray player. It too includes an option to select Netflix, as well as a small assortment of other “channels.”

Lastly, I have an Apple TV. As you probably already know, I can choose to watch Netflix from this device as well.

I have absolutely no need for three different ways to stream video from Netflix. One is definitely sufficient. [I’m not even going to go into the fact that I can also watch Netflix on my Mac, iPad and iPhone.]

Currently, the Apple TV is my preferred choice. This is because, of the three devices, it has the most feature-filled and easiest-to-navigate interface. I also stay exclusively with Apple TV because it is the device I use for channels, such as HBO GO, that are not available on the other two devices. Apple TV is also the only device that gives me access to my iTunes library and offers AirPlay. Case closed.

Essentially, if my television and Blu-ray player magically became dumb devices overnight, it would not matter to me one whit.

This is the dilemma facing the makers of these smart devices. The market is currently suffering from an overdose of overlapping devices. It’s especially tricky for television makers (see this Macworld article for related insight). No matter how smart televisions become, the added smarts won’t help sales if people like me still prefer to use an Apple TV instead. At the same time, Apple needs to worry that, if they don’t update the Apple TV sufficiently, people like me may yet abandon it in favor of improved and expanded features on televisions.

In the end, there may remain room for more than one of these options to stay profitable. For example, those on a tighter budget might stick with their television alone (as this doesn’t require an additional purchase), while those with more disposable income go for an Apple TV or Roku.

Regardless, the current mishmash is not sustainable. There will be winners and losers. The losers will gradually vanish from the landscape. I already anticipate this happening with smart Blu-ray players, maybe even with optical disc players altogether. Who will emerge as dominant in the battle between televisions and Apple TV/Roku devices remains to be seen. However, I expect that new hardware coming later this year will go a long way toward determining which way the ball will bounce. Personally, I’m still hoping for a much improved Apple TV to win the day. But it’s far from certain that this will happen. Game on.