The Oscars vs. the public

Although it’s still one of the highest-rated shows on television, last week’s Academy Awards telecast garnered its lowest ratings in six years, down about 16% from last year alone. Some (including myself) see this as a reflection of how terrible the show is — with monotonous acceptance speeches for little-understood categories (like Best Sound Mixing), tacky musical numbers that seem to have arrived via a time warp from another century, and an emcee whose attempts at humor fall embarrassingly flat.

But hey, the Oscars (with few exceptions) have always been like that — even when they got much higher ratings. We watched anyway — because we wanted to gawk at the stars, learn the winners of the “big” awards as soon as they were announced, and perhaps savor an unexpected, memorable moment or two.

Lastly, we hoped to see our favorite films get rewarded with a statuette. And herein lies a critical “problem” with the Oscars, at least as cited by many pundits. As noted by Michael Cieply and Brooks Barnes in a New York Times article, the Best Picture winner, Birdman, collected only about $11 million in ticket sales between the time it was nominated and the day of the awards. In contrast, American Sniper, another Best Picture nominee, took in $317 million over the same period — almost as much as the other seven nominees combined. Yet Sniper went home with only one award — a minor one for sound editing.

The disparity between the Oscars and movie-goers is actually worse than the Sniper example suggests. If you look at the top eight highest-grossing movies of 2014, American Sniper is the only film on the list to have gotten a Best Picture nomination. This is not a new phenomenon; it is a trend that has been developing for years. Indeed, it is precisely what led the Academy to expand the potential list of nominees beyond the prior limit of five movies. The goal was to increase the number of popular films that got nominated. It hasn’t quite worked out that way. Chances are most viewers won’t see their favorite films win any awards — because their favorite films aren’t even nominated.

Cieply and Barnes’ explanation is that the Oscars “have become hopelessly detached from movie viewers…Both the Academy and the echo chamber of Hollywood’s awards-system machinery have nearly broken their connection with the movies that many millions of people buy tickets to watch.” The Academy voters have become “elitist” and “not in step with anything that is actually popular. No one really believes anymore that the films they chose are the ones that are going to last over time.”

I beg to differ.

If the Academy is “elitist,” then so is almost every other institution that gives out film awards. According to IMDb, Birdman received 170 wins and 152 nominations, including several Best Picture wins and a spot on almost every critic’s Top Ten list. The only other movie with comparable credentials was Boyhood, another low-grossing Best Picture nominee. In contrast, The Hunger Games: Mockingjay – Part 1, the highest-grossing film of the year, received only 1 win and 9 nominations — none of which were for Best Picture. Even American Sniper had only 8 wins and 25 nominations overall.

As I see it, the reason for this is that the Oscars, as well as most of the other awards cited by IMDb, are not based on popularity, at least not as measured by box office success. Ideally, they are determined primarily by artistic merit. These two criteria show very little overlap, especially these days. This disconnect is by no means unique to movies. You see the same thing in literature. The National Book Award, the Pulitzer Prize, and the Nobel Prize are rarely given to books that top the New York Times Best Sellers list.

I’m not naive. I know that politics, among other factors unrelated to quality, contributes to an Oscar win. But it’s still fair to say that sheer popularity is not the determining factor — probably less so now than ever before. And this is as it should be. I view the trend of recent years as a positive one. Would we rather return to a time when movies like Oliver! or Driving Miss Daisy won Best Picture? I hope not.

Birdman may or may not be your choice for Best Picture. But it is undeniably a great film. It had terrific acting from the entire cast, creative cinematography, and an inventive percussive soundtrack, topped off by an original, thought-provoking screenplay. I contend that this is a film that will “last over time” — certainly much more so than most of the films that make up the top box-office winners — populated with big-budget, special-effects-laden sequels and franchises like The Hunger Games, Captain America, The Hobbit and Transformers. Which film do you consider more likely to join Citizen Kane, The Godfather, or Casablanca on the AFI’s list of 100 greatest films of all time: Birdman or Transformers?

Despite all of this, I do concede there is an uncomfortable disconnect between the awards and the public — one that has grown larger over the years.

The disconnect is partly attributable to the rise of the blockbuster movie — which has divided the year into summer action movies vs. fall “serious” movies. The result is that the most popular films come out in the summer, while the films that garner the most Oscar nominations come out in the fall.

This is assisted by the fact that, as has always been true, money talks loudest in Hollywood. Ask a studio head whether he would rather produce a crappy film that makes huge amounts of money or a great film that barely ekes out a profit. Almost always (maybe always), the answer will be the former. So action-movie crap too often gets a green light.

The disconnect is also partly attributable to a change in viewing habits. More and more, people are content to view movies at home on their large-screen televisions (or even their small mobile devices), rather than in theaters. This especially hurts theater ticket sales of smaller, independent, character-driven films — ones that skew toward an older, less theater-going audience and do not benefit much from being seen in a theater anyway.

There was a time when some of the best, most memorable movies of the year were also among the most popular. The peak for this was probably the 1970s — when movies like The Godfather, The French Connection, The Sting and Annie Hall won Best Picture. No more. We live in a time when, with few exceptions, the most popular and profitable films are the ones that most appeal to teenagers seeking the film equivalent of a comic book or young-adult novel. This is not the best criterion for a great film. And these films rarely get many award nominations.

So, yes, the Academy Awards are detached from the mainstream of movie-goers these days. While there is still the potential to make movies (such as American Sniper) that achieve both box office and critical success, it has become increasingly difficult to do so. But the solution is not to turn the Oscars into the People’s Choice Awards. Hollywood should continue to strive to give Oscars to what it perceives as the best films, regardless of box office receipts. If that means a decline in the popularity of the Oscar telecast, so be it.

However, I believe Hollywood should be able to figure out how to make the Oscar ceremony a much more entertaining event. That could go a long way to improving the ratings. I have ideas about this, starting with focusing on movies rather than dumb musical numbers…but that’s a subject for another column.

Addendum: I just finished reading Richard Corliss’ Time magazine article [subscription required] covering this same territory. I agree with his contention that a couple of the top-grossing movies were well-received and could have qualified for a Best Picture nomination (especially The Lego Movie and Guardians of the Galaxy) — although I can’t see any of them winning. However, I would point out that equal Rotten Tomatoes ratings do not make two movies equivalent; two movies could both get a 90% approval from critics, yet those same critics could agree that only one of them deserves consideration for Best Picture. Corliss also makes a good point that the subject matter of most of the nominated movies appeals more to an older audience — which likely reflects the 60-ish average age of Academy members. This could use some fixing. Beyond that, the Time column did not lead me to modify the views I expressed here.

Smart device overkill

I own a smart TV. Among other things, I can use it to connect to Netflix, with no other device needed.

I also have a smart Blu-ray player. It too includes an option to select Netflix, as well as a small assortment of other “channels.”

Lastly, I have an Apple TV. As you probably already know, I can choose to watch Netflix from this device as well.

I have absolutely no need for three different ways to stream video from Netflix. One is definitely sufficient. [I’m not even going to go into the fact that I can also watch Netflix on my Mac, iPad and iPhone.]

Currently, the Apple TV is my preferred choice. This is because, of the three devices, it has the most feature-filled and easiest-to-navigate interface. I also stay exclusively with Apple TV because it is the device I use for channels, such as HBO GO, that are not available on the other two devices. Apple TV is also the only device that gives me access to my iTunes library and offers AirPlay. Case closed.

Essentially, if my television and Blu-ray player magically became dumb devices overnight, it would not matter to me one whit.

This is the dilemma facing the makers of these smart devices. The market is currently suffering from an overdose of overlapping devices. It’s especially tricky for television makers (see this Macworld article for related insight). No matter how smart televisions become, it won’t help their sales if people like me still prefer to use an Apple TV instead. At the same time, Apple needs to worry that, if it doesn’t update the Apple TV sufficiently, people like me may yet abandon it in favor of improved and expanded features on televisions.

In the end, there may remain room for more than one of these choices to stay profitable. For example, those on a tighter budget might stick with their television alone (as this doesn’t require an additional purchase), while those with more disposable income go for an Apple TV or Roku.

Regardless, the current mishmosh is not sustainable. There will be winners and losers. The losers will gradually vanish from the landscape. I already anticipate this happening with smart Blu-ray players, maybe even with optical disc players altogether. Who will emerge as dominant in the battle between smart televisions and Apple TV/Roku devices remains to be seen. However, I expect that new hardware coming later this year will go a long way toward determining which way the ball will bounce. Personally, I’m still hoping for a much improved Apple TV to win the day. But it’s far from certain that this will happen. Game on.

Noah? No

Here’s a footnote to my prior article on Cosmos and God:

If ever there was a Bible story that makes absolutely no sense, it is the story of Noah. It’s so easy to find logical fallacies in the telling that it hardly seems worth the trouble to do so. However, Noah has recently received more than his usual amount of attention, thanks to the Darren Aronofsky movie starring Russell Crowe. An assortment of articles (such as this one) debate the story’s “accuracy.” Several Christian groups are alarmed at the movie’s supposed misrepresentations.

Although there’s not much I can add that has not already been said, I’ll offer a few personal thoughts anyway:

• In Genesis 6-9, God expresses his regret at creating “man” because all men have become evil and wicked. Really? Isn’t God perfect? If so, how could he create something he later regrets?

As an aside: Can you imagine what would happen if someone alive today claimed to be having conversations with God similar to what Noah had? They would almost certainly be declared insane.

• Can it really be true that everyone on earth was evil at the time of Noah, as God asserts in Genesis? What about newborn children? What about most children, really? What about the people in distant parts of the world who were unaware of what was going on in the Middle East?

Did everyone really have to die to appease God here? Why couldn’t God have selectively destroyed just the truly evil people, similar to what he did when the Egyptian first-born were slain — as told in Exodus and recalled by Jews every Passover? For some reason, this was apparently not a possibility.

Instead, if the story is true, God executed the biggest act of genocide in history (as others have pointed out).

• Moving on to the specifics of the flood (and again as has been pointed out by others), it would be impossible for the ark to contain a pair of every living creature. There were just too many. It would certainly be impossible for them all to survive for the duration that the ark was afloat. For starters, the varying ecological requirements for each species would prevent this.

Digging a bit deeper, what about the polar bears in the Arctic, the penguins in Antarctica, or all the animals unique to Australia? What about species that live exclusively in caves? Or in the jungles of South America? Did they somehow make it to the ark? If so, how? And if not, how do we explain their existence today?

On a smaller scale, consider insects. There are “more than 925,000 species of insects that scientists have identified. Still, this represents only 20 percent of all species believed to exist” today. Were all of these species on the ark? What about all the microscopic organisms that existed at the time? Were these “paired up” and put on the ark? Not likely.

Yes, one could argue that God intervened, in some miraculous way, to allow all these animals to board the ark, co-exist and survive until the flood waters receded. But resorting to miracles is a slippery slope. If God could use a miracle to accomplish something like this, why require an ark at all? Why couldn’t God have instead used his miraculous powers to keep the necessary animals alive without an ark? Wouldn’t that have been much simpler?

I am sure that the people who take the Noah story literally have invented answers to respond to all of these questions. But that’s the point. They are “invented” answers. They are suppositions. They have no basis in fact.

My view here extends beyond the story of Noah to the Bible as a whole. Rather than viewing the Bible as literal historical truth, it makes more sense to view it as a collection of stories created by humans in an attempt to comprehend the world back when humans had very little knowledge about how the world really worked.

Consider this: From the time the New Testament was written, it would take about 1,500 more years before we came to accept that the earth revolves around the sun and not the reverse. It is only in the last two centuries that we’ve come to understand that stars are actually distant suns light-years away from earth. At the other end of the spectrum, it took us the same 1,500 years to discover that microscopic organisms exist and to understand the critical role they play in our lives. It’s been less than 100 years since we broke the genetic code and began to truly comprehend how reproduction and inheritance work. In that context, it’s not surprising that people might have taken the story of Noah seriously thousands of years ago. But not anymore.

Presumably, an omniscient God, present at the time of Christ, knew that someday we would “discover” bacteria, DNA, computers, space travel, black holes and all the rest that makes up modern science — including many things we have yet to discover. And yet the Bible makes no mention of any of this. Rather, as you would expect from a document written by humans, it is restricted to the (very limited and often incorrect) knowledge humans had at the time.

That’s why for me, rather than come up with torturous explanations for the contradictions and impossibilities contained in the Noah story, it makes far more sense to accept the obvious: The story amounts to a folk tale, a fable, a legend, a morality lesson. Call it whatever you want. Just don’t call it true.

The decline (and fall) of the DVR

Remember the videocassette recorder (VCR)? What a glorious piece of technology. For the first time in human history, people could record TV shows for later viewing (“time-shifting”). Freed from the shackles of when the networks offered programming, people could instead watch shows whenever it was convenient. Hooray!

The revolution moved on

And yet…about the only place you’ll find a VCR today is at electronics recycling sites. What happened? The revolution moved on. DVDs replaced cassettes.

While DVDs were a vast improvement in playback quality and convenience, they were almost never used for recording TV shows. For starters, DVD recorders were far rarer than the ubiquitous players. No matter. As it turned out, most people had never used their VCRs to record shows. It was just too complicated. Unless you planned to be home at the starting time of a show, you had to figure out how to record while you were away. A successful time-shift required that you (a) remember to insert a blank tape and rewind it if necessary, (b) make sure the tape was long enough to record your show, slowing the record speed as needed, and (c) set the VCR to the desired television station or input setting.

Even after you overcame these hurdles, the Mt. Everest of hurdles remained: figuring out how to program the device to start and stop at the correct time. The majority of people gave up at the earlier step of trying to get rid of the pesky blinking “12:00.” Even if you succeeded, you had to do it all again each time you wanted to record a show.

So most people wound up using their VCRs and DVD devices almost exclusively for playback of prerecorded material. And that’s where things stood until…

The revolution moved on…again

The digital video recorder (DVR) arrived! Suddenly, time-shifting was drop-dead simple. Want to record six different shows, each on a separate station? No problem. You can even set up a season pass to record a series without needing to know exactly if or when episodes will air. And you can almost instantly play back any of your recordings. Fantastic!

There remain a couple of drawbacks to DVRs. For one thing, unlike with a VCR, you need to pay a monthly fee, typically to your cable company, in order to use the DVR. But, by now, this is a minor addition to what people are already paying. More troublesome, it’s almost impossible to make external copies of your recordings. This means that you can’t, for example, make a backup copy of a show or lend a recording to a friend — as you could easily do with videocassettes. And, should your DVR break and need to be replaced, hold on to your hat: you lose all your recorded content.

And that’s about where things stand now.

[Note: TiVos are better than most other DVRs in terms of offloading content. The latest Roamio TiVos can send recorded video to your iOS devices. And almost all TiVos can transfer shows to your computer, albeit at a painfully slow rate. Good, but still not an ideal solution.]

The revolution keeps on moving

The rumblings that signal the next major technological shift are already here: Internet streaming and cloud-based video services such as iTunes, Netflix and HBO GO. Devices such as Apple TV and Roku are now serving as DVR alternatives in many households, despite the lack of any recording option. As these services and devices continue to improve, DVRs (as well as DVD and Blu-ray players) will eventually join VCRs in the dustbins of the not-too-distant future.

More substantial change is on the way. Here’s a glimpse of one possible future:

Imagine a cloud-based service that stores every movie and television show ever recorded/filmed/whatever (except perhaps movies currently in theaters and the most recent episodes of TV shows). Now imagine that you can access this immense library merely by paying a monthly fee. I expect the fee to be fairly steep by today’s standards, around $100/month. But it will be comparable to what most people are paying now for similar access (via a combination of payments to Netflix, Hulu, renting and buying movies, etc.).
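To put that $100 figure in rough perspective, consider some ballpark numbers of my own (your actual bills will certainly differ): a Netflix subscription runs about $9/month, Hulu Plus another $8, a cable DVR fee plus a premium channel such as HBO can easily add $25, and a handful of iTunes rentals and purchases can tack on another $40 or more. Add it up and many households are already spending in the neighborhood of $80–$100 a month for a far less complete library than the one I’m imagining.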

As a subscriber, you will be able to stream any movie or TV show (without commercial interruptions) to any of your Internet-connected devices (TV, computer, tablet or smartphone).

You’ll also be able to watch content when you’re offline, for no additional fee. To do so, you’ll just download your desired items to your digital device. There will be some limitations here. As with current rented movies, the downloads will “expire” after a brief period of time, say a month. And there will be a limit to how many downloads you can have active at one time (perhaps a half dozen). Still, this should be more than adequate to cover your viewing for those occasions when you don’t have an Internet connection.

What about those movies you have to “own”—perhaps because you’re worried they might get removed from the cloud service someday? Once again, not a problem. There will be an option to purchase content, just as you can now do from the iTunes Store. However, because you’re also paying a monthly fee, I expect the purchase price to be cheaper than the current going rates.

There you have it: one service for just about anything you might want to watch (except for live sports and news shows), available just about anywhere and anytime you want to watch it. And no need to remember to record anything. Nirvana.

Roadblocks

The essential technology to implement this system exists today. In fact, for music, via services such as Spotify or Rdio, you can already pretty much accomplish what I’ve described here. Offering the same capability for video is not as simple. It will almost certainly require upgrades to the current Internet bandwidth. But that’s coming. I don’t consider this a dealbreaker.

The bigger question mark is whether the existing content creators and providers (Comcast, TV networks, Hollywood studios, etc.) will ever willingly go along with such a system. At present, given their track record, I’d have to say no. They will certainly put up a fight — a big fight — similar to what they are now doing with Aereo. But they do this with every outside challenge to the status quo, dating as far back as when Hollywood railed against television as a dire threat to its survival.

[Note: Aereo, although much more limited in scope than what I have proposed here, offers a feature not included in my proposal: it can function as a cloud-based DVR for live broadcasts. If this too became widespread, it would be another nail in the coffin for the traditional DVR.]

Still, I remain optimistic that, when a strongly desirable technological advance becomes practical, as is the case here, it cannot be blocked indefinitely. As the forces of change gather steam, the opposing parties will reluctantly make the necessary concessions while at the same time figuring out a way to continue to make money. Yes, there will be some losers as well as winners. But that’s how progress happens.

Something of this sort may well be what Apple has been trying to cobble together with its as-yet-unannounced but long-rumored venture into television. If so, it would explain why the company is having such a difficult time bringing it to market. Getting all the relevant parties on board is a balancing act that even Steve Jobs might have been unable to pull off. Regardless, I expect we’ll see the fruits of Apple’s labors sometime within the next year.

The other possibility is that Apple fumbles the ball and someone else (such as Comcast) picks it up and runs with it. I hope not. The result probably won’t be nearly as good for consumers as what Apple would have done.

Whoever succeeds and however they do it, one thing is certain: Change is coming and, when it arrives, it will be curtains for the DVR.