Steven Spielberg has a problem with the movie "Roma." Maybe not artistically -- I'm guessing that he, like most people, liked it -- but with who produced it and distributed it and how. And after learning about his objections to it, I'm choking on the irony of it all.
"Roma" is a Netflix movie. It made a brief, small-scale theatrical run to qualify for the Academy Awards, but the vast majority of people who have seen it have watched it via Netflix, either on their TV, laptop or tablet. Spielberg does not like that what he considers to be a TV movie was eligible for the Best Picture Oscar for which it was nominated. He thinks it should've been up for an Emmy instead. I read this morning that he intends to use his considerable power to prevent that from happening again by getting the Academy's Board of Governors to bar Netflix movies from Oscar consideration.
Spielberg has both aesthetic and business objections to Netflix flicks. On the aesthetic side, he is said to "feel strongly about the difference between the streaming and theatrical situation" as it relates to screen size, sound, and overall experience. For this I do have some degree of sympathy. I watch far more movies at home these days than I do at the theater, but I still have a soft spot for the moviegoing experience. If I am truly interested in a new release, I will make a point to get to the theater to see it.
But I don't have to. Maybe Spielberg assumes that those of us not rich enough to have a dedicated screening room in our Pacific Palisades homes are watching VHS cassettes on a 19" Magnavox sitting on a metal TV stand, but the fact is that it's not hard or even super expensive anymore to get a really nice visual and audio movie experience in our living rooms. I have a rather crappy TV by today's standards -- it's an HD flat screen, but I bought it like 13 years ago and it's not paired up with a big sub-woofer or surround sound or anything -- but it's still pretty good for anything but the grandest epics and most intense special effects-laden movies. For most movies I watch, including movies like "Roma," it's perfectly fine.
For the sake of argument, though, let's grant Spielberg's point about aesthetics. Let's defer to his obviously hefty cinephile bona fides and grant him that it's better to see a movie made for the big screen than one made for Netflix. I'll grant that because what I find far, far more objectionable are his complaints about the business side of this.
The business objections of Spielberg and others on the Academy's Board of Governors to Netflix movies are varied. Some of it is just that they don't like the money Netflix throws around, which is nothing I particularly care about. For the most part, though, Spielberg and his friends don't like the way Netflix interacts with movie distributors and theaters when it does those limited theatrical runs required for Oscar consideration. Specifically, Spielberg doesn't like the manner in which Netflix rents theaters out instead of licensing films to them, which allows it to keep, rather than share, ticket sales, and to avoid reporting box office numbers. In the aggregate, Spielberg's complaint is that Netflix is messing up a well-worn and established movie distribution model.
Which, when you think about it, is pretty damn rich coming from Spielberg. Because while I love a great many Spielberg movies, the guy's business legacy is that he fundamentally altered the model of film distribution in this country which, in turn, had a massive and, many argue, negative impact on the artistic side of filmmaking.
Spielberg broke into the business as part of the "New Hollywood" generation of writers and directors who came of age in the 60s and who came into professional prominence in the 70s. These were young auteur-types to whom Hollywood studios gave unprecedented freedom and autonomy because, frankly, the studios were losing money, were out of touch with the prevailing culture and had no idea how to woo audiences anymore.
New Hollywood movies focused on characters -- often characters who lived on the margins of society -- over spectacle. They trafficked in dark and often violent themes. Plots and storylines were heavy on ambiguity. Happy endings were not necessarily, or even often, the order of the day. From "Bonnie and Clyde" to "Five Easy Pieces" to "The Godfather" to "The Conversation" to "Nashville" to "One Flew Over the Cuckoo's Nest" to "Raging Bull" and any number of movies I could name in between, some of the greatest movies in American history were made during this period.
The distribution model of these films was radically different from what we see today. Whereas now films open in thousands upon thousands of theaters on a single day, in the 1970s, films opened in a handful of cities at first and were rolled out to other cities over time, allowing word-of-mouth and critical consensus to build. This helped filmmakers gradually sell what were often tough sells, artistically speaking, to audiences. It's not fair to say anymore that certain things simply won't "play in Peoria," but it's probably the case that it's easier for things to play in Peoria if the people in Peoria hear that something played pretty well in Chicago, Indianapolis and Rockford a couple of weeks ago.
Then in 1975 Steven Spielberg made "Jaws" and it changed everything.
Rather than rely on word-of-mouth, Universal Studios spent millions on a well-planned and highly-coordinated marketing campaign to promote "Jaws" before its release. Movie trailers are ubiquitous on TV and the Internet now, but national TV advertising for a film was rare in the mid-70s. "Jaws," however, featured a high-profile national prime-time commercial buy. The producers and the author of the novel on which the movie was based hit the TV talk show circuit to promote the film, and the publisher of the book worked with the studio to ensure that the paperback version matched the film poster as a means of cross-promotion. The movie also had the most elaborate array of marketing tie-ins of any film to date, including a soundtrack album, T-shirts, plastic tumblers, a book about the making of the movie, beach towels, blankets, shark costumes, toys and games.
More significant was the abandonment of the slow-roll distribution. "Jaws" opened simultaneously in hundreds of theaters across the entire United States. It was more than a movie, it was an event. It made a massive amount of money in its first weekend and, in so doing, single-handedly ushered in the Blockbuster Era. Today we take first weekend box office figures for granted as a measure of a film's success -- indeed, we deem a film a success or a failure based, almost exclusively, on how that first weekend goes -- but we didn't start doing that until "Jaws" came out in June of 1975.
There is no question that the blockbuster distribution model makes way better business sense than the old way of doing things -- studios are rolling in cash now in ways no one ever could've imagined back in the 1970s -- but it fundamentally altered the artistic and aesthetic sense of Hollywood as well.
Marketing is essential now in ways it never was before "Jaws." It's far easier to market spectacle and thrills than it is to market character sketches and ambiguity, so we get more of the former now than we do of the latter. It's far easier to get people into a movie theater if they have a really good idea of what they're going to see than it is to spring surprises on them, so modern marketing gives far more away about a movie's plot than it holds back, and sequels, copycat films and films with shared universes proliferate. People like to feel good far more than they like to be challenged so, while movies have always been about entertainment, they're made to go down with a few more spoons of sugar than they did during the New Hollywood era. The "Hollywood Ending," primarily a function of morality in the Golden Age, is now a function of test marketing and focus groups.
Which is not to say that good movies aren't made now or that the industry has gone to hell. There were a lot of truly crappy movies made in the early-to-mid 70s (we remember the good ones and forget the bad). There have likewise been tons of fantastic blockbuster movies that followed "Jaws" into America's multiplexes, many of which form the cultural DNA of people of my generation and beyond, many of which were made by Steven Spielberg. And despite the now 44-year-long blockbuster mentality of Hollywood, there have always been a handful of good, small, dark, morally ambiguous or challenging artistic movies that slip past the beancounters every year. And yes, even a couple of those were made by Steven Spielberg.
But it is inescapable that Spielberg almost singlehandedly changed the moviemaking business. He did it by basically blowing up one distribution model and replacing it with another and in so doing he fundamentally changed both the business and artistic sides of Hollywood. For him to now bitch that someone else is doing that is quite the damn thing.
A couple of years ago I wrote about my seven favorite movies in this space. Number one on that list was "The Conversation." It's still number one. I'm having a hard time imagining it will ever not be number one.
It's not a movie that, when you finish it, you say "ah, that was fun." It's not at all uplifting and there's very little action in it. Many people find it boring. I understand that. I don't blame those who don't like it for "not getting it" or whatever. Slow burns and character sketches are not for everyone. Most people watch movies to enjoy themselves and be entertained. They should, too. That's kind of the point of a movie, even if I like to torture myself with bleak, contemplative stuff like this on occasion.
Its lack of action and lack of feel-good appeal notwithstanding, aesthetically it's just a beautifully-shot and perfectly-acted movie. There isn't an ounce of fat on it. Gene Hackman is, if not my favorite actor of all time, in my top three, and this is his greatest role. And, as you can tell by our shared taste in eyewear, I like Harry Caul's personal style.
More deeply, I identify with its themes.
I've spent a lot of time in my life trying to find the right balance between observing the world with objective detachment and actively participating in it. When I was a lawyer I'd often keep myself too far removed from my clients when I found them or their interests objectionable, or get too close to them, sometimes losing my objectivity, when I did not. Since I've become a writer -- working at home, not interacting with many people in person on a daily basis -- I've felt like more of a voyeur than a participant in the world on occasion, with a tendency to disengage. This tendency is far more pronounced when I'm under stress or when I'm unhappy. It's not a good quality, and it's something I've worked hard to notice and head off when I slip into it, but I'll likely always have to work on it. To not become a low-tech version of Harry Caul, letting life simply happen to him. Either not caring to participate in the business of living beyond watching others do it or not knowing how to participate in it until it's too late.
I write all of this today because a friend of mine just pointed out a great interview of Francis Ford Coppola -- conducted by Brian DePalma of all people -- about the making of "The Conversation." It's from 1974, just as the movie was being released in theaters, so there is none of that reverent, "talk about your classic movie" stuff. You can tell Coppola knew he had a good movie on his hands -- it was nominated for Best Picture several months later, in a year that was stacked with amazing films -- but he freely talks about its flaws too, in a way I bet he wouldn't now if you asked him. It's also interesting because (a) there's an exchange in there in which I suspect DePalma got the seed for making the excellent "Blow Out" seven years later; and (b) based on stuff he says about his movie making style, you can see the hell Coppola would go through making "Apocalypse Now" a few years later coming straight down Market Street.
There are a lot of great technical details in the interview too. How Coppola went about filming the opening segment in the park, the choice of lenses to give it that voyeuristic feel and all of that. I've read a lot about that stuff before, but there's a new bit in there I hadn't read about the sound editing which kind of blew my mind. There are a lot of jarring transitions from loud to quiet in the movie and I used to think it was just because it was poorly mixed like a lot of 1970s movies are, but Coppola talks about how that was intentional and explains, quite satisfyingly, why that is so. It's one of those things that makes perfect sense and which I'm somewhat embarrassed I didn't think about while watching it, oh, 10 times.
It's been a year or two since I last watched it. After reading this interview, I'm going to have to make it 11 soon.
Last year I wrote a long true crime story that hit close to home. Like, really close to home: my great-great grandmother killed my great-great grandfather with an axe one snowy December morning in Detroit back in 1910. You'll be happy to know that she did this after my great-grandfather was born, thus allowing me to exist. Thanks for holding off on that, Nellie. I owe you one.
I had published all of this as a short ebook on Amazon and many of you bought it. Thanks for that! It's been out a while now, so I figured it was worth publishing the whole thing for free here, so here it is, in all of its dysfunctional family glory. Feel free to share it with family members who annoy you. It will really creep them out and, I suspect, make them treat you more kindly in the future.
If you paid $2.99 for the ebook and feel ripped off now that it's free, well, sorry. I'll make you a deal though: if someone important and powerful reads it and decides to option it for a Netflix movie or something fun like that, I'll invite you to the screening and/or buy you a beer at some point.
This story was originally written for Bloomberg BusinessWeek over the summer. Instead of running it they turned it into a highly-truncated cartoon thing that, being honest, was pretty darn clever and probably more appropriate for the subject matter than a 3,000-word story.
Still, I'd like to have the words I wrote for it all preserved someplace, so here they are.
On March 11, 2015, an anonymous tip was texted to the Franklin County, Kentucky, Sheriff’s Department that Gilbert “Toby” Curtsinger, a longtime employee of the Buffalo Trace distillery, had some stolen barrels of bourbon on his property. A search warrant was executed and deputies drove out to Curtsinger’s house on a winding country road west of Frankfort. Stolen bourbon is not unusual in bourbon country, but Franklin County Sheriff Pat Melton believed that this tip was about something more than your typical bootlegger. He believed that it might be leading him to the Pappy Van Winkle Bandit.
If you’re even a casual consumer of bourbon, chances are you’ve heard of Pappy Van Winkle. It’s the rarest of the many varieties of bourbon made by the Buffalo Trace Distillery and, indeed, the rarest bourbon variety of them all. Pappy, as it is known colloquially, is extraordinarily hard to find. Just 8,000 barrels are produced each year, compared to the millions of barrels of mass market brands like Jim Beam or its Tennessee cousin, Jack Daniel’s. Bar patrons pay upwards of $100 for a single pour. Aficionados who are lucky enough to win lotteries for the privilege of buying it at retail snap up bottles for as much as $300. Those not so fortunate, but who still want the stuff, routinely pay thousands for a bottle on the black market.
On October 15, 2013 Buffalo Trace reported that a little over 200 bottles of Pappy, with a market value of around $26,000, had gone missing. Sheriff Melton characterized it as a “heist,” and characterized the stolen product as “The Mac Daddy” of bourbon. The theft made international headlines, with bourbon enthusiasts inside and outside of the industry speculating about who did it, marveling at the audaciousness of it all and, perhaps, wondering if the theft made it more or less likely that they themselves could get their hands on a bottle. When that tip came in, pointing a finger at a man who had inside access to the place where Pappy was born, Sheriff Melton believed he was about to crack the bourbon crime of the century.
Anthony Bourdain died today.
Unlike so many self-styled literary and entertainment industry badasses, Bourdain had simple skill, craft and humanity underlying the attitude, and he would freely allow them to show. The former without the latter -- and without self-awareness -- is empty. Whatever he was doing to project that bad boy persona was immediately set aside when he got down to work writing about or chronicling a place, a people, a cuisine or whatever it was he was interested in at the moment.
In losing Anthony Bourdain, we didn't lose a "celebrity chef" or a "travel show host." We lost an insightful, empathetic and humane chronicler of the human condition. A man who could have so easily been a complacent, thrill-seeking, luxury-living, globetrotting celebrity but chose to be something more. He was an anthropologist who discarded dispassionate observation in order to advocate for the best in humanity, paying special attention to the vulnerable, the exploited and the overlooked.
Last year Bourdain went to West Virginia for an episode of his show, "Parts Unknown." In the space of one hour he did a better job of capturing my home state than a thousand poverty porn tourist journalists with pre-written stories parachuting in from coastal publications have ever done. It was typical of his work. He never went with the easy or expected narratives, even if doing so would've saved him a lot of work. Probably because he knew that those easy narratives obscured truths, perpetuated lies and, unwittingly or otherwise, served to work injustices, both large and small.
I've embedded that episode below. You should watch it. If he ever went someplace special or interesting or unknown to you, you should watch that too.
My wife and I just got back from nine days in England. It was our honeymoon, delayed a year for various reasons, but coinciding with our first anniversary. I was going to write up a proper travelogue, but I'm too lazy to craft narratives, transitions and connections into something approaching passable prose, so I'm just going to barf out a list of stuff that happened and stuff I observed. Of course, it's gonna end up being longer than a travelogue would've been, but sometimes when you start barfing, you just can't stop.
Click through via that "Read More" button to the lower right if you're into that sort of thing.
Most people in the United States haven't heard of James, and those who have heard of them know them primarily through a surprise college radio hit they had with the song "Laid" back in 1993, later used in the "American Pie" movies. They're far more than a one-hit-wonder, however.
James has put out 13 studio albums with a 14th on the way in August. They've had scads of hits and top-selling albums on the UK charts and a fervent following there, in Europe and in Latin America. A seven-year hiatus in the early-to-mid 2000s notwithstanding, they have been and remain a working band and, unlike a lot of their contemporaries, they remain creatively vital. They put out a new EP and released a couple of songs from the new record a little over a week ago. Some of 'em are bangers.
My wife Allison has been a James fan for 20 years or so, has met the band, has friends she's met through James fandom around the world and has seen them live both in the U.K. and in America. We recently took a trip to the U.K., primarily for our honeymoon/first anniversary -- here's a fairly massive travelogue about the vacation -- but also to go see three James shows on a short tour they did of small venues in small towns across England, Scotland and Wales. As a super fan, Allison would've found a way to see them again eventually, with or without me, but this trip was my first time seeing them live. The first show, in Warrington, was the best show I've ever seen. The other two, in Blackburn and Halifax, were right up there. I'll spare you detailed reviews, but suffice it to say I enjoyed the hell out of myself.
Until I met Allison in late 2011, I was one of those people who didn't know much more about James than "Laid." In the past six and a half years they have become my favorite band. Part of that is a function of "guy meets girl who turns him on to some different music and the association sparks something," but there's more than that going on for me.
As we grew up and matured, men my age were never rewarded for feeling. The benefits of feigning indifference and affecting a pose of ironic and cynical detachment, on the other hand, were considerable.
As I entered adulthood, what one genuinely felt about anyone or anything was less important than the fact that people understood that one liked the right someones and somethings. There was a Gen-X-approved canon of music, movies, books, fashions, attitudes and personalities, accompanied by a heaping amount of snobbery directed at those who did not share such tastes. For 1990s 20- or 30-somethings, one was living one’s best life to the extent one made it appear as if one’s life was directed by Quentin Tarantino, released on Matador Records and written by David Foster Wallace. Those who did not fall within those general parameters were judged, and judged harshly. Rob, from "High Fidelity," was a role model. It escaped us all, of course, that Rob was an emotionally-stunted jackass.
On a personal level, the archetypical Gen-X man exuded the sense that things were humming along just fine at all times and, if they were not, it was never much discussed. Staying in a narrow band of critically-approved tastes went hand-in-hand with portraying a nearly unshakable equanimity. Just as liking the wrong music risked judgment, deviating from a certain personal stance -- showing vulnerability and uncertainty -- was to invite uncomfortable personal conversation and scrutiny for which none of us were prepared.
Ironically, this highly regimented emotion-denying existence and self-imposed conformity was considered a sign of "authenticity."
Not that it felt phony or contrived. The cultivation and maintenance of the quintessential 1990s Gen-X male identity felt organic in the moment. The life I personally constructed around this larger ethos came to me naturally. I went to college, got married, began my career and had children, not just portraying every life event as if it were scripted and thus unremarkable, but feeling as if they were so. I was not some robot — there was happiness, sadness, joy, sorrow and confusion as life unfolded — but those were deviations from the cooler-than-the-room course one’s life was expected to take. Those deviations were expected to be temporary and were expected to right themselves over time.
In hindsight it’s no surprise that everything came crumbling down for me in the space of a few years. That the contradictions and self-denial my career presented and required of me were too great to ignore forever. That the problems in my first marriage were features, not bugs. That the strong and positive emotions inspired by fatherhood and by aging did not jibe with my well-cultivated sense of ironic detachment. I did my best to skate past the remarkable highs and the nearly unendurable lows of life with the help of just the right soundtrack, just the right wardrobe and enough culturally acceptable distractions to make it seem like everything was under control, but it wasn’t sustainable and never could have been.
I was in a very dark place when I met Allison and she knew it. Among the many things she did to help me get through that bad time was to play me some James stuff.
The first song she played for me was "Tomorrow." The sentiment and structure of that song are pretty obvious and straightforward -- the singer once introduced it as a song he wrote "to keep a friend from jumping off a roof" -- but when you're emotionally stunted and emotionally raw, you need something straightforward like that. Having wallowed in enough dark, depressing music and sad bastard jams over the previous few months, "Tomorrow" was a breath of fresh air. It was the first music I had listened to in a while which suggested to me that things can and will get better rather than giving me permission to embrace darkness and depression.
From there I began to listen to some other James stuff and I liked what I heard. While, critically speaking, one can slot them in with a lot of their Madchester and Britpop contemporaries, they don't fit in terribly neatly. They have been described by some critics as the "outcasts" or the "freaks and geeks" of that scene. I get that. They opened for the Smiths once upon a time, played with New Order and traveled in the same circles as The Stone Roses, Happy Mondays and all of those wonderful bands, but unlike a lot of their contemporaries they mined veins of positivity and non-conformity not typically covered in 1990s rock. Maybe this explains why they never broke big in an America which, at the time, was into far darker and sludgier sounds. I'm no music critic and I can't be totally sure about that, but I do know that I really needed to hear some positive, even anthemic music in late 2011 and James delivered.
The immediate need to pull myself out of a funk soon passed, but I have returned to James pretty frequently since that time, listening to their music both old and new. Doing so has helped address the larger problems associated with that emotionally-stunted world view of the typical 1990s Gen-X man I described before.
Allowing myself to feel things -- to like things, even if they're not cool things, without apology, excuse or shame, and to be fearless in doing so -- has been critical to my mental and emotional health and personal development over the past several years. It'd be an overstatement to say that getting into some band from Manchester has been the primary reason I've been able to do that, of course. Therapy, emotional reflection and support from and good examples set by loved ones have been far more important. But given that pop culture played a big hand in messing me and my contemporaries up in the first place, listening to a band that embodies that more open and positive ethos certainly helps.
When you're trying to grow as a person, you need to shed your skin. To strip away your protection. To laugh at the wonder of it all. To cry at the sadness of the world. To dip on in, to leave your bones, leave your skin, leave your past, leave your craft and leave your suffering heart.
Or so I'm told.
UPDATE: If you don't know that much about James, I made a playlist of my favorite songs. They may be too obvious for serious James fans, but it's a good introduction to the band.
I really enjoyed "Jessica Jones" season 1. Season 2 came out on Thursday and I continue to enjoy it. Beyond the characters and the plots, though, I am fascinated by Jessica's bourbon and whiskey choices.
If you don't know the show, Jessica is a private eye with a lot of past trauma and she drinks . . . a lot. Like, to crazy excess, usually to forget stuff or deal with stress. She often has hangovers but rarely seems drunk, even after drinking an entire bottle in an evening. They don't mention it, but I suspect that since she has super powers she has super tolerance too. Either way, getting the headache but not the buzz seems like a pretty shitty deal for her.
Her brands are what interest me most. Jessica is a brown liquor woman, but she was all over the map with her whiskey choices and I can't watch an episode without noticing what she's drinking and wondering why she, or, rather, the producers, chose it.
In season one she had a different brand every episode. Sometimes multiple brands an episode. Sometimes it was scotch, sometimes bourbon, sometimes Canadian. She occasionally drank some fictitious brands from the prop department. The real products came from multiple distillers. In light of all of that I don't suspect that any of those bottles were there by virtue of product placement.
If it was product placement it was pretty crappy product placement for the distilleries involved. For example, in one episode she asks a convenience store clerk for "the cheapest you got." He sells her Wild Turkey 101, which is not the cheapest he or anyone else has. I doubt Wild Turkey would like to have 101 portrayed as rotgut if it was paying to have its bottle featured. In the next episode she's drinking Old Grand-Dad, and earlier she drank Beam, Teacher's and freakin' Cutty, so she obviously does know where to get cheaper stuff. She's a detective!
For the first two episodes of season two, she drinks only Tin Cup. Because of the exclusivity, and because the bottle and its label are shown so prominently, I suspected that Tin Cup had paid for exclusive rights to the much more anticipated Season 2. But . . . nah. In episode three she's back to Four Roses yellow label. Again, though, if Tin Cup did pay for that placement, they may not care for how it was used. Jessica drinks it like water -- at one point she literally fills a 10-ounce water glass with the stuff, straight up, and chugs it -- and at another point she has a nightmare in which she's hooked up to a Tin Cup IV, the bourbon flowing straight into her veins. There's no such thing as bad publicity, I guess, but I feel like a distiller wouldn't want its brand being used explicitly to show how much of a problem drinker a character is. "Drink Tin Cup: the preferred brand of functioning alcoholics everywhere!"
If it isn't product placement, I don't understand all of the switching. Sure, a whiskey enthusiast may get a different bottle every time, but Jessica isn't someone you'd call a whiskey enthusiast. She's a drunk. Or at least a wannabe drunk. I've known some drunks in my time. If they're like Jessica and they are (a) functional; and (b) at least make a passable living, so that they don't have to take whatever they can get, they tend to have brand loyalty. Or at least price point loyalty. Even if they do change up brands, they don't bounce from bourbon to scotch to rye the way she does.
Last season some sites like Buzzfeed kept track of what she was drinking. I am only three episodes into season 2 -- it's a treadmill show for me, so it's a one-a-day thing, not something I, uh, binge -- but I'm gonna continue to keep track myself. I'm more fascinated by this than I am by the shady forces Jessica Jones is fighting. She'll beat them in the end. I have no idea what's gonna happen with the next bottle.
In the past week President Trump, first through a spokesperson, and then personally, demanded that United States citizens lose their jobs because he does not agree with their political views.
We can disagree about the underlying issues which led to him saying this. We can debate the nature of protest and the mode and manner of expression of views with which he takes offense. We can discuss the propriety of sports figures wading into non-sports topics. No matter where you come down on any of that, however, we are left with the President of the United States saying people should lose their jobs because he does not agree with their political views.
No one, no matter their views about the protests or comments of athletes, should find this acceptable. Whether one holds far right or far left views, every last American should find it abhorrent that a government official, let alone the most powerful government official, is demanding people's jobs because he does not like what they believe.
This is not a controversial assertion. It is not a close issue. It is, perhaps, the most basic and fundamental issue there can be when it comes to our rights and our liberties as Americans under the Constitution. It is the entire goddamn point.
This afternoon Angela Ahrendts, Senior Vice President of Retail at Apple Inc., said of Apple stores, “we don’t call them stores anymore, we call them town squares, because they’re gathering places.”
Ahrendts' comment could simply be written off as hubristic marketing-speak, but to me it's an unwittingly sad comment about how, in the current age, a luxury goods store can and does serve as a rough proxy for a public square and how, concurrently, civic society continues to be degraded.
While a small number of very rich people have always been able to keep themselves separate and apart from the masses, a larger and larger number of people are using money, technology and education to insulate themselves from the sort of everyday life all citizens once lived. Elite status, VIP sections, priority lines, “Cadillac” healthcare plans, private schools and all manner of other luxuries available to the professional and technological classes create a situation in which a larger swath of the well-educated and at least moderately well-to-do have created what amounts to a separate class apart from the rest of the country. A class that carries with it insidious assumptions, conscious or otherwise, that the affluent and educated are demographically superior to the poor. Or, perhaps, that the affluent and educated are the only people who even exist.
While, admittedly, there has always been some semblance of a class system in this country, the instances in which people come together in common spaces -- in train stations, post offices, hospitals, libraries, public schools, museums and retail spaces -- have decreased dramatically. What's more, there was once a time in this country when the class divisions we had were denied and diminished out of either shame or idealism born of the notion that the United States is not a class-based society. Today that conceit has been disposed of almost entirely, with “success” being increasingly equated with one's ability to buy one’s way out of the public sphere altogether.
We live in isolated and increasingly homogenous and cloistered communities. We have made it so that those with access to the gifts of the technological age can do their shopping, their banking and their interaction with the government via electronic means without ever having to encounter the general public or, at the very least, the part of the general public unlike themselves. The increasing power of a small handful of technology companies is exacerbating this trend, turning even basic acts of life, such as buying groceries, into a class-based pursuit.
As a result of all of this, the public sphere of life has broken down in many important ways. We do not come together as a society across economic classes in anything approaching the way we did even as recently as the early 1980s, let alone the way we did in previous decades. This is bad for democracy and social health because, when we do not interact with the whole of society in meaningful ways, we are no longer truly stakeholders in the whole of society. We are, at best, voyeurs, intellectually lamenting that which has befallen our fellow man, yet never truly invested in it. When you encounter those in different circumstances than yourself only virtually, you can simply click away. Or you can just choose not to click in the first place.
Which brings me back to Apple. The nearest Apple store to me is in a place called Easton Town Center. It's a mall, but one of those outdoor malls that apes a cityscape, built on what used to be farmland out by the freeway outerbelt. There are storefronts and parking meters and sidewalks and all of that, but it's all private property. While it's a fake city, it holds the sorts of community events -- Christmas caroling, arts fairs, outdoor performances and the like -- that once took place in my town's real public spaces. Except they're not truly community events given that no one has much business being there unless one is shopping or dining out at one of the luxury goods stores on its premises, and that's obviously not for everyone. And, of course, since it's private property, they can kick out anyone they want to for basically any reason or for no reason whatsoever.
Which certainly puts Apple's claim that its stores, a great number of which are located in places like Easton, are "town squares" in a different light. A light that is sadly telling of what our society has come to in this day and age.
Some people who take an interest in genealogy discover that they are Irish when they thought they were Scottish. Others find a long-lost cousin. When I began looking at my family history I found out that my great-great grandmother murdered my great-great grandfather with an axe on a snowy winter's night in Detroit, Michigan in 1910.
Nellie Kniffen's violent rampage and her husband Frank's grisly demise were front page news in Detroit for several weeks, but she and her crime were soon forgotten, both by the public and by her family. Those who remembered it tried hard to forget it and those who came after knew nothing about it at all.
Through research of public records, personal interviews and a review of the sensationalistic newspaper stories written before Frank Kniffen's body grew cold, I unearthed a chapter which had been torn out of my family's history. And I began to better understand the ghosts and demons which have haunted my family for over a century.
The story of Nellie and Frank -- Nellie Kniffen Took An Axe -- is available as a Kindle eBook for $2.99.
Last night I saw Brian Wilson perform at the Palace Theater for his “Pet Sounds: The Final Performances” tour. It’s amazing that Wilson is touring at all given his history of mental illness, drug abuse and, now, as he approaches 75, physical decline. But he's still doing it. And doing it well.
His voice is still recognizably his voice. He's not like Bob Dylan or Tom Waits or someone who has had to become a fundamentally different kind of singer than he once was due to ravaged vocal cords. The same old tone and timbre of 1960s Brian Wilson is there. He may not tear into the second verse of "I'm Waiting for the Day" with that aggressive edge so evident on the album version, he doesn't sustain notes like he did when he was young and, yes, he occasionally hits a clunker, but he's still unmistakably Brian Wilson.
If anything, the flaws in his singing enhance the experience of seeing him live. He's not a jukebox full of oldies like some other artists are. And the mere act of him being on that stage singing those songs elevated his performance above that of his dwindling contemporaries and vanishingly few artistic peers.
I don't begrudge The Rolling Stones, the Who or Paul McCartney going on the road and playing concerts into their 70s. They're legends, people love them and their music, they put on great shows and, of course, they're more than entitled to make money off of the art they created. But there is something . . . off about it. There is something off about Mick Jagger singing about how he can't get satisfaction when we know he's rarely had anything but satisfaction for the past 50 years. There's something silly about Roger Daltrey singing that he hopes he dies before he gets old when he's already old. There's something downright creepy about a wrinkly-faced Paul McCartney telling us that we know what he means about that girl who's just 17.
One might think this problem would be even greater for Wilson doing "Pet Sounds," actually. The album, the predecessor of the record he famously envisioned as "a teenage symphony to God," is about young romance. About that moment when teenage love changes from butterflies in one's stomach to one's first feelings of melancholy. It evokes emotions common to anyone who has ever experienced love, but they're feelings unique to a certain time and place in our lives that we never again recapture. That's not the stuff one would expect to wear well when sung by a 70-something-year-old man. Despite his age, however, there is something poignant about Wilson singing from the point of view of his younger self that is absent when others do it.
Ideally art stands on its own, without the audience bringing their own knowledge about the artist with them, but that's next to impossible when it comes to Wilson. We know what his life was like at the time "Pet Sounds" was recorded. We know how much more difficult it would become in the two decades-plus after it came out and how damaged Wilson came out on the other end. Jagger, McCartney and Daltrey all had personal ups and downs of course, but compared to Wilson they've lived pretty happy and contented lives. In light of this, their taking to the road seems like a pleasant but somewhat superfluous and undoubtedly commercial act.
Wilson didn't have the same sort of happy and contented second and third public acts as those guys. He's never gotten the chance to connect with his old songs and his old fans in the same way they have. And given that his former bandmates Mike Love and Bruce Johnston have long toured as The Beach Boys, playing the biggest Beach Boys hits in theaters, state fairs and other venues with relatively low ticket prices, a lot of his old fans might not even care too much anymore. They've seen what they wanted to see for the most part. As such, Wilson's act of singing his old songs -- these particular old songs, which were never as commercially successful as the stuff Love and Johnston perform -- seems more personal to him. More important and significant.
While I'm likely projecting to some degree, as Wilson sang through "Pet Sounds," it seemed as if he was reaching back through time for something necessary. Something he didn't get to fully enjoy and explore at the time and something he finally can now as opposed to simply putting on a show. He may have played these songs or things like them over and over again in his home, but on this tour he's getting to play them with a full band -- he had ten backing musicians and singers, including original Beach Boy Al Jardine and one-time Beach Boy contributor Blondie Chaplin -- forming those harmonies he's on record as saying are his favorite parts of his songs.
It was a great show for us, but you can't help but feel it's a rewarding and perhaps necessary act for Wilson to play these songs. Necessary in ways it's simply not necessary for others to play their old songs. This may have been most evident in the opening and closing of the show when, as a warmup/encore, he played some of the more popular Beach Boys songs like "Help Me, Rhonda" and "California Girls," with Jardine doing an admirable job with the Mike Love vocals. They were fine, but somewhat rote. The crowd stood, cheered and sang along, but Wilson seemed to be going through the motions with them to some degree. The big hits don't seem particularly important to Wilson.
The "Pet Sounds" songs, which were played in order, in their entirety after an intermission, felt more moving and stirring. And not just because they’re better songs. It's because, with the possible exceptions of "Wouldn't It Be Nice" and "Sloop John B," a lot of people don't know all the words to a lot of them. And even if they do, they're not exactly jukebox singalongs. They provided an opportunity for Wilson to sing and perform for us in ways that McCartney and the Stones can't without resorting to an obscure R&B cover. People know "Pet Sounds," of course, but it's not back-of-their-hand stuff like "Maybe I'm Amazed" or "Brown Sugar." Wilson was reacquainting many in the crowd with those songs just as he was revisiting them himself and the net effect of it was stirring.
Stirring in and of itself, but also stirring because the man singing the songs is, in 2017, still here. Against all odds, he's still here. Reaching back for something one gets the sense he loves and needs just as much if not more than any of us.
In the wake of the Meryl Streep thing, Republicans are saying that this proves the left is out of touch with typical, working Americans.
The last four Republican presidents, by the way, have been (1) a billionaire reality show host; (2-3) scions of one of America’s wealthiest, most patrician east coast Ivy League dynasties; and (4) a literal Hollywood actor. So maybe spare me.
Anyway, I just read the Streep speech. To be offended by it you have to be either pro-mocking the weak and powerless or just tribally pro-Trump. There’s no partisan politics in it. There’s nothing about economics or policy separate and apart from “be kind to the weak and powerless.”
If that’s what passes for controversial, we’re basically screwed as a society.
There’s a hashtag thing going around Twitter now – #fav7films – via which people list their favorite seven movies. Here’s my brief list. I’d say three of the slots are subject to change at any given time, but this is the list now:
1. The Conversation: It’s held at number one for a long time now. It’s a nearly perfect slow burn/psychological thriller on its merits, but as someone who often catches himself observing the world more than actually living in it, it resonates with me a bit more than most movies do.
2. Zero Effect: I like it for some of the same reasons I like “The Conversation,” though it’s obviously goofier. But it’s nowhere near as goofy as it seems on first look. There are some deceptively deep psychological waters being explored here and Bill Pullman, Kim Dickens and Ben Stiller all hit the perfect notes as they explore theirs.
3. Casablanca: It’s not all psychological crap for me. Sometimes you just gotta be entertained by some perfect old Hollywood romance, drama and humor. This may be the most perfect blending of all three in cinematic history.
4. Miller’s Crossing: It may not be the “best” Coen Brothers movie – “Fargo” is probably a better movie all things considered – but I’m a sucker for their more affected works for some reason and this one, while crazily affected, is just a joy to watch and quote over and over again. I like to think of Tom Regan as The Big Lebowski’s grandfather.
5. Chinatown: As a general rule, I like my heroes to only have half a handle on what’s going on until the very end, even while fighting like crazy to come out on top. And even once they get a handle on it and the plot resolves itself, I like them to still be perplexed by everything that happened and unsure what will happen next. Life is sort of like that. Forget the less-than-memorable sequel. I prefer to think that Jake Gittes was a profoundly changed man after what went down in this movie. It’s one of the rare pieces of hard boiled detective fiction where the detective takes the journey and doesn’t keep his cool detachment, even if it’s subtle here.
6. Dark City: I could make separate top-7 lists for detective movies, psychological thrillers and sci-fi. But all three of them landing in one movie like this makes it a great proxy for all three lists.
7. Eternal Sunshine of the Spotless Mind: This is an odd one, I realize, and on purely cinematic terms it’s no masterpiece. It’s a very personal choice for me, however, and I have it here out of respect for what it means for me more than for what it is. I’ve written about it before, but this movie hit me at a perfectly imperfect time in my life when the decision between trying to deal with bad experiences vs. trying to utterly deny and obliterate them from one’s memory was more than just a theoretical one for me. It’s still something I struggle with.
Anyway, sorry to anyone who was expecting to see “The Godfather,” “Citizen Kane” and “Goodfellas.” They’re all good too, though.