In the last year we've learned all too well that Donald Trump is seemingly immune from scandal and impervious to shame. Not a week goes by that he doesn't say or do something that would end anyone else's political career. We've lost count of the things which have happened in and around his personal and political orbit that would stop a politician's agenda dead in its tracks. At the moment his top advisors are being investigated for conspiring with Russians to subvert our democracy. That's something which got people executed in my parents' lifetime.
Despite this, the Trump train, however slowed in the public opinion polls and however assailed by the public opinion pages, continues to chug, more or less, forward, announcing policy initiatives that will likely pass and which will shape our country for decades to come. No one in a position to stand up to Trump and say "no" seems willing to do so.
A little over a year ago I wrote a post about the troubling manner in which politicians and public figures talk about complicated subjects. About how they seem to increasingly rely on anecdote and references to their personal experiences when addressing matters of policy, ethics or morality rather than on facts or ideas. About how, for some reason, they could not talk about, say, sexual harassment without referencing their "wives and daughters" or they could not talk about taxes or social policy without making reference to some local farmer or businessman who would be affected.
On some level I get why they do this. People like stories and first person accounts. We respond to them well. On a geologic scale we're barely removed from a time when oral tradition was our primary means of understanding the world, so it makes sense that we respond to personal appeals.
Our public discourse seems to have gone too deeply into the personal and anecdotal, however, to the point where tales, rather than facts, data and ideas, have come to dominate the conversation. Yes, my friend, I'm glad that you care about the advancement of women now that your daughter is getting her MBA, but can we talk about the advancement of women who are not your daughter? Sure, I suppose I'd be curious to know how this new regulation personally affects Joe Smith from East Alton, Illinois, but it's probably more important to know what it means in objective terms -- defined by facts and figures -- for the country as a whole, wouldn't you agree?
The point of that essay was that we spend too much time creating narratives when it comes to public life and policy, often baseless ones, and not enough time thinking. We spend a lot of time talking about our feelings too -- using the language of anger or personal offense for the most part in recent times -- but we do it in a rather self-centered way, lacking in empathy for anyone beyond ourselves or our immediate circle. That's an acceptable way to run a village, maybe, but it's no way to run a country of over 300 million people.
I wrote that post a month before Donald Trump was elected, in response to the 2005 "Access Hollywood" video in which he bragged about sexually assaulting women. I wrote it because everyone asked to comment on it referenced their wives and daughters and did little if anything to say, full stop, that such behavior should be condemned as unacceptable even if you don't have a wife or a daughter. Little did I know when I wrote it that such a scandal -- the sort of scandal which would definitively wreck any politician who came before him -- would be a mere footnote on Trump's way to claiming the presidency, regardless of how many wives and daughters were invoked.
Little, also, did I know that what would transpire since the election would validate a warning I first heard 26 years ago, which explains both that which troubled me a year ago and that which is transpiring today.
Back at Ohio State, in the early 1990s, I had a history professor named Alan Beyerchen. I wasn't a history major -- ours was merely an intro to western civ class and most of the curriculum was outside of Professor Beyerchen's specific area of research -- but he was more engaging and enthusiastic about teaching freshmen and sophomores than most professors I ever had. He'd often digress from the day's lecture to talk about larger cosmic issues. One he hit on, time and again, was about how history is animated by its actors defining their personal identity in opposition to that of their enemies (people proclaiming that which is "self" vs. that which is "other" explains oh so much over the centuries). Another one of the big cosmic issues he talked about was how, in his view, we seemed to be on the verge of entering a "high tech dark age."
Beyerchen seemed focused on what he saw as the then-primordial information age's attack on literacy and personal agency. He worried about us moving away from writing and books -- he was particularly upset at how poor his students' writing skills were, mine included -- and suggested that computers, and the ability to edit without a lot of hassle, were partially responsible. He talked about the prospect of virtual communities supplanting real communities, the ethical hazards of technological advances (which then, as now, were so often promised to be benign) and what all of that might mean for an enlightened civil society. He wasn't necessarily alone in these preoccupations, of course. A lot of people were worried about that stuff then, albeit on a much more superficial level than Beyerchen was. Just look at the science fiction of the mid-90s and all of its virtual reality and Internet panic as evidence.
It went deeper than that for Beyerchen, though. He wasn't some guy merely grousing about technology and all of its alleged perils. For him the most serious risk of the coming high tech dark age was an epistemic crisis. A crisis in which, due to the waning influence of the institutions that characterize enlightened society -- libraries, universities and governmental bodies run by and for a literate, educated and engaged populace -- simply agreeing on what truth and knowledge and information are would be a challenge. If knowledge were no longer etched in stone but transferred via ephemeral means, would it not risk becoming intangible? Mutable? And if it did, what value would it truly hold for people?
Once you're in that situation -- a situation in which people find it simple and even preferable to disagree on basic facts -- truth itself is a malleable concept. Once human beings aren't sure what is true, they tend to revert to superstition and fear. Once you have a population of fearful, superstitious people who don't know what is true, those in power are able to warp reality even more and are able to exert control over them more easily than they already do. If the people are afraid enough, they'll be quite happy to allow it.
That, for all practical purposes, is the definition of a dark age. It's a dark age even if we have a lot of shiny technology and even if we've eradicated the plague.
This afternoon I read something which makes me believe that the epistemic crisis which would usher in a new dark age is already upon us. It's from David Roberts at Vox, and it describes the way in which the right wing political and media establishment has rendered facts malleable and increasingly meaningless:
Roberts' concern: that Robert Mueller's investigation will prove a case of Donald Trump's participation in an illegal conspiracy to subvert our political system and that no one will do a thing about it. That the Republicans in charge of the legislative branch will shirk their responsibilities to check the executive because they fear political reprisals from a base that is intoxicated with a cocktail of misinformation and anger, served by the right wing media establishment.
It sounds right. It's not driven solely by technology, the way my old professor worried it might be, though technology plays a significant role. Mostly it's driven by a craving for power and an utter lack of scruples or shame. Any way you slice it, it sounds like the stuff of a new dark age.
Last night at dinner, my kids -- who are always online and always see everything -- mentioned President Trump's irresponsible threats of nuclear war. They're bright kids who, I suspect, are about as well-informed as any other 12- and 13-year-olds, so they know the general outline.
I remember being pretty freaked out at the brinksmanship of the Cold War and, of course, "The Day After" scared the living bejesus out of me when I was around their age. So, despite their relative savviness and maturity, I was nonetheless cautious about how I talked about it, not wanting to upset them.
Then my son said, "I wonder what the last meme will be before the world blows up?" and he and my daughter began laughing their heads off about it. When I woke up this morning I saw that my daughter had sent me this, answering her brother's question.
If the planet does survive long enough for my kids to reach adulthood, it will be powered with disaffected irony. Not great, but I suppose there are worse things.
Yesterday, the Trump Administration placed a gag order on the EPA, prohibiting it and its employees from talking to the press, tweeting, or releasing any statements, facts or studies. The reason, without question, is so that its new leaders appointed by Trump can introduce their package of anti-environmental nonsense into the agency -- alternative facts, if you will -- in order to better accomplish their not-so-secret mission of gutting environmental enforcement and regulation in this country.
This afternoon the person who runs the Twitter account for Badlands National Park tweeted about the environment -- carbon in the atmosphere and acidity in the oceans in the industrialized age. The tweets were not arguments or partisan. They did not include conclusions or prescriptions about policy. They were simple recitations of objective scientific fact.
There is a very good chance that this person will be fired as a result. When they are, it will likely be attributed to some administrative violation or the notion that any subject apart from the national park itself is outside of their mission. But we know what it will really be about. It will be for committing the sin of stating a fact in public that reveals the intellectual bankruptcy of the beliefs of our new government.
The writing of these tweets, which I predict will be deleted by some middle manager following orders, will constitute a career-forfeiting act of dissent and their author will, in turn, become a hero for writing them.
We live in a mad age.
UPDATE: The rogue tweets have now been deleted. They still live on in this screencap:
We’ve heard much lately about how liberals live in a bubble. About how they fail to understand people different than they are and, to the extent they do have impressions of conservatives, middle Americans and the working class, they come by virtue of caricature and exaggeration via stereotype and pop culture. They are told that they do not understand “Real Americans.”
It's a two-way street. Let's talk about both ways. And let's talk about our lack of community despite a fully-connected society.
I sort of owe my career to Andrew Sullivan. Not in any direct way. He doesn’t know who I am and never did anything to help me get a job. But he and other web-based political writers who flourished in the early 2000s provided a model for me.
The model was basically:
I wrote a web column covering national topics in 2002 and 2003 and didn’t think of it as a blog, but looking back at those old bits, they were basically blog posts. After a hiatus I began again in 2007. While there were several baseball bloggers around then, they were mostly team-specific or didn’t post as frequently as I did. While I respected their work and still do, I didn’t really emulate any of them. No, by 2007 I was consciously aping the political blogging style, only about baseball.
I modeled myself particularly closely on Andrew Sullivan. While I did and still do disagree with him politically on a whole host of issues, there was a lot about his style that appealed to me. He wrote in the first person a lot and did not hide the fact that he was a human being with his own interests. While he was and still is accused of completely reversing course on various topics, he didn’t really care, noting that changing one’s mind upon encountering new information or simply reconsidering old topics was a sign of intellectual strength, not weakness. He was, with some rather notable exceptions, more self-aware than a lot of his peers and knew that some of his readers wouldn’t care about whatever hobby horse he was riding at any given moment yet still kept riding them anyway.
A lot can be written about some of the awful arguments and positions Sullivan has taken over the years, but his approach as a blogger always appealed to me. Emulating it in a baseball context set me apart from my peers. I wrote more, wrote more quickly, more frequently and covered a wider array of topics than most people in the baseball blogosphere. To the extent I was able to leverage two years of independent blogging into a larger platform at The Hardball Times and then, later, at NBC, it wasn’t because I had a ton of friends in the industry or because I networked. It was based almost exclusively on being that weird lawyer baseball dude who updates constantly and talks about everything. It was because I was the baseball Andrew Sullivan. I owe a lot to him, even if he doesn’t know it.
Though I stopped reading Sullivan on a regular basis several years ago, I was sad to see that he quit blogging in 2015. And I am sadder still to see what he wrote today in New York Magazine:
I Used to Be a Human Being
In the article, Sullivan talks about how he burnt out on blogging and all of the online reading, reacting, arguing and writing it requires. About how posting every 20 minutes and obsessing over every twist and turn in a news story, often before anyone even knew what the story was, caused him to crash. His personal health was a factor as well – he suffered from multiple respiratory infections – but his “living-in-the-web” lifestyle, to use Sullivan’s term, was his real problem. He says it took a massive toll on his health, his personal relationships, his intellectual capacity, his writing skills and style and maybe even his sanity. This is, quite obviously, not ideal, and I’m glad that the internet detox on which he has embarked and the meditation regime and sabbaticals and everything else he has done has been good for him. Real life matters far more than four paragraphs of thoughts hastily posted to WordPress.
Of course, it would not be an Andrew Sullivan article if it didn’t include some broad overstatement, generalization and projection of his own feeling and experience onto the rest of us (an occupational hazard of all bloggers, but one which dogged Sullivan more than many). And here it is: too much technology and time online was not just something that harmed him, he says. It’s the scourge of the entirety of 21st century civilization:
Just look around you — at the people crouched over their phones as they walk the streets, or drive their cars, or walk their dogs, or play with their children. Observe yourself in line for coffee, or in a quick work break, or driving, or even just going to the bathroom. Visit an airport and see the sea of craned necks and dead eyes … this new epidemic of distraction is our civilization’s specific weakness. And its threat is not so much to our minds, even as they shape-shift under the pressure. The threat is to our souls. At this rate, if the noise does not relent, we might even forget we have any.
With all due respect to the man on whom I’ve modeled my career: this is fucking bonkers.
I will grant that the manner and degree to which technology has changed our lives in a very short period of time is, frankly, staggering. I’ll grant that all of us could use more time unplugged and offline and away from screens than we spend.
I’ll likewise grant that people in Sullivan’s line of work are particularly susceptible to being crushed in the manner he describes. I have never been quite as immersed in the “living-in-the-web” lifestyle as Sullivan was, but doing what I do for a living, as obsessively as I do it, from home, usually alone, I am likely on the far right portion of the, ahem, Bell Curve when it comes to full Internet immersion. I have overindulged at times. I have had loved ones tell me, hey, you need to unplug, get off of Twitter and close the laptop for a bit. It happens to most of us, especially if we work online.
But Sullivan’s article reads like a harangue from a recently sober alcoholic, convinced that everyone else is destined to fall victim to demon drink simply because he did. Its calm and measured tone just barely hides what’s really being revealed here: a man with poor work-life balance skills blaming technology for what befell him, as opposed to his own inability to unplug and pace himself.
Sullivan talks about how he posted seven days a week, every twenty minutes. I remember when he did it and it was insane. I used to do something close to it. It was five days a week for me and it was every thirty minutes – with my blogging partners chiming in once or twice an hour to give us close to the same frequency as Sullivan’s blog – but it was pretty similar. It was also entirely unsustainable, both in terms of content – there really isn’t enough good stuff to write about 40 times a day – and, more importantly, in terms of the writer’s stamina.
Eventually, I ratcheted back a bit. Instead of writing 20 things a day I wrote 12-15. Many days now I don’t write even that much. Partially because blogging has changed a bit over the years and partially because I have people who work for me whom I trust to handle nights and weekends and those times when I have life to live and errands to run. Mostly, though, because I realized a few years ago that there was no way I could continue that pace into my 40s while still being a sharp thinker, a present father and an all around healthy person. I still write more than most people in my field, but I write way less than I did a few years back. Both I and my writing are better for it and my readers have not complained about it.
I’ll grant that baseball is not as important as politics, but Andrew Sullivan’s blog was not defending us from invading hordes or keeping Democracy alive single-handedly. No matter how important the underlying subject matter, no one was ever going to save the world with a blog post. At the very least the world would have survived for a few short hours if Sullivan had taken his husband out to a nice dinner during the Green Revolution or if he had unplugged one night and read a good book in 2008 rather than writing yet another post about Sarah Palin’s baby.
Ultimately, reading and writing about crap on the internet is a job. It can be an extraordinarily immersive job. One that, if you’re not careful, can cause you to lose yourself. But still a job. If Sullivan wasn’t killing himself with this job, I strongly suspect he would’ve been killing himself with another one. I suspect he’s just wired that way.
One final point: Sullivan’s article is illustrated with famous paintings, photoshopped to show their subjects using cell phones, such as Edward Hopper’s “Hotel Room,” at the top of this post. It’s cute, and you can see what he and his editors are getting at with the little joke. But it also proves too much.
Most of Hopper’s best works portrayed subjects who were isolated and lonely and detached. Amazingly, something besides the Internet was to blame.
The other day, a day before a Yankees playoff game, pitcher CC Sabathia left his team and checked into alcohol rehab.
If this had happened in, say, 2009, I am 100% certain that someone – a columnist, a radio host or a TV talking head, and maybe several of them – would’ve talked about Sabathia’s timing being wrong and about how he was quitting on his team or letting them down. And, of course, we would’ve no doubt heard some ignorant things about the nature of alcoholism and Sabathia’s weakness and toughness and stuff.
But generally, the opinion was this:
Good for CC. Glad he’s getting the help he needs. Baseball is not as important as one’s health and family. Thoughts, prayers and hope for him in these no doubt trying times.
This is a very good thing. Good perspective and evidence of an admirable empathy on the part of the commentariat. Empathy and perspective that, when I started writing about sports professionally six years ago, would not, I think, have been anything close to uniform. Indeed, I question whether it would’ve been even a couple of years ago. We’ve come a long way.
Of course, even if the bulk of the professional commentariat has evolved on points like these, there are always going to be some sports fans who treat athletes like gladiators and get all pissed if they actually show human qualities. So in the wake of the Sabathia news I, not surprisingly, heard and read several people saying things about his bad timing or his weakness and who otherwise saw this only through the lens of their entitlement as sports fans as opposed to a lens of empathy for a human being going through a rough patch.
This sentiment came in the form of tweets and comments on blog posts. There weren’t a lot, but there were a decent enough number to where it can’t be said that only fringe loonies feel this way. I’ve been in the internet sports business long enough to tell the difference between fringe whackos and the merely misguided. This was the latter. And, as I often do when I encounter some misguided sports sentiment, I engaged with it. I responded to some. Tweeted in general about it a bit. Retweeted some of it to put the speaker on display and open them up to a wider audience so they’ll be forced to either defend or reconsider their views. A pretty standard practice in the world of internet sports arglebargle.
In response to this have come two columns taking issue with me and others who do this. One from Tom Hitchner, a reader and follower of mine and a blogger in his own right. Another from Dan Brooks, who followed on Hitchner’s. The upshot of both of their essays was that it’s wrong for professional writers with big followings like me to call out unpopular or misguided sentiment from random people on the Internet. Tom characterizes it as me “punching below [my] weight,” and asks “[d]o we need someone with Calcaterra’s credentials, audience, and power of expression to step in and crush [the commenter] like a bug? Whose side does that elevate, Calcaterra’s or [the commenter’s]?”
Brooks is more pointed in his distaste for calling out the arguments of the non-professional and non-widely-followed Twitter account or internet commenter. He says that to do so encourages judgmental sentiment and is an exercise in “the worst kind of piety:”
There’s value in refuting popular, wrong arguments. It’s not to my taste, but I’m willing to concede there might be moral strength in calling out people for believing wrong stuff. But looking for unpopular immoral arguments—the kind of arguments that need a search bar to find—so you can publicly rebuke them is the worst kind of piety. It’s the intellectual equivalent of being a pharisee. Punch your weight, as Hitchner says.
Each of their essays has a lot of good points to consider. But each of them presents something of a paradox as well. They tell us that (a) the practice of highlighting arguments advanced by non-professional types with few followers is illegitimate, because (b) it’s judgmental. But is that not itself elitist and exclusionary? If you take the substance of the argument out of the equation, are you not saying “you’re not worth listening to unless you have a certain status or a certain set of credentials?”
I won’t put too fine a point on that because I have a more important point to make here, but suffice it to say that getting into the business of deciding who is and who isn’t worth listening to based on their platform or follower counts can be a pernicious business. A business that, if everyone took to, would’ve kept me and many, many others from ever becoming sportswriters in the first place.
But there’s more to this than merely taking a democratic approach to internet and social media commentary. This goes to the nature of that commentary itself and the understanding that these seemingly random and bad opinions do not exist in a vacuum.
One thing I agree with Tom and Dan about is that there is a certain groupthink that exists these days, particularly on sports Twitter. There is in-group signaling and some thought-policing that happens. I don’t think it’s some toxic, “politically correct” sort of business in the way it is often described – if you don’t care about losing some followers you can say a lot of gonzo shit – but it’s certainly the case that the platform’s dynamic pretty quickly singles out and disapproves of less-than-widely-accepted sentiment, to put it as neutrally as possible.
But just because that sentiment is singled out and disapproved of among a certain class of Twitter users doesn’t mean that sentiment is refuted or diminished, let alone eliminated. Indeed, all that dynamic really does is cow people who are in the media business and who are more sensitive to public opinion than most. People like Twitter-savvy writers, TV folks, radio hosts and the like. Inasmuch as those folks are influencers of opinion, yes, it likely means that their, um, suddenly more enlightened commentary causes readers and listeners to follow their lead to some extent and, for lack of a better term, think smarter about stuff. But to believe that the Twitter-savvy media’s greater sensitivity to stuff like Sabathia’s alcoholism means that people in general are more sensitive is to greatly overstate the influence of the Twitter-based sports commentariat.
Tom and Dan characterize the seeking out of unpopular opinion like some sort of archeological expedition, but it’s nothing of the sort. One need not dig down deeply to find people who think Sabathia is some worthless drunk who quit on the Yankees. They’re all over the place. My comment section at HBT is full of ‘em. Just because the folks saying these things don’t have little blue checkmarks by their name doesn’t mean they’re buried under the dirt. These are people echoing the sorts of sports opinions you hear at every bar, every office water cooler and on every call-in radio show in the country. And, even if they aren’t as immediately visible on Twitter, there are WAY WAY more of these people than there are of well-followed and allegedly influential sports bloggers like me.
Look at the most popular sports shows and personalities in the country, and who do you see? Stephen A. Smith and Skip Bayless spouting 95% garbage at “First Take.” Colin Cowherd doing, well, Colin Cowherd things at Fox. Mike Francesa at WFAN and the scores and scores of talk radio hosts who followed his lead into the business trading, for the most part, on the lowest common denominator. These people dominate the non-Twitter, non-blog portion of the sports discourse. And that’s the vast majority of the sports discourse.
These guys aren’t popular and highly paid because no one listens to them. They’re popular and highly paid because LOTS of people listen to them. People who don’t have a lot of followers on Twitter but who spend a ton of time consuming sports and sports commentary. Each of these personalities has orders of magnitude more influence than the allegedly right-thinking folks on sports Twitter do, and all of them together render the notion that someone like me is squashing anyone like a bug – to use Tom’s phrase – laughable.
In our little Internet/Twitter bubble, we don’t see a lot of them. They may comment on the occasional blog post but generally do not. They may respond to the occasional tweet, but generally do not. But they are out there. In great numbers. And no matter how far sports discourse has come in the past several years, they still dictate the shape of sports discourse as a whole. They’re the ones who allow the Stephen A. Smiths and Skip Baylesses of the world to make the nice living that they do and who continue to make sports a safe place for bros and neanderthals who, even if they didn’t slam CC Sabathia all that much this week, spend a lot of time spewing misogyny about Jessica Mendoza. Or offering thinly-veiled racial critiques of Latino ballplayers. Or who, when a football player is suspended for domestic abuse, immediately worry about what it means for their fantasy team as opposed to the human beings affected.
When someone like me challenges those folks it may seem like I’m punching below my weight. But I prefer to see it as taking on a far larger, far more formidable fighter and working the body a bit before going for the head. And as a wise man once said: kill the body and the head will die.
There was a little dustup this morning in the baseball blogosphere. Background: outfielder Josh Hamilton, a drug addict who has authored an amazing story of personal and professional redemption, recently relapsed. It was minor and his playing career will now resume, but it created a rift with his team, the Los Angeles Angels. Yesterday the rift was resolved when the Angels traded Hamilton back to his old team, the Texas Rangers. You can read the background of it all here if you care.
Last night an Angels blog called Halos Heaven wrote an ignorant and hateful good-bye to Hamilton. It was vile, even by the standards of the worst parts of the Internet, essentially predicting – and, it seemed, almost wishing for – Hamilton’s decline and death due to drugs. The company which hosts Halos Heaven – SB Nation – removed the blog post this morning, but you can see it preserved for posterity here. UPDATE: SB Nation and the guy who wrote that blog have “parted ways.”
This all interests me from the perspective of someone who, like the Halos Heaven guy, writes a baseball blog at a larger media company. A “vertical,” to use the parlance of the business. Just as that guy runs the Los Angeles Angels “vertical” for SB Nation, I run the baseball “vertical” for NBC. The biggest difference is that SB Nation’s entire model is verticals, essentially – they have hundreds of them across multiple sports – whereas NBC has a small number under the “Talk” brand like ProFootballTalk, HardballTalk, ProBasketballTalk, etc. while still doing lots of other things. Still, not terribly dissimilar in theory. And not uncommon in online media today. It’s been around quite a while.
The vertical model is useful. And robust. With it, a large media company can cover a lot of ground it wasn’t otherwise covering. People who use words like “scaleable” call this a “scaleable” model. (note: limit your interaction with people who use words like “scaleable” a lot). As opposed to having some central editor back at corporate actively managing and gatekeeping coverage in a zillion different disciplines, you get some “experts,” for lack of a better term, delegate and let them do their thing with much less day-to-day supervision.
But there are tradeoffs, of course. When you delegate you take risks. A big risk of the vertical model is that one of your vertical managers may be a freakin’ loon who writes hot mess content like that fellow at Halos Heaven did. When that happens, it doesn’t just reflect poorly on the vertical. It reflects poorly on the entire company. In this case, SB Nation. The same scalability that works to the media company’s benefit comes back to bite it when things go sideways.
Of course, there’s a good way to protect against this. Not by having some editor look everything over first, for that defeats the purpose and kills the robustness of the vertical model. The protection comes from hiring adults to run your verticals.
You’d think this is obvious. I mean, who in any business would delegate responsibility to someone irresponsible or with no accountability? No one, really, except for companies which publish stuff on the Internet. There you still see this a lot. Companies getting contractors or even unpaid folks to provide content. Attracting writers who, quite understandably, are looking for any break they can get. Often media companies sell this to them as “providing exposure,” but let’s be clear about it: it’s just a way for a company to get someone’s labor on the cheap.
Sometimes it’s young people who, however talented and promising they may be, might not have the market cornered on good judgment just yet. But it’s not just an age thing. Indeed, I’ve seen better work from younger people overall than I have from older people, at least in sports. And the Halos Heaven guy, I understand, is at least my age and may be older. Still, whatever a writer’s age, there is a certain risk in delegating to someone inexperienced.
The far bigger reason you get questionable content, I think, is because a lot of it in this day and age is being produced by people who are not being paid a living wage or for whom the internet content biz is not their day job. These people, young and old, may be talented, but they can’t really be expected to have the same level of accountability as an experienced, dedicated and full-time person. If you have a term paper due or an end-of-year accounting to be done at the company which supplies your health insurance, it’s not hard to understand why that bit of sports analysis came up half-assed. People prioritize in life. And, as the saying goes, you get what you pay for.
I realize this is self-serving and/or ass-kissing given that they employ me, but there’s a reason why NBC’s sports verticals work. And why they rarely if ever have stepped on it with questionable content: NBC hired grownups. Guys like Mike Florio, Kurt Helin and me. Adults who take our jobs seriously because they are, you know, our jobs, not our hobbies. Because we have been tasked with some responsibility and strive to demonstrate it in equal measure. Not because we’re better or more professional people or anything like that, but because that’s just how the basic social contract works when it comes to employment in our society.
The other “Talk” guys and I may write stuff that people disagree with. Heck, we do it often. But based on my experience in the real world back in the day, I know that an employer can deal just fine with an employee simply being wrong about something. I lost cases as a lawyer and I’ve blown bits of analysis as a blogger. It happens and will happen again. But what an employer does not like is having to answer constant questions about what the hell you just did from people who normally wouldn’t be paying attention. And what an employee can’t come back from is being an agenda item at a meeting to which he or she is not invited. These are the measures by which a model based on delegation is judged internally. And these are the things that happen when you delegate to people who aren’t as invested in your company’s mission and future as you are.
So: scale away, media companies. Achieve efficiencies and synergies with robust models until your heart and bank accounts are content. But understand that when you do so, you’re handing someone the keys to a truck with your name on the side. Make sure you give those keys to someone you trust and make sure you incentivize them to be just as careful with your truck as you would be.
Dustin Parkes penned a thought-provoking essay today. It’s about the fate of writers in a world that seems to value longer, more in-depth writing and reporting less and less as time goes on and values shorter, bloggier, clickable content more and more.
Parkes has some recent experience with this. He used to write for the sports site The Score, where he specialized in longer form writing. Deeper dives. A year or two ago, however, that site let Parkes and a lot of other good writers go, deleted their archives and has attempted to pursue a flashier, gossip-driven and viral content existence. Parkes uses the term “snackable content,” which I believe was actually coined by people who like the shorter stuff, even if it sounds like something of an insult.
A lot of people who have done the sort of work Parkes did at The Score but who are finding it harder to make a go of it these days aren’t terribly happy with the demand for shorter, fluffier content. Indeed, many in journalism who have found themselves on that side of this content divide have taken to disparaging modern tastes and modern media and have chalked it up to the dumbing down of the culture. Parkes gives some excellent examples of this based on some recent controversial changes to The New Republic.
But rather than join in that chorus, as many a smart, deep-thinking writer has done of late, Parkes calls for an end to that. Or at least points out how useless it is for a writer to take that stance. And it’s not just a surrendering, hands-up, “well, the mob has spoken” kind of thing. Parkes acknowledges that journalistic form will, inevitably, follow the function its readers want it to serve:
“It’s absurd to imagine changes in the production and accessibility of writing not affecting how we read it … Being willing to experiment and innovate will propel us much farther than wallowing in the fact that current trends don’t match our sensibilities. As our reaction to the changes at The New Republic illustrates, it’s easier to bemoan what was great about the past than adapt to the future. We’d rather shame the people looking to make writing economically viable than consider how content is being consumed. And that’s to our detriment.”
Adapt or perish, Mr. or Ms. Writer Person, because this is a business.
It’s a sentiment with which I completely agree. As I found in my previous career, if you think you’re part of some greater noble calling which should be immune to commercial considerations, you’re gonna find yourself on the unemployment line eventually.
But knowing that you need to get with the times and actually doing it are two different things. Parkes spends a lot of time wrestling with it, but even he concedes at the end that it’s easier said than done. The path to being a decently-compensated writer in this new world is still shaking out, really, and that was the case even before Facebook started wading into things, which is likely to cause even greater disruption in existing models.
Though I got, and have somehow managed to keep, a job in the world of snackable content, I can’t say that I have any monopoly on wisdom here. Especially wisdom that allows writers to continue to keep working and keeps them from having to reduce themselves to the lowest common denominator to do it. But I can say what has worked for me over the past six years.
While my writing in this space often skews long and while I, personally, am quite comfortable with more in-depth analysis, the media consumption landscape doesn’t really tolerate that anymore. Unless you’re Gary Smith or unless you have a particularly compelling story, people won’t read 3,000 words from you on anything approaching a regular basis. And if people aren’t reading you on a regular basis, no one will want to give you a regular outlet for your work. Writing three cool things a year just doesn’t pay anyone’s bills.
But people will read 200 or 400 words over and over and over again. If you have a distinct point of view and a decent set of principles you can write 200 or 400 word pops every day – or multiple times a day – and manage to attract readers who keep coming back for those little snacks. If you keep your mind on what is important and maintain that distinct point of view and that decent set of principles, you can say a few things in the process that matter. At least in the aggregate.
Sure, there are some tradeoffs involved here. You have to pay the bills, so you may have to play videos. Or write a list once in a while. You have to make jokes or embed Vines and assorted crap like that. And, apart from the rare indulgence, you won’t be able to hold up a brilliant 3,000 word essay and say “I did this; this is important!” But you will have a body of work which, while no single piece of it may be earth-shatteringly important, amounts to something that you can call your own and which your readers can say gave them something valuable.
In my experience, I probably write something of any serious length a couple of times a year. A couple of times a week I may write something that exceeds 1,000 words. Mostly I’m writing 15-20 short hits, some of which are just links to other articles, some of which are jokes or pictures or videos. A few of which are short bursts of sharp opinion. All of that taken together provides something my employers can monetize and which my readers willingly and easily consume.
But I think it’s also fair to say that all of these short bits amount to something of substance. Yes, my readers come for laughs and videos and little snacks, but they also know that I stand for some things and that I can be trusted to offer some wisdom or insight on the things that are more or less in my wheelhouse.
Maybe it’s not as writerly or noble as the stuff a house columnist at a prestigious periodical produced in the 1950s through the 1990s. Maybe some of it is base and commercial and calculated to get people to click, click, click. But it’s a way to get ideas out there while simultaneously giving readers and publishers what they want. And that’s about as good as someone can hope for in these strange new times.
While I work for NBC, my office is really Twitter. Indeed, I spend most of my working day on Twitter, both monitoring the baseball news and tweeting like crazy. Insights, jokes, things even more useless than that. There are some days where I’d wager that I tweet more words than most writers write in actual articles or columns. I kinda have a problem.
Against that backdrop, my friend Ethan forwarded me this post from Dustin Curtis called “What I would have written,” which describes how Twitter kills his writing:
Twitter takes complex ideas and destroys them by forcing my brain to compact them into little 140-character aphorisms, truisms, or jokes. For every great tweet, there could have been four insightful paragraphs, but there aren’t, and never will be, because Twitter removes my desire to write by killing my ideas. Once I tweet something, I stop thinking about it; it’s like an emotional release of idea liability.
I couldn’t disagree more. For all of my tweeting, I don’t think that Twitter harms my writing at all. Quite the opposite, actually.
I use Twitter to workshop ideas. I’ll tweet some random observation or jokey thing and, occasionally, I’ll follow it up with a bit of elaboration. Maybe a four- or five-tweet stream will develop. In turn, people’s replies to the tweets – pro, con or indifferent – help me refine the idea. Or persuade me to chuck the idea totally if it’s just stupid or if I missed something.
At the end of all of that, if the idea is more than just a joke or a random observation, I’ll think “hmm, this is a post,” and I’ll then expand the tweets into a more fully-formed idea. The result, quite often, is a blog post I wouldn’t have otherwise had, all because I started tweeting crap that popped into my head. Here’s an example of something I wrote as a result of some tweets. Here’s another. Here’s another. There are probably three or four a week like that.
Twitter is a great tool. A writer spending all day on Twitter is like a guitar player sitting around with friends riffing. Most of the time nothing comes of it other than camaraderie and bullshit. Occasionally, however, a riff will be particularly good and he’ll make a song out of it.
The idea that Twitter kills one’s ability or desire to write is nonsense. If a tweet kills one of your ideas, it probably was a crappy idea anyway.