Some days I hear or read music news that makes me feel like Edmund, Lord Blackadder

Specifically in the context of the opening minute of this YouTube clip:

And more specifically his slow burn of frustration up to the point about 40 seconds into the clip where he says, “LOOK, cretins…”

My latest reaction along these lines was courtesy of a NY Times piece with the stunningly brilliant and utterly new observation “Hey gee, there sure are a lot of people who like vinyl out there.” (It’s become such a cliche that when I mentioned said phenomenon in passing in my Thursday post on single, album and Net sales, I did so under the full assumption that anyone interested in the general subject of music had already heard it plenty of times, and was likely very sick of it.) The article in question was linked over on ILM by Chuck E. with the preface, “How many times has this article been written in the past few months? (Gets dumber every time, I think.)” A follow-up comment ran, “Yeah, these articles keep treading over the same territory, but I never get tired of reading them … especially if they are so chock full of LOLs at the expense of corny indie types and clueless record execs.”

There’s plenty to pick apart in the piece that prompts such talk — a classic eyeroller being the amazingly ham-handed attempt to equate vinyl worship with going for local/organically grown food. (As a flagbearer for the latter several times over, I couldn’t even begin to imagine coming up with that comparison without the copious aid of drugs or money, or both.) The big winner, though, comes from someone who I recognize instinctively as a fellow denizen of my past life — the college-age music freak with a love for things indie or somesuch. Thus one James Acklin talking about bonding with a bright young lass over Broken Social Scene (of all bands! I guess the Arcade Fire would have been too obvious):

“There was this immediate mutual acknowledgment, like we both totally understood what we define ourselves by,” continued Mr. Acklin, who considers his turntable, a Technics model from the 1980s that belonged to an aunt, a prized possession. “It takes a special kind of person to appreciate pops and clicks and imperfections in their music.”

Needless to say it’s the latter quote, contemptuous and condescending, which brought the LOLs on the thread so far. Deservedly so, but let’s be fair to Acklin — years ago, John D. on ILM once described the genius of the first Christian Death album (overwrought, ridiculous, absolutely beautiful in its self-conscious extremity) as a classic example of the ‘dumb-blowtorch-of-youth’ syndrome. That sounds contemptuous and condescending in turn, but he meant it, rightly, from the point of view of self-recognition, that he’d been there as well and appreciated it for what it was, while not wanting to go back there. Acklin’s flash-of-pseudo-insight arrogance isn’t that far removed from plenty of sins we’ve all committed — and lord knows if I listed mine I’d be here forever (sure am glad I didn’t say something like THAT in the first two paragraphs of an NY Times thinkpiece when I was twenty, though — if he has a sense of humor about himself he’ll live with it).

That caveat said — such a stupid statement, and the cascade of similar sentiments, or at least parallel ones, building up throughout the article provoked my invocation of Blackadder there. You just can’t believe the willful ignorance at play sometimes, or the self-delusion, or more. Allowing for your own sins as always, as already noted, sometimes it’s just breathtaking.

For instance, let’s take this sentiment, which since it isn’t a direct quote leads me to wonder how much of this is a leap of interpretive faith by author Alex Williams:

Young vinyl collectors said digital technology had made it easy for anyone — even parents — to acquire vast, esoteric music collections. In that context, nothing seems hipper than old-fashioned inconvenience.

Hahah…no. This much more than whatever Acklin says is true arrogance, an amazing if utterly expected twist on the standard idea of obscurity equalling quality. It’s not enough now that you have something unique or obscure on vinyl as compared to all that nasty mass-market stuff out there — now that music’s around for everyone, you have to show you’ve sweated and somehow earned the right to enjoy music precisely because you had to scrounge for a physical manifestation of same. Confronted with a literalization of Marx’s dictum about all that was solid melting into air, solidity is now suddenly the most crucial thing about a release, its ‘realness’ as much as its ‘inconvenience.’

Thus, to follow that part I just quoted, in a truly dumb-blowtorch-of-youth move (but which also leads me to conclude that Williams took this one guy’s claim as being more universal than it really is):

“The process of taking the record off the shelf, pulling it out of the sleeve, putting the needle on the record, makes for a much more intense and personal connection with the music because it’s more effort,” said R. J. Crowder-Schaefer, 21, a senior at New York University who said he became a serious vinyl disciple a few years ago.

A whole ten to fifteen seconds worth of ‘effort’ validates an experience! LOOK, cretins…

And I realize that mocking laughter ain’t going to fly much as an explanation, so on the off chance this fellow’s self-googling leads him here (odder’s happened…) — look, dude, what matters to you? The song, or how you hear the song? The feelings or emotions or awe that a piece of music creates in you, or whether or not you found it at a thrift sale as opposed to purchasing it through iTunes? The artistic creation or its physical incarnation?

This actually opens up a huge can of theoretical worms, one admittedly long since opened in general. (Invoking Walter Benjamin’s “The Work of Art in the Age of Mechanical Reproduction” is almost as bad a cliche as these ‘vinyl’s back!’ pieces, and yet the man’s words remain relevant even as we move from mechanical reproduction towards digital.) And there’s much to discuss, and much that has been discussed, about how we receive and understand ‘art’ as conceived and formalized — what do we expect our books to look like? our movies and their presentation? and so forth. To go on would be more than I would want to do on a lazy Sunday afternoon, thanks.

Rightly or wrongly I keep feeling like I want to pull the ‘you kids, why back in my day…’ move, which I suppose is inevitable. And as I keep telling people, the fact remains that different generations will have different expectations and desires about ‘their’ art, how it is presented and conceived — mine isn’t the only way to look at things and should never be. Love vinyl all you want, folks — as someone who collects things like Folio Society books, I’m hardly averse to the idea of fancy presentations, quality materials, the whiff of exclusivity. We each deal with such things in our own way.

But it would be *utterly* ignorant of me — utterly, totally, completely — to make a claim along the lines of how, say, the fact that I have a sleek, high-quality-paper edition of something like Gulliver’s Travels or the complete works of Keats means I am somehow making a ‘much more intense and personal connection’ with those pieces and their authors than someone who reads them in a cheap paperback version or, in fact, googles and finds them online. (Which, of course, you can.) My hope — and really, my end belief — is that someone like Crowder-Schaefer will recognize that on his own quickly enough. I am a natural optimist that way.

Meantime, to conclude — you’ll note I haven’t gotten into the ‘but vinyl just sounds better’ argument. Frankly that’s a road even MORE long-trodden since the 1980s. So instead I’ll leave the final words to Scott Seward, again from that ILM thread, who’s around my age and has been a vinyl lover pretty much all his life, in response to Acklin’s ‘pops and clicks and imperfections’ appreciation:

ugh, i get really tired hearing about the allure of pops and clicks. blah. buy clean vinyl, you dolts.

Amen. And in the meantime, if you could put this bag over your head to pretend you’re the person I just had executed…

Tamatem ma’amrine — a Moroccan stuffed tomato recipe


Taken from Claudia Roden’s book Arabesque, a collection of Moroccan, Turkish and Lebanese recipes. I stumbled across this in there along with a killer photo of same and figured these had to be made!

The recipe as provided serves six, so adjust accordingly; I’ve also simplified the instructions a touch based on how I made them:

* 4 red bell peppers
* salt
* 3 tbsp extra virgin olive oil
* one can of tuna, flaked
* 2 tbsp capers
* 4 tbsp chopped black olives
* peel of 1/2 preserved lemon, chopped (optional)
* 2 tbsp chopped flat-leaf parsley
* 6 large tomatoes (beefsteak recommended — as you can see, I went with heirlooms of various sorts)

Roast the peppers at near max oven temperature for 30 minutes, turning over after 15 minutes. They should be soft with blistered/blackened skins.

Place the peppers in a covered pan for 15 minutes; when cool enough to handle, peel and remove stem and seeds, then chop into 3/4 inch or so strips or chunks.

Mix with all other ingredients except the tomatoes.

Cut a small circle around the stalk of each tomato and cut out a cap. (If using heirlooms, note that if you’re not careful you might cut all the way through to the bottom!) Gently scoop out the center and seeds, then stuff each with the mixture and place the cap on top of each.

Arrange in a shallow baking dish, bake in a preheated 350 degree oven for 20 to 30 minutes or until the tomatoes are a little soft. But don’t let them fall apart!

Serve hot or cold, but cold recommended.

Meanwhile, as a side dish a potato or carrot salad is recommended, but it being a hot day (which is why these tomatoes are chilling right now) I’m going with this melon/cucumber recipe provided with the basket the other day:

Melon-Cucumber Salad for 2

15 min | 15 min prep

SERVES 2

* 2 tablespoons salad oil
* 1 tablespoon lemon juice
* 1/2 teaspoon sugar
* 1/4 teaspoon salt
* 1 dash fresh ground pepper
* 1 small cucumber, thinly sliced
* 1 cup melon (1/4 inch pieces, honeydew, cantaloupe, watermelon)
* crisp crunchy salad greens

1. Mix oil, lemon juice, sugar, salt and pepper.
2. Toss cucumber slices, melon and oil mixture in bowl.
3. Chill.
4. Remove with slotted spoon to salad greens.

© 2008 Recipezaar. All Rights Reserved. http://www.recipezaar.com

So there you go!

Suspended leaf animation, kinda

So I’m walking along outside the library and I see this leaf apparently defying gravity. Must have been some spiderweb strand holding it in place by chance. I took a few photos and this was the clearest. Just wish the day had been a bit brighter!

A jury-rigged beet/cucumber pasta salad

Very jury-rigged — the pasta (which as you can clearly see is the same kind as used to make Spaghetti-Os) I’d already made and kept to the side, wondering what to do with it. Some random googling turned up this recipe, which was promising.

The trick lay in figuring out how to modify it — I’d no salmon around, and while I could have gone the mayo route as suggested, I didn’t have any, and sour cream is just out. So it was a fairly basic version in the end, relying on dill, lemon juice, mustard, salt and pepper to carry it, along with a bit of olive oil. And it actually worked very nicely, filling but flavorful, while the beets, raw instead of pickled, added good crunch along with the coolness of the cucumbers.

So while we’re all talking acceptance speeches and Palin and all that…

…a little reminder that popped up earlier this week about the wider picture, courtesy of NPR editorial director Dick Meyer via an opinion piece over at the LA Times:

As the nation’s attention reluctantly turns to the political parties’ conventions, with their scripted suspense and stage-managed sentiment, it is important to keep in mind that these are phony representations of American political life. But the slick video profiles, the teary appearance of a beloved party elder — these are not what is most phony about the conventions.

This gathering of America’s civic tribes — and the reporters who love them — in separate cities for days of synchronized cheering and jeering is the embodiment of a great American myth: that the nation is divided into “two Americas,” polarized between “red” and “blue” camps that have fundamentally different values and moral outlooks. Each of the nominees will tell our allegedly divided country that he, and he alone, can manage to unite America for the next four years.

The idea that there is a vast war over the moral and spiritual compass of the nation is a dramatic narrative, and it has dominated popular political analysis for nearly two decades. It makes for potent, inflammatory political commercials. It just doesn’t have the added virtue of being true.

This argument is not a new one, and perhaps the best aspect of Meyer’s piece is its presentation of names and sources to rely on for further study, ranging from James Davison Hunter, who popularized the ‘culture war’ term in the early nineties, through Morris Fiorina, whose book Culture War? The Myth of a Polarized America addresses and challenges that very concept (and has gone through updated editions — I should know, having ordered them for classes for Reserves!). Meyer sums it and the work of Arthur Brooks up with a simple deftness:

In fact, it’s because we agree on so much that our elections are so close. Fiorina’s “sorting” theory of voter behavior explains it with a certain simple elegance: Voters dislike both parties equally….Extremists, however rare, are becoming more common and, importantly, more rabid….Extreme liberals and extreme conservatives are now essentially dead to one another, as Tony Soprano might have put it….Increasingly, they are also the people who host television and radio talk shows, who publish blogs and who make civic noise.

This all fits in line with my belief about the problem with much political discussion and debate in the era of the Net in particular, paralleled by the rise or reincarnation of a variety of media outlets seen to be ‘for’ a certain kind of voter (in some cases openly, in others in crypto-fashion): this kind of ultrademonization, usually predicated on the belief that the ‘other side’ all thinks alike, that they’re zombies who will follow the absolute worst impulses which can be imagined, and that they don’t really represent ‘the people’ in any event. This of course prompts those on the ‘other side’ to illustrate their own differences and debates and details while in turn saying the first side are in fact the ones thinking alike etc. etc. Lather, rinse, repeat.

This isn’t meant to be a scientific pronouncement, merely an observation based on some years now of self-conscious image making and branding on the part of those larger groups and affiliations dedicated to political points of view. It still seems to hold true, though, and will continue to do so for some time to come. Meanwhile, the largest voting group in America remains the nonvoters — consider these numbers from the 2004 elections, where 4 out of 10 eligible voters did not vote at all in what was supposed to be a passionate national referendum on Bush’s presidency — and while the big premise of this election is that things are noticeably changed with the excitement Obama has generated in particular, let’s wait and see how it all plays out in the end. Emotionally I’m down with that claim; hardheadedly, I wonder a bit.

And I wonder because of something I’ve talked about in the past — what I see as the essential conservatism — not in a political sense, more in a vested-interest/not-upsetting-the-apple-cart sense — of the American populace. Meyer, perhaps less loadedly and more appropriately, uses the description “pragmatic, moderate and independent,” which also applies to a large extent, but not wholly so. This description would appear to limit itself to the voting populace, rather than the non-voting bloc I’ve indicated, whose reasons for nonparticipation vary but at base rely on the fact that America is an incredibly stable society and has been for a long, long time. Stable does not mean fair, equal to all, or perfect, I should underline. It’s something else altogether. But if a large enough group of the populace holds, and continues to maintain, a belief that can be summed up as ‘things are still ticking along, so I’m going to concentrate on my own affairs because that’s all I can do,’ then that serves as a check on all the dreams and goals and stances on issues, from whatever side.

That may sound depressing or defeatist. It isn’t meant to be, rather it’s meant to be a recognition of things, about how despite the big issues and life-changing events, many things don’t change, or won’t change immediately. Almost seven years after 9/11 and its supposed changing of everything, it’s fascinating to see what hasn’t changed now, or how quickly certain beliefs reasserted themselves. I remember at the time — maybe even on the day itself, as some way of coldly distracting myself from the horror — telling others how the event would be used by all sides to regrind already finely-honed axes, and we’ve seen that happen pretty readily. It’s just one example of many.

But to end on a note that hopefully puts all these thoughts into some perspective — and to talk more about the subject line of this post! — something has struck me over these last couple of days, in the stories of people like Bertha Means from Texas, who remembered and fought against segregation and bigotry as openly practiced in her community, seeing in Obama’s speech the culmination of something amazing, a once-unexpected and now seemingly logical triumph. I could never claim that sense of seeing such change as it directly affected me, seeing a certain promise finally come true and possibly even truer in the future. Mine is the reflective, dispassionate, understated conviction of relative privilege in this society, tempered further by my age — her conviction is the ground-floor-up version, with barriers negotiated and fought against every step of the way, moving to greater and greater heights, to claim the American promise in full at last after decades of seeing that promise continue to develop due to her efforts. And she is far from alone.

Meanwhile, say what you want about Palin — personally I’m all fine with her trashing the oil-compromised feebs that ran her state beforehand, though given nearly everything else about her I can’t say I’d ever vote for her or the broken, self-loathing party she now must represent nationally, not that I was in any rush to do so to start with — the fact remains that in this country one hundred years ago, outside of a few states and territories she wouldn’t’ve been able to vote at all, while fifty years ago the idea that she might have been a vice-presidential candidate was simply a pipe dream. Things have changed and the American experiment continues to unfold, and if they’re twenty-four years behind the times in nominating a woman near the top of the ticket — and allowing for the fact that Palin got her chance via choice and persuasion rather than slugging it out in the primaries, a chance and route Hillary Clinton could have feasibly achieved — then even so, someone, somewhere in the GOP recognized it was the 21st century. If they’re punting, they’re doing so figuring that their base will still cheer them on.

Conveniently enough, over on ILE John D. posted something a little bit ago that sums things up very handily on this front:

…it’s meaningful whether she’s a terrible candidate or not. Am I going to vote for the ticket because there’s a woman on it? No, of course not, she’s insane and McCain is awful. Does it have some historical weight for the right-leaning party – the one with a huge evangelical base, many sub-pockets of which genuinely believe that this country would be better off if women didn’t work at all – to put a woman in the #2 position? Of course it does. They haven’t done that before. It’s indicative of general progress toward eventually not being a total embarrassment among democracies in having only been led by men.

And so a convention to see through and then two months to go. It’ll be a hell of a ride, at the least.

WordPress.com Political Blogger Alliance

Forget Facebook the movie — how about Facebook the musical?

So last night I discovered this and boggled:

Welcome. I’m Aaron Sorkin. I understand there are a few other people using Facebook pages under my name — which I find more flattering than creepy — but this is me. I don’t know how I can prove that but feel free to test me.

I’ve just agreed to write a movie for Sony and producer Scott Rudin about how Facebook was invented. I figured a good first step in my preparation would be finding out what Facebook is, so I’ve started this page. (Actually it was started by my researcher, Ian Reichbach, because my grandmother has more Internet savvy than I do and she’s been dead for 33 years.)

There was much laughter over on ILE when I started a thread about it. And scorn and more besides. But then poster Moley suggested:

It should be a musical. What would it be called? Oh yeah, Facebook!, of course.

I responded:

No way, they film this as a straightforward comedy/drama, and then the musical adaptation off-Broadway is called Poke!

And we were off and running. I’ve contributed various bits of doggerel though I’m fondest of these two:

“I’m nervous, I’m anxious!
I can only hesitate!
Tell my friends, all at once
About yesterday’s date
Will they read, do they care
For my status updaaaaaate!”

“If I knew then
What I know now
Those deep feelings that say it all
Don’t ask me when
Don’t insist how
Just believe I’ll write on your wall…”

And others have chimed in, but I think Thomas’s two contributions are the latest winners:

I was dreaming of a day when I could be free
And no new notifications would be waiting for me
So I’ve poked my final poke and now I’ll live beneath the trees
Aaaaaagh wait just gotta have my turn on scrabulous (R.I.P)

You must change your status
in order to update us
on the little things that we don’t need to know
hey! you have new photos
of your brand new coat! oh!
and your brother’s band is cancelling their show

Further contributions welcome.

Over at Idolator, some good talk about the majors and the single format

Some VERY good talk. Chris Molanphy is the site’s regular chart-watcher and his analysis has always proven very insightful; this entry, discussing the recent moves that have kept Kid Rock and now Estelle singles off iTunes, is however a cut above. Read the whole thing, of course, but to extract a key point:

As I explained in a recent “100 and Single,” Capitol forced fans of Hammer’s “U Can’t Touch This” to buy his Please Hammer Don’t Hurt ‘Em album by only releasing the song in the spring of 1990 as a 12-inch vinyl single. The result: a 21-week-chart-topping, 10-times-platinum album. Their tactic that fall with Vanilla Ice was different, but similarly diabolical: the Queen/David Bowie-fueled “Ice Ice Baby” was released in the popular cassingle format and charged up the Hot 100, but once SBK knew the song was poised to top the chart, the “Baby” single was deleted. The result: the song spent only one week at No. 1, but on the album chart Ice’s To the Extreme shot to No. 1 and stayed there 16 weeks, shipping 4 million copies out of the gate and eventually going septuple-platinum.

In short, EMI produced two back-to-back smash albums, first by withholding the big hit from the most popular singles medium altogether, and the second time by pulling the big hit from the market just as it peaked.

I have long posited 1990 as the year that launched the Great War Against The Single, a decade-long campaign that saw endless casualties (not least, the consumer’s wallet) and didn’t end until the Rebel Alliance that was Napster and the Versailles Treaty that was iTunes.

Now, in 2008, it appears that Atlantic is attempting to start the War all over again, and they’re doing it by replicating EMI’s first two strategic moves from 1990, verbatim. (Insert joke about history repeating as farce here.) If Kid Rock’s “All Summer Long,” never released on iTunes, is Atlantic’s new “U Can’t Touch This,” then “American Boy” is their new “Ice Ice Baby.”

Besides spelling this out in detail and noting exactly how the single fared as a discrete entity/priced object in the nineties, Chris also concludes by noting that the cat was out of the bag anyway, since file sharing is going on regardless, while the opportunistic presence of a near-Xerox cover version of Kid Rock’s hit on iTunes is a classic example of gaming the larger system.

Still, there’s something larger to dwell on here which isn’t quite spelled out — earlier in the piece, Chris quotes an official Warner Music Group release via the Wall Street Journal where the idea of withholding singles to spark up album sales ties into a strategy “uniquely tailored to each artist and their fan base in an effort to optimize revenues and promote long-term artist development.” Garbage language, of course [EDIT: and on Idolator, commenter the rich girls are… unpacks some of the slimy assumptions at play] yet consider what the kernel is at the heart of it — what many of us see as shameless fleecing on the part of the labels (and on the part of Kid Rock — and lord knows that song of his is one of the biggest fleeces out there) is actually being matched by a consumer base willing to pay full price for the album as a product in toto.

It’s been one of the hard and fast axioms of this decade, especially (and logically) on the Internet, that this kind of listening and economic model is outdated or somehow just ‘wrong’ in a conceptual sense. This has been articulated in various ways, whether we’re talking about the idea of iTunes as mass success via song by song purchases or creating one’s own playlists or the rampage of mp3 blogs or more besides. A full study isn’t something I’m interested in right now but there’s something about how this Kid Rock model has worked in particular — an album by an artist pushing a romanticized portrait of the past via an older artistic/economic model, something that is ‘retro’ in its very nature — which amuses. In a strange, disconnected way, this is almost the highest profile incarnation of the supposed rush back to vinyl releases going on, the implication that music must have value in the strict sense of physical possession of the object, paid for as released and priced in the current market.

What could be happening here, far from being an attempt to simply turn back the clock to the days of integrated music-release/sales control, is what’s been talked about elsewhere on Idolator and around, namely that the major labels must learn to adapt or die, and that in the shape of Kid Rock — and, potentially, Estelle — they have examples where they can get the sales they want in the shape they want in a market situation where they won’t have the full range of income they had any more. Diminished expectations but functional activity nonetheless — but the point being: they’re getting their sales. People are buying that Kid Rock record. Are we supposed to tell a consumer they can’t or shouldn’t? More than ever it is a conscious choice on the part of the consumer about how to get a song, and if the most notable option in recent years towards getting that song has been blocked and people are instead purchasing the album, we may cluck our tongues, but in a marketplace that isn’t ignorant about file-sharing and the Internet — the whole Napster to-do is almost a decade old! — one can’t chalk all this up to consumers suddenly forgetting that the Net as an option to hear a song is available.

I could go on but this is enough of a ramble already. And I’m not trying to spell this all out as some major positive all around — this is more of an observation, perhaps a very ham-handed one. Personally I’ll be interested to see in a time of economic slowdowns exactly where things go from here on a number of fronts (and stepping over once more to a side subject, if oil prices go through the roof again, as seems likely, I’ll be interested to see what the vinyl fetishists do when they start having to pay through the nose). But possibly what we are seeing is less last dying lash and more next step towards viable multiplicities. Possibly.

For better, for worse and…for worser

Okay, I’m sorry, but no:

Cartoonist Lynn Johnston can’t bring herself to abandon her fictional family. For years, the “For Better or for Worse” creator mulled retirement, then lightened her workload by creating flashbacks and repurposing the archives of her popular comic. Finally, she knew she needed to conclude the Patterson family’s 29-year saga.

This Sunday’s cartoon is an adieu of sorts to readers, but not a final farewell. She announced this month that she would retell her strip’s narrative, beginning Monday, by taking her continually aging characters back to 1979, but creating new artwork and some dialogue. Her syndicate says it’s the first time a mainstream cartoonist has set out to tell the same story twice.

….on Monday, the strip will time-travel back to 1979 and do it all over again, but with new drawings, new conversations, new wrinkles. (And in some cases, fewer wrinkles — John and Elly Patterson will return to parenting tykes.)

“It’s going back to the beginning when Michael and Elizabeth were very young,” Johnston says of the approach, which she is dubbing “new-runs.” “I’m going back to do it how it should have been done. . . . I’m beginning with all this knowledge, so it’s a much more comprehensive beginning. I only have an insular world of characters [from 1979] to work with.”

As far as Johnston knows, “new-runs” — in which a strip’s continual story line is retold — have never been attempted by a syndicated cartoonist (“Nobody has done it before — most people die or the strip ends,” she says).

“All of September will be brand-new material,” Johnston explains. “In October, it will be [a ratio of] 50-50. The color Sunday comics will be all-new material. . . . I think it will be 50-50 for the first year, at least.”

I really don’t have the words.

That said, I do have some, I guess. Thing is, there is something perversely tempting about all this from a creative point of view, the whole ‘well, now that I know what I know now, I’m going to get it right this time.’ Trust me, I feel that way about most of my writing work in general (though sometimes I can be surprised — yesterday I was somewhat flabbergasted to receive a huge compliment about some decade-old AMG reviews from one of the two lead musicians of the band in question — we’re not talking U2 famous but this is a very well-known and respected group indeed — and I admit to having been dancing on air a bit since).

And I want to take Johnston’s words on it all at face value. Still, there’s something just so…well, again, where are the words, I don’t quite have them. But I’ll try.

It’s the classic ‘if I only knew then what I know now’ deal, and I don’t like it. In my life there are mistakes and regrets, and I won’t pretend I’ve never been prey to dwelling on them, but I try not to do so exclusively. This just seems like a strange way to literally rewrite history, and to do so in such a programmatic fashion. One of the distinct advantages of a story like Johnston’s strip, one that is open-ended and lets the characters grow and age, is how this by default forces them into being in and of their times. They may react in ways that are classically mainstream as such, but that is life, and the whole point of the world of the Pattersons in the strip is that they were almost overarchingly mainstream in a comfortable sense.

By default, Johnston’s decision moves her work from being an ‘of the time’ story to one where every move is a signifier of a different time — the contemporary becomes the retrospective, and if one has been working with characters for thirty years reacting to the changes around them, going back and trying to place oneself in a mindset without that knowledge is seemingly impossible. (If anything the fact that Johnston says she is coming from a place with greater awareness is all the more troubling — every word and frame now is done with the knowledge of not only what happens to the characters but their setting and society, their very place. Is this creativity or day-by-day nostalgia, and what is the purpose of such an obsessive retracing? Given the nature of the strip for its entire existence, why not live and work in the now?)

This all said, what about the strip itself? I suppose if I’d followed it more recently I’d care more — but I did follow it for a long time, actually. Stepping back a bit, I first remember reading comic strips in the mid-seventies or so, with Peanuts unsurprisingly being my way in (though I’m sure I was reacting to the stellar run of early TV specials first). A slew of strips then caught my eye in the late seventies and early eighties when I was living in the Bay Area, most notably The Far Side, which rapidly became a deserved family favorite, while For Better or For Worse caught my attention too.

I’m not sure why, per se, or what attracted me. It’s almost hard to put into words now — but I suppose it was because it was easy enough to understand, even for an eight-year-old kid, and because the family situation was, after all, pretty similar to mine — married, well-off white couple with an older brother and younger sister, plus a dog. In fact one of the things that most ended up defining the strip in the end, its perceived Canadianness, didn’t strike me at all until many years later, late eighties I think, when a character idly mentioned something somewhere that made me realize “Oh…wait, this isn’t set in the US?” (The fact that one of the characters was named Gordon should have given it away earlier, I admit.)

And so while I don’t remember ever laughing out loud over it, I liked it well enough and as a matter of course followed it, even as over time I was following — far more completely and obsessively — strips like The Far Side as mentioned, plus Bloom County, Calvin and Hobbes, FoxTrot (Jason Fox IS ten-year-old me, believe me), Dilbert and The Boondocks, among many others. But those were the ones I actually bought book collections for — For Better or For Worse, never. Yet I knew the basic storyline and idly followed things, and I still remember the huge kerfuffle over Michael’s friend coming out as gay (needless to say I had no problem with that) as well as Farley the dog’s death (okay, I had a HUGE problem with that — I’m just a sentimentalist when it comes to dogs, as I’ve talked about here before) and…I think that’s about all I do remember.

As with so many things one picks up out of habit, letting it go can be a protracted process, and one you only figure out in retrospect more than anything else. So it was with this strip (and with a lot of comic strips generally — I don’t regularly read them much any more, and like TV it’s a shift in habit that never felt monumental, more simply a sense that I was done with them as a matter of regular interest); aside from a random glance or two, I never quite knew what was going on beyond that point, and happily so.

So when I looked at this article, related to the one above, detailing the obsessive habits of those who follow the strip just to mock it — I can relate to this impulse on other fronts as well, believe me — the part that made me go ‘?!?!’ the most was this:

Their son Michael hit it big with a best-selling novel (About what? We never learned) and he and his wife, Deanna, bought the old Patterson family home, somewhere in the suburbs of Toronto. Little sister April Patterson’s band, the Archies-esque 4-Evah, broke up, then got a new singer, making them 4Evah & Eva. Elizabeth (a.k.a. Lizardbreath) gave up her new life teaching native people in the Canadian hinterlands to move home and marry Anthony, her boring high school boyfriend.

What, was this becoming The Royal Tenenbaums? (Don’t answer that.) The most confounding (and, from where I sit, at the least troubling and at the most insulting) detail was Elizabeth’s fate, if only for the implied grappling with the traumatic issues revolving around the First Nations and how the Canadian government and society treated them in the past, only to apparently ditch all that for conventionality and a comfortable existence back in the Ontario suburbs, or exurbs, or whatever. Of course.

So I don’t know. I think I’m content to let my final memory of the strip be Johnston’s tribute to one of her mentors, Charles Schulz, on the day when a slew of comic strip writers paid a similar joint tribute to him (it had already been planned for that year but his death beforehand made it a true memorial):

Snoopy forever

And I’ll try and keep this example in mind if I ever want to go back and ‘improve’ something I’ve done once it’s formally published. Take note of what’s been done — and then do something else instead.

[EDIT: A friend posted this interpretation of things elsewhere. Very, very silly.]

So about that new Verve album

Forth!

Having posted a couple of times in the past on the reunion and all, it behooves me to say something on Forth, which got its formal release this week (I think — release dates are all incredibly fluid in my mind, a combination of the conquest of the Net model and the fact that I get so many promos well in advance these days).

Admittedly yesterday was either just the right way to hear it or the worst way possible — flat on my back on my couch after having come home from work and feeling a miserable pinched nerve or something in my right hip. (Better now, thankfully.) That plus feeling very tired meant that I was drifting in and out of dozing during my first listen, but hey, I’m not formally reviewing this one, and sometimes you just want to absorb something as you do.

What I remember from the first few songs was the clear impression that this wasn’t a Verve album but a Richard Ashcroft solo album backed by Verve. Which arguably was the case with Urban Hymns, but here everything felt a little more formal, maybe too formal. If you look back on UH from eleven years’ distance (eleven years!), it’s actually a varied album within its particular scope — songs like “Chasing the Butterfly,” “Neon Wilderness” and “Velvet Morning” don’t have much immediate resemblance to each other or to the big hit singles either.

Here, in contrast, everything felt a lot more focused and point-by-point, and the songs tended to resemble each other, from what I foggily recall. I think it was friend Stripey who said to me that this would be Simon’s album more than anyone else’s, and I think she got it right, since there seemed to be a clear sense of his bass as anchor throughout, over which Richard and Nick did their respective things. Enjoyable, but at the same time a little flattening.

That said, the end of the album found the wings being stretched more by everyone, and while I did my fair share of mistaking lyrics (I kinda hoped “Valium Skies” was actually called “Valley and Sky,” which seems very early Verveish), it reminded me of nothing so much as the Chameleons’ reunion album a few years back, where while they weren’t THE CHAMELEONS in their classic overdrive sense most of the time, they had found a new balanced point to meet at that suited everyone well. (The same could be said of Crispy Ambulance’s or For Against’s reunions…heck, this has just been a good decade for bands coming back that keep a sense of time and expectations in place.)

I’ll need to hear it more of course, and doubtless I will. Over on ILM Marcello snarkily noted “I understand the notion of long-term loyalty to an artist even when the records are crap” (though he then went on to invoke the Game as a positive counterexample, and I couldn’t imagine a more tedious wannabe at this point in any genre), but given diminished expectations — this was never going to be A Storm in Heaven and Richard still thinks of himself as some sort of prophet/myth figure, somehow — I’ll still take it. But I’ll take a proper LA appearance that isn’t part of the Coachella machine more, thanks, so if you guys could come back…

So is there anything going on in Denver?

Or will there be anything going on in Minneapolis/St. Paul soon enough?

We’re swiftly approaching the two-months-to-go point in the election, and far from feeling any sense of urgency about everything, I find time seems to be stretching out further and further. The conventions aren’t helping, and they wouldn’t anyway. It’s now perfectly accepted knowledge to say that the conventions aren’t about anything actually being decided, but about parties partying, so kvetching about them as they are is kinda pointless — near Andy Rooney/Dave Barry territory, I figure.

But the whole theatrics of what must be going on right now amuses. I can’t imagine sitting through it all, though — my lord, what a pain that would be, especially after the Olympics. So instead I vaguely scan headlines and see photos like this that make me go ‘whuh?’:

One more time?

As friend Mackro immediately compared it to:

Robot ROCK

I was thinking a combination of the Oscars and an old Nine Inch Nails concert setup myself (somehow that seems tailor-made for a performance of “The Hand That Feeds”), though I’m sure the GOP set will be even MORE ridiculously garish, somehow.

The reason why the conventions still hold the imaginative power they do, for all their lack of anything substantive, grows out of their place and reputation as make-or-break spots for candidacies — two books I highly recommend have a convention as a key event. The first, Kenneth Ackerman’s stellar Dark Horse, is hands down one of the finest popular studies of American history in recent years, covering as it does the candidacy, election and assassination of James Garfield, and in doing so fully fleshing out the mass of cross-currents and political influence among many different figures in politics of the day that has since been obscured. The details of how Garfield ended up winning at the Republican convention in 1880 — thus the title of the book, since his victory was far from foreseen — are fascinating stuff, very undemocratic in any sense of the term, but everything to do with gladhanding, patronage and much more besides.

The second shows how, eighty years later, the function of the conventions was already starting to turn into something else, though they were still the place where everything finally got settled in the back rooms. Theodore H. White’s The Making of the President 1960 has long been held up as a model of an election-year study, even as time has readily demonstrated White’s own clear subjectivity on many points. Much about the book still recommends itself, though, not least that it was a study of Kennedy and his organization and approach before his demise and almost reflexive canonization; as such, its account of the Democratic convention and how the Kennedy crowd assured his nomination in the end, if popularized, is nonetheless key reading, especially as a record of an era before the current superdelegate oddities for that party.

What seems to have been the last actual ‘all the way to the convention’ nomination battle in recent times, the 1976 Republican race, occurred when I was alive but unaware of what it was all about (hey, I was only five), and since then my encounters with the conventions have all been about a lot of well-meaning cheering and dullardry. (I vaguely remember that the 1984 Democratic convention had Hart sticking it out to the end, but Mondale was well in the lead anyway.) I think it’s somehow appropriate that one of the last things I remember from any convention was a speech by an up-and-coming young politician that helped bring him to wider attention for his own successful nomination four years down the road. But the parallels with Obama end there, because Bill Clinton’s speech at the 1988 Democratic convention was one of the most singularly boring things I’d ever seen.

It ran about thirty-five minutes or so — and if it was on YouTube I’d link it, but not surprisingly it’s nowhere to be seen — and I just remember thinking “Who IS this guy? And why am I watching this?” I remember hearing some audience restlessness as it went, and very distinctly remember him saying something like “And in closing…” and then having to stop because of the huge amount of cheers that these words evoked. He looked a little pained.

That said, I also half remember that he ended up as a guest on Johnny Carson or the like shortly thereafter precisely because of this flop of an appearance and was able to poke fun at himself and go on from there, leading of course to a more successful series of speeches and more four years down the line. (Not to mention a more famous talk show appearance from that time as well.) Not a burden Obama has to deal with, at the least.

WordPress.com Political Blogger Alliance