Two Things That I Like But That Are Now Dead

The first thing that I like but that is now dead is The Simpsons.

(This isn’t really going to be ‘about’ The Simpsons in the end, so if you don’t particularly care about The Simpsons just replace “The Simpsons” with “[other popular American TV show]” every time you come across “The Simpsons,” and overall the point of most of this should still make sense.)

The Simpsons is my favorite TV show ever. I have seen every episode like three times, and there are like 500+ episodes. Do the math. I own seasons 1-13 on DVD. I own a ~1,000 pg. hardcover episode guide. I own another shorter paperback episode guide that has mostly the same material as my hardcover episode guide, for no good reason. I have lots of Simpsons comics. I have Simpsons Monopoly. I have Simpsons Clue. I know how to pronounce “Matt Groening.” I was Bart for Halloween once. Etc.

A week or two ago The Simpsons aired its 500th episode. It was really bad. For a hardcore Simpsons fan like myself, it was even depressing. The episode ended with this ‘cute’ tidbit:

Well, you ‘got me,’ Simpsons: I am now talking about how much your 500th episode sucked on the internet. (But I did get some fresh air before writing this—Wednesday was really sunny and warm and I spent most of the day in the arb.) Another ‘cute’ tidbit: note the “the most meaningless milestone of all!” tagline in the first image from the episode’s opening credits.

Without even going into detail about how exactly the 500th episode was bad and unfunny (and it was indeed VERY BAD AND UNFUNNY), just by pointing out these two ‘cute tidbits’—which aren’t really just ‘cute tidbits’ but are in fact a sort of insidious televisual rhetorical sleight of hand (or so I’ll argue in a sec.)—just with that, I can tell you how I know The Simpsons is no longer a legitimate piece of televisual art and is now just a sort of pathetic, desperate, please-like-me-why-don’t-you-like-me…TV…thing.

TV generally gets a bad rap (deservedly, perhaps), but at its best I consider it a ‘legitimate form of art.’ For a while—like seasons 1 through ~12—The Simpsons was one of the best pieces of televisual art. It was the quintessence of ‘sitcom.’ It was the apotheosis of TV. Its writers were basically better, smarter than 51% of the writers I now read as an English-B.A. candidate. I’ve learned more from seasons 1 through ~12 of The Simpsons than from semesters 1 through 6 of my undergraduate education. That might be inaccurate. But I’ve definitely learned more about good writing—like narration, plot, characterization, puns, etc.—from The Simpsons than from any undergrad English class. That’s not hyperbole.

But the way I can now tell, officially, that The Simpsons is dead as a piece of art, that it’s now just a garden-variety bad-rap-getting primetime-TV P.O.S., is that it’s stopped simply, funnily dealing with plain-ole’ middle-class familial affairs and started playing a sort of convoluted rhetorical game in which it tries to convince you that you’re sort of a P.O.S. for watching The Simpsons (e.g., “get some fresh air why don’t ya!”) and that The Simpsons is a sort of P.O.S. itself but that you should laugh ‘with’ it about how much of a P.O.S. it is. Or, that it’s not really a P.O.S. because it knows it is now, at episode 500, a P.O.S. Or something.

What The Simpsons’ cute shots at itself (“Log online! Make fun of us!”) do, really, is set up a rhetorical situation in which it is impossible for you to simultaneously criticize them seriously and intelligently and for you to remain a serious/intelligent art-viewer/person—it makes these things mutually exclusive. The Simpsons wants it to be impossible for you to hold them accountable for being a P.O.S. (read: w/ double entendre, “point of sale” and “piece of human refuse”).

If you think The Simpsons isn’t trying to convince you that it hasn’t turned into a P.O.S. because it is itself ‘candidly admitting’ that it is in fact a meaningless excremental hunk of TV—and even admitting that it knows it’s a meaningless excremental hunk of TV—with cutesy posturing like “most meaningless milestone of all!” and “logging onto the internet and saying how much this episode sucked,” don’t. Don’t let the posturing fool you: these ‘admissions’ of suckiness are Psych101-style reverse-psychology defense mechanisms. They’re rhetorical moves. Most of all, they’re B.S. The Simpsons has turned into, like, that fat kid in fifth grade who made fun of himself so he seemed okay with his being fat and seemed totally not sensitive about it and seemed to totally not care that his friends called him a ‘lard-ass’ every day but who actually went home every day feeling terrible and was actually totally sensitive about his weight and cried alone in his room every night while eating ~6 Twinkies and feeling just downright miserable/downright full of Twinkies.

Self-deprecation as a rhetorical move implicitly shields against any real, bona-fide criticism. Think of it this way: If the fat kid calls himself fat and then you call him fat afterwards, your insult loses a lot of its sting, now doesn’t it. It seems pointless. It seems dead-horse-beating-y. You seem like a less intelligent human being for redundantly beating fat dead horses unoriginally redundantly. You seem to lack wit. Likewise, if The Simpsons calls itself sucky before you can log onto the internet and call it sucky, then you’re backed into a sort of rhetorical corner if you want to air some serious criticism. Because The Simpsons has beat you to the punch. Why would you criticize something that’s already been effectively criticized? Sure, you can go ahead and voice your criticism anyway, but the response The Simpsons has set you up to be met with is something like “we all already know The Simpsons is sucky now; even The Simpsons calls The Simpsons sucky now; stop stupidly saying unoriginal things we all already know already, stupid.”

The Simpsons and the fatty both want to look like pachyderms; they want to look like they don’t care whether you like them/think they’re overweight or not. (Except the fatty probably doesn’t want to look like a pachyderm, in terms of size…) The universal definition of ‘cool’ seems to be something like ‘nonchalance’ or ‘indifference’ or ‘not caring,’ so it’s easy to understand what The Simpsons and the elephant-sized kid are going for by affecting anti-/apathy towards themselves and their artistic/cardiovascular health: They want to be liked. They want to be cool. They want to be popular. And to be cool and to be popular, they have to convince you that they don’t particularly care about being cool and being popular.

Why? Because I can guarantee you that your idea of ‘cool’ is somehow related to ‘nonchalance’ or ‘indifference’ or something like that. What’s more cool than someone who doesn’t care about being ‘cool’? What’s more uncool than someone who seems to live and die by your judgment of them? Ergo, if something seems to not care about itself or how you will judge it, you will judge it better. It’s a sort of paradox: the only way to be judged well is to disregard how well you will be judged.

So: in order to be judged well, The Simpsons pretends to not care about being judged unwell.

But of course, The Simpsons truly does care about what their fans think—to drop the personification for a second, let’s acknowledge that what ‘The Simpsons’ is really is a bunch of human beings who probably work hard at their TV jobs and who are probably basically decent people who care about the quality of what they produce and wouldn’t feel good, about themselves as human beings, if they mostly produced televisual excrement. In fact, I know they care because why else would they dare you to criticize them? Think about it—if they didn’t care about being criticized, why would they even bring it up, why add the little endnote? Because they just ‘don’t care’ SO MUCH that they, like, HAVE TO let you know about how much they don’t care? Seems unlikely. There’s a difference between genuine coolness/nonchalance and affected indifference.

When The Simpsons says, “Go ahead; criticize me,” I see not a cool, indifferent piece of bona-fide art but a scared, desperate-to-be-liked prepubescent fatty. I see a sales pitch: “Like me because I’m cool.” “I’m cool because I don’t care if I’m still cool at age 500.” “Being a nerd who posts 1,000+ words online about not liking me instead of going outside or something is nerdy and uncool.” “Watching ‘meaningless’ TV and getting excited about a ‘meaningless milestone’ for your favorite TV show is dumb, RIGHT? *wink**wink*.” Etc. It seems to me that one of the big differences between a piece of bona-fide art and a piece of commercial crap is how much effort the thing apparently devotes to selling itself. The Simpsons used to not need to sell itself. It used to just sell other things—like Butterfingers. But now it’s such a bad show that, instead of convincing people to watch it by being good/funny, it convinces people to watch it by half-assedly reverse-psychologically convincing them not to hate it.

None of this is really only pertinent to one specific episode of The Simpsons, by the way. I just used the 500th episode of The Simpsons for this post because it was temporally and personally relevant. But all TV’s been making fun of itself for decades. E.g., there’s the ’80s Married…With Children, which has been described as “a sitcom-parody of sitcoms” (i.e., a sitcom that makes fun of sitcoms). For another example, there’s televisual demigod David Letterman, whose jokes’ butts are usually his show/himself.

I’ve never actually watched Letterman or Married…With Children, because I am less than one-hundred years old, but the point is The Simpsons wasn’t the first TV show to try what I would term ‘the self-deprecating-fat-kid technique.’ Other shows have done it, and have done it better.

The second thing that I like but that is now dead is David Foster Wallace.

David Foster Wallace was an American author. He killed himself not too long ago, sadly. His ‘birthday’ was a couple weeks ago, around the time The Simpsons’ 500th aired. One of the things David Foster Wallace wrote was an essay about U.S. television and irony, from which I’m admittedly pulling ~90% of my ideas here (not to mention the examples of Married…W/ and Letterman). When I saw The Simpsons pull a ‘self-deprecating-fatty,’ I immediately thought of David Foster Wallace. I thought of Wallace quotations like:

“And make no mistake: irony tyrannizes us. The reason why our pervasive cultural irony is at once so powerful and so unsatisfying is that an ironist is impossible to pin down. All U.S. irony is based on an implicit ‘I don’t really mean what I’m saying.’ So what does irony as a cultural norm mean to say? That it’s impossible to mean what you say? That maybe it’s too bad it’s impossible, but wake up and smell the coffee already? Most likely, I think, today’s irony ends up saying: ‘How totally banal of you to ask what I really mean.'”

And:

“It’s of some interest that the lively arts of the millennial U.S.A. treat anhedonia and internal emptiness as hip and cool. It’s maybe the vestiges of the Romantic glorification of Weltschmerz, which means world-weariness or hip ennui. Maybe it’s the fact that most of the arts here are produced by world-weary and sophisticated older people and then consumed by younger people who not only consume art but study it for clues on how to be cool, hip–and keep in mind that, for kids and younger people, to be hip and cool is the same as to be admired and accepted and included and so Unalone. Forget so-called peer-pressure. It’s more like peer-hunger. No? We enter a spiritual puberty where we snap to the fact that the great transcendent horror is loneliness, excluded encagement in the self. Once we’ve hit this age, we will now give or take anything, wear any masks, to fit, be part-of, not be Alone, we young. The U.S. arts are our guide to inclusion. A how-to. We are shown how to fashion masks of ennui and jaded irony at a young age where the face is fictile enough to assume the shape of whatever it wears. And then it’s stuck there, the weary cynicism that saves us from gooey sentiment and unsophisticated naivete. Sentiment equals naivete on this continent…”

Those two quotations are two things I really like.

2012: The Year of the Flop

Broadway flops were brought to national attention with Mel Brooks’ film The Producers, in which a producer, Max Bialystock, and his accountant Leo Bloom set out to produce the worst Broadway show possible.  The idea is that it is possible to make more money with a flop than with a hit.  Of course, their master plan crumbles quickly when the show, Springtime for Hitler, enjoys unexpected success.

There have been many composers and lyricists who have accomplished Max and Leo’s goal without even trying.  This year, two of these enormous flops are receiving eagerly-anticipated revivals in New York.  There is an entire subset in theatre culture that is obsessed with big flop musicals.  There was even a book written about these shows, called Not Since Carrie by Ken Mandelbaum.  The title refers to possibly the most famous flop, and one of the two musicals that is being remounted, Carrie.  Yes, that Carrie.  It is a musical based on the novel-turned-movie by Stephen King about a troubled telekinetic teen.

When Carrie was first on Broadway in 1988, it ran for a total of 16 previews and 5 performances.  On this blog, I have often alluded to the innate smallness of the theatre community.  Word of mouth alone seems to have ruined this musical.  It was doomed from its out of town try-out in Stratford, where the public was up in arms that the Royal Shakespeare Company was putting on this show.  Unlike most shows, as Carrie moved forward in development and got closer to its Broadway debut, it got worse.  The revisions and directorial choices weakened the show.  Rather than juxtaposing Carrie’s inner angst and outer powers with a relatively normal society, the show put Carrie’s classmates in costumes that resembled Grecian goddesses and workers at a leather bar.  Carrie’s powers were barely even hinted at, so by the Act One finale, when her hands were literally on fire, members of the audience unfamiliar with the movie or book were perplexed.

While the show failed commercially, it immediately became a camp classic. Secretly recorded video and audio circulated through the theatre community, and those who were not at the show wished they could have witnessed the wreck. Even now, some salvaged clips remain on YouTube for those fans who never thought this day would come. The day they can finally see Carrie for themselves. Mandelbaum says that the thing that separates Carrie from the other flops in his book is that it was so hot and cold as far as quality is concerned. The mother’s big ballad, “Open Your Heart,” is still held up as a beautiful piece of music, but there was also a song (which the 2012 review hints may have been cut, or at least pared down) about the pig slaughtering, with genius lyrics like “kill the pig, pig, pig” while oinks resonate through the theatre. And did I mention the show ended with Carrie ascending an enormous stairway to Heaven?

MCC’s production of Carrie was successful before it even began.  This is the definition of a cult musical.  You go to be able to say you went.  There are those who are avid fans of the pirated cast recording from the 80s, but many audience members just want to see what everyone has been talking about for the past nearly 25 years.  The New York Times review praises Marin Mazzie who plays Carrie’s religious fanatic mother, but otherwise gently condemns the musical.  Many positive changes have been made since the 1988 disaster, but it seems that Carrie is just not one of those shows that works.  Regardless of reviews and actual merit, the production has already extended its limited run an extra four weeks.  Everyone wants to see the world’s most famous flop.  I know I would.

The second revived flop of 2012 is a little more tragic. Merrily We Roll Along was a show written by Broadway God Stephen Sondheim and originally directed by the equally terrific Hal Prince in 1981. The show followed the professional and personal journeys of three best friends backward from 1980 to 1955. This show had a bit more success, but faced about the same odds as Carrie. It ran for 52 previews and 16 performances. And unlike Carrie’s creators, Sondheim and Prince made significant improvements to the show during their time on the Great White Way. However, word had already gotten out and the show was really doomed before it began. There are tales of walk-outs from the first preview, audience members who left completely confused about what story they had just watched and who the characters on stage were. After closing on Broadway, Sondheim continued to make revisions to the show with librettist George Furth. So when the show opened at New York City Center through its Encores! program this year, it was a very different show from the one at that first preview in 1981.

However, it still fell flat. This production had everything: a stellar, super exciting, young but not brand new cast; a great concept; talented musicians, directors, artists. But it was still missing something. According to reviews, this seems to be one of those shows that people want to work but that just might not be possible. Personally, my Merrily cast recording is practically worn through, I’ve listened to it so many times. This is a show I want to see succeed. But I’ve also never seen it in full production. Critics acknowledge that the music is beautiful. It’s Sondheim, for God’s sake. Unlike Carrie, whose writers were not entirely inexperienced but certainly did not have the breadth of experience Mr. Sondheim has, Merrily has everything going for it, except for the show itself.

Sometimes there is just a disconnect between what should work and what does. That is one of the most terrifying and exciting parts of art: seeing what works. These are just two examples of when things went terribly wrong. But now they have a whole community willing to embrace them and spend hours online developing fan sites, sharing bootleg footage, and devoting all sorts of time to the shows that didn’t quite make it. I find flop culture quite fascinating. I first got into it when dramaturging a show called [title of show]. One of the characters, and the writer himself, has a collection of Broadway flop Playbills. There is a song that names probably fifty flops, and it was my job to look up each title and find as much information as possible. It’s an interesting phenomenon, especially in this age of super commercial, decade-long running shows, that there are still these fleeting pieces of theatre that run under 50 performances and are never heard from again. Carrie and Merrily are the lucky ones, but when will the world hear from Buck White again? Probably never.

And then I think…how did we let Cats happen for so long?

A Wolverine Abroad: A church? In Italy? No way!

That’s right folks, I finally went somewhere while being here. What can I say, it costs money to travel. But, as I sit here eating Nutella and biscotti, I realized that I forgot to write yesterday. I know how important I am in all of your lives (note the desperation in my voice) so I wanted to fix that. As part of my Art History class here, we went to Padova to see the Scrovegni Chapel, the interior of which is completely decorated with Giotto’s frescoes. These works reveal a lot about the style of the time, the skill of the best artists around, and the politics circling the medieval world. I would also like to get this out of the way before introducing the subjects, just as a disclaimer: I don’t really like early Middle Ages art and all of the stuff up through the Gothic period. I enjoy learning about it, about the different developments and styles that were happening, about the masterpieces and great artists. But as art, it is not pleasing to me. I find the use of gold very gaudy and I prefer profound allegory to obvious symbolism. Whew! Now that we have that out of the way I can tell you about how I respect these works for what they are.

This chapel, painted in 1303, was donated by Enrico degli Scrovegni. His family, guilty of being usurers (loan sharks), was in danger of eternal damnation, and so, to save them, he built this chapel in the name of his late father. In order to show that he was repenting for his sins, he had Giotto depict a series of stories and symbols related to sin and redemption, and he placed the opposing symbols of vices and virtues across from each other.

The real masterpieces, however, are those that line the upper portions of the walls. There are three series of frescoes. They follow three Bible stories: that of Mary’s parents, that of Joseph and Mary, and that of the Passion. These frescoes were made in the period of Giotto’s maturity, and for their time they are true masterpieces and incredible proof of the development of depth and light in art styles.

Though linear perspective had not been invented yet, it is possible to see how Giotto had learned to show depth. It may not be perfect, but you are able to see how the hands on the chapel in the first picture appear behind and in front of the house. Or, in the second picture, how the outstretched arms of the lamenting man actually depict space. I know this sounds kind of silly, and it kind of is, but for this period this was new and incredible. The idea of real space and true forms was becoming popular, and artists like Giotto were the first to master these concepts. Movement also became part of the style, which you can see in most of the figures, and especially the angels flying overhead. Trust me, these were the new big thing. Although he is not my favorite, I do like the fact that he stopped using so much gold in everything. Thank you, Giotto!

And that’s enough of your art history lesson for today. I’m wondering, do you think food counts as art? I think so. I think for next week I’ll try and find some really interesting and artistic food to write about. It is Italy after all. I eat all the time here. It’s great, but it’s getting ridiculous!

Ciao ciao!

Danny Fob

Your Wolverine Abroad Blogger

Alarm Will Sound

Happy March everyone! And happy end of spring break! My spring break was wonderfully productive and was in no way filled with dormancy.

As we all make the sad, sad transition back into school, I want to talk a bit about my favorite topic, the world of classical music. But! Before you stop reading and write me off as an old-fashioned, boring old guy in a suit who goes to the opera and comments on the divinity and “high class” of the music he’s listening to, I want to talk about where classical music intersects with popular music, where classical music is TODAY, and why YOU should care. Because you should. It’s cool stuff.

So I want to start with how I got into this world of contemporary music by highlighting one of my favorite groups, a new music ensemble called Alarm Will Sound.

Alarm Will Sound could be called something like a “chamber orchestra,” if you wanted to give it a name. It has string, woodwind, brass, and percussion instruments, all arranged around one conductor. It’s a “chamber” group because it has so few members: only one or two musicians per part (compared to a full symphony orchestra, which has quite a few musicians playing the same part…). All the musicians in Alarm Will Sound are classically trained—all have graduated with at least one degree in music. In fact, they started as a group of students at the Eastman School of Music. But what makes Alarm Will Sound different, in my mind, is their willingness to explore and be open to all kinds of music. As it says in the intro video above, AWS’s members have experience in all kinds of music, from world music to jazz. And this shows in their output: in 2005 they released an album of Aphex Twin covers, called Acoustica. Go back and read that again. Cover album. Aphex Twin. Y’know! The electronic artist, who made tracks like this…

and Alarm Will Sound’s cover?

This is a “classical” ensemble that sees electronic music as just as valid as anything that has Beethoven’s name on it. It’s such a wonderfully forward thinking strategy! And the things they produce are just really cool.

They also came out with another album in 2009, called A/Rhythmia, a project highlighting rhythmically challenging works. And this is where I first discovered AWS and fell in love with their music. In particular, I really dug this piece, called Yo Shakespeare by Michael Gordon….

But before I start to melt into uncompromising praise for a group of musicians I’ve never seen live, I’ll just take note of what an ensemble like AWS says about the current state of music. To me, it says that there are no longer any boundaries. Genre is a meaningless term. All music is good music, and all music deserves a listen. So check out this “classical” music and see why I love it so much.

And I’ll leave you with one last video, Alarm Will Sound’s performance of Paul Dooley’s Point Blank. Paul is a doctoral student at the School of Music, Theatre & Dance. Go Blue.

Illness in Art

Illness, whether a mental or physical debilitation, has been the subject of countless works of art throughout history. It has been pictured scientifically, religiously, sympathetically, heroically, and in any number of other variations. Within the artistic discourse, the ways in which illness is depicted reflect historical stigmas as well as broad human emotions. Much of what we know about society’s responses to illness, like the Plague, is documented in art, but art is also often used to appeal, as in works by Picasso or Basquiat, to universal distress. To examine this, works concerning illness spanning several centuries will be analyzed, as well as texts related to art and illness and to artists who suffer from illness themselves. In order to do this, it is important to look at these works of art comparatively, thus many works will be compared to others in their own time period and in other eras. Illness is complicated through art because art can take something fairly scientific and objective and turn it into something subjective, even propagandistic, or simply reflect it back objectively.

Though art traces its roots back much farther than the Middle Ages or the Renaissance, some of the most terrifying and prolific images of illness came from these eras. An example of this is Pieter Bruegel the Elder’s Triumph of Death. The “Triumph of Death” (or the “Dance of Death”) motif was a common one, arising from medieval times with religious reinforcement. In the face of the Black Plague, this theme was given ample commissions by the Catholic Church as a memento mori, reminding the public of the pains of Hell and the rapidity with which death can come. The hysterical lust for repentance during the outbreak and spread of the Bubonic Plague reflected the religious fervor that gripped Europe, and this is partially why the Triumph of Death’s savage depiction of illness is important documentation. In Bruegel’s painting, finished circa 1562, Death is seen ravaging every social hierarchy, from peasants to emperors. Some “Triumph of Death” works from this period even went as far as to include Catholic bishops among those being cut down by Death (represented by skeletons). In relation to illness, this painting shows the intense religious reaction to a fear of sickness. Through depicting illness and death, Bruegel examines people’s frantic and desperate desire to escape the inevitable, though not without religious propaganda.

In the same vein as Bruegel’s piece, artists continued to use their talents to the liking of higher authorities. The subject of illness and death seemed to be a point of supreme sympathy or relatability for the masses, because it continued to be the center of many works commissioned by governments or churches, possibly because of its ability to tap into the fears of every person; dying is inescapable for everyone, and you’ll be lucky if you don’t suffer greatly while doing so. In the 18th century, unlike its state-commissioned predecessor Rococo, Neoclassicism began to use illness to illustrate “civic virtue” in relation to the rise of Republicanism in Paris. Themes included bodily sacrifice for the state, as in Drouais’s The Dying Athlete, David’s The Death of Socrates, or Regnault’s Liberty or Death. It was considered a great honor to die for the Revolution, and those running it spent lavish amounts of money to propagate this ideal. Jacques-Louis David even proposed parading the decomposing body of one of the Revolution’s “martyrs,” Jean-Paul Marat, through the streets of Paris in the bathtub he was murdered in (though the body decayed beyond recognition before this could be carried out). Unlike the previous religious implementations of illness in art, the Neoclassic and later Romantic usage was meant to present a choice: a dishonorable death without Republicanism, or a heroic death for it. However, both religious and Jacobin propaganda stressed unwavering devotion. French Revolution artists would have been fools not to draw from Christian artwork, though; centuries of blind Catholic devotion in France were a resource widely tapped by the likes of David and Ingres. There was already the perfect model of martyrdom and illness: Jesus. Probably the most famous work to come out of the 18th century was David’s Death of Marat, styled after countless Pietàs, most famously Michelangelo’s. Along with the death motif, the painter made a point of displaying Marat still in his bathtub. Marat spent most of his time in that bathtub because he had a very sensitive skin condition, something that increases the perception that this was a cruel murder of a helpless victim. This shows how, among other things, illness can be manipulated and exploited in art.

Toward the middle of the 19th century, state-commissioned art began to disappear. With the restoration of the Bourbon Dynasty in France, less money went to funding political art, in an attempt to disassociate the regime from the still leftist and revolutionary artists prevalent at the time, like Delacroix or Daumier, who was imprisoned briefly for unkind caricatures of Louis Philippe in the form of a sickly, rotting pear (which also caused a nationwide ban on any depictions of pears). A new bourgeoisie was also in place, one offended by the growing influx of peasants drawn to the city during the Industrial Revolution. This influx brought unclean and unhealthy living environments, a fantastic rise in prostitution, and an artistic desire to depict these realistically, hence Realism. Emile Zola, an ardent supporter of Impressionist Realism, remarked upon hearing outcries from the upper classes over Manet’s masterpiece Olympia, “Why not be honest?” Olympia, a sardonic response to Cabanel’s saccharine Birth of Venus, shows a thin, pale, and bold prostitute lacking the typical voluptuous body that was associated with beauty. This pursuit of the real, an unglorified view of the sickness the Parisian poor were experiencing, became a fixture in Impressionism. This was done, almost to a grotesque point, by Degas in his sculpture Little Dancer of Fourteen Years. Called by one critic the “Flower of the Gutter,” the piece disgusted critics with its bony, often speculated as anorexic, ballerina. His original Little Dancer of Fourteen Years was not a bronze cast like the current ones but a wax sculpture with genuine human hair, displayed in a glass case. Unlike today, displaying a sculpture in a glass case was not common practice for artists in the 19th century and was perceived as a reference to medical displays.
This was a scandal; prior to Impressionism artists did not attempt to exhibit realistic interpretations of the lower classes. Full-scale paintings were previously reserved for heroic battle scenes, aristocrats, or works of civic virtue, but to devote them to the flâneurs and prostitutes of Paris, and not kindly done, was an outrage. Other than class relations, Realism changed how illness was dealt with. It was no longer used as a tool to instruct or cause fear in the masses, but was instead a point of power and change in shocking the bourgeoisie.

Anthropogenics

There are upon this earth a great number of places where there is beauty to be found in the intersection of the human and the natural. There is urban decay, where manmade structures are gradually reclaimed by the elements, the thing around which urban exploration revolves. There is the landscape redefined, modernized, the urban landscape. And then there is something else, something that encompasses that and more, something that is in a way their opposite, at least as a manner of perception. The anthropogenic landscape is one that seems now natural to us. Of course everything has to some degree been touched or altered by human actions; the pristine is now rare, valued, and, in many cases, on its way to becoming commoditized.

Where man has been, he tends to leave an indelible mark (though that too is subject to temporal perspective). It is often on the overlooked fringes of ordinary civilization and in its in-between spaces that the most unexpected things might be found. It’s the sparsely populated stretches between cities and towns, it’s the desolate-looking land peppered by isolated industrial complexes. It’s the pin-straight lines of a planted forest, the threads of roads snaking across the desert, the carven bulk of a terraced mountainside. Or perhaps, on a smaller scale, it’s where tire tracks appear in gravel or a cluster of rubbish bins sits in a field, where a house sits perched on the top of the bluff.

Anthropogenics curates images of these places, “depicting the human-made, human-marked, post-natural, contemporary landscape,” framing ordinary scenes in such a way as to make them appear to be more than that. Its collection, it says, is “borne of the belief that ‘pretty’ landscapes lack interest… the appeal of landscapes and photographs of landscapes is in the ways in which humanity has altered, or even created, them, not the ways in which we find them pleasing to the eyes.” While it might not be entirely fair to say that the unaltered landscapes conventionally prized for being aesthetically pleasing “lack interest,” it is undeniable that they receive far more attention than the human-affected. Pristine alpine meadows and city skylines alike have been much photographed in their many iterations. What lies in between often goes ignored, which Anthropogenics seeks to remedy.

The public contributes images through the site’s Flickr pool, where more can be found.