Beauty and Control

The Garden of Eden was a beautiful paradise, abundant in fruit and peaceful creatures and perfectly manicured. But when it was lost to evil forces, it grew long weeds and was consumed by nature. Following this logic, Puritanism holds nature as a place of evil and a home to devils. Nature is something to be conquered and controlled. By uprooting forests and cultivating gardens that conform to human visions, we create beauty. From a Puritan’s perspective, beauty lies in refined control; the uncontrolled is wicked and hideous. Nathaniel Hawthorne exemplifies this belief in his tale “Young Goodman Brown.” A “good” young man ventures into the forest one night and meets the Devil. He loses his innocence upon entering a place deemed unholy by his community. The beauty of his innocence is lost in exploring nature.

The interesting quirk of Hawthorne’s story is that young Brown is not the only member of his community in the woods. Dozens of familiar faces are participating in sinful activities, running amok in the uncontrolled wilderness. Young Brown comes to realize that the people who seemed so refined in town actually indulged in their wild side. The beauty he once knew became tarnished. But perhaps a new beauty arose?

Henry David Thoreau, in his famous Walden, describes the wilderness as a place of remarkable beauty. To Thoreau, the forests were a place of God, not the Devil. Transcendentalism holds nature in high regard, as a beautiful and wild place. Although it is uncontrolled, it is gorgeous. The lack of human intervention–the lack of control–perhaps made it so.

Setting spirituality aside, Puritanism and Transcendentalism represent two distinct positions on the relationship between beauty and control. For Puritans, beauty results from control. For Transcendentalists, beauty arises in the absence of control. Which side defines the relationship today? While Transcendentalism is the more contemporary belief–as demonstrated by conservation policies, the National Park system, and /r/EarthPorn–numerous components of modern life contradict this idea.

So, what is beautiful?

Natural beauty–lakes and mountains and forests–is often an icon of wondrous allure that we claim to appreciate. But does that not contradict our affinity for synthetic beauty? We wear makeup daily, go on diets to cultivate an ideal body, and marvel at accomplishments of human invention–skyscrapers, artificial intelligence, and sports cars. We admire the intricacies of watchmaking: the controlled and precise machines are beautiful. We hold our autonomy over our environment in high regard, yet the wonders of nature can leave us breathless. Search for “beauty” in Google Images and you see dozens of white women wearing makeup alongside colorful natural scenes.

This dichotomy is interesting. We hold contradictory views of beauty and control, and there is no consensus. We cannot control the Sun, yet it is a life-giving beauty. Our natural skin tones, hair textures, and personalities are what make us most beautiful. But there is also beauty in what we create–from the aesthetics of a smartphone to a symphony orchestra. Beauty and control are a two-way street.

We admire that which we cannot control and marvel at that which we can.

Creative Self-Destruction: Axiom or Oxymoron?

The only people for me are the mad ones, the ones who are mad to live, mad to talk, mad to be saved, desirous of everything at the same time, the ones who never yawn or say a commonplace thing, but burn, burn, burn like fabulous yellow roman candles exploding like spiders across the stars.

~Jack Kerouac

I have previously discussed the classic rock star image of Jim Morrison, an artistic career eclipsed early by drug dependence. This contemporary example has parallels across the artistic canon, from Vincent van Gogh to Edgar Allan Poe. These artists not only share an inner despair but also seem to draw inspiration from that despair. Is it fair to draw a correlation between artistic creativity and self-destruction–or is this a fallacy?

The connection between creativity and self-destruction appears across cultures and eras. In ancient Hindu tradition, for example, the god Shiva the Destroyer is also the god of art. Interestingly, Shiva is part of a trinity of gods who epitomize the existential cycle of the universe–yet it is Shiva the Destroyer, not Brahma the Creator, whom Hindu tradition has placed on the cultural pedestal of artistry.

The Tantric school of Hindu theology regards Shiva and his wife Shakti as two sides of the primal energy that constitutes life–the dichotomy of potential and kinetic energy, fuel and flame. Hence, in order to create light and energy, something must burn.

From the Western philosophical tradition, Georg Hegel posits a negative vision of imagination as “the night of the world”–a psychological ability to deconstruct the phenomena of perceived reality into new forms within the mind. Rather than create new forms of reality outright, the mind deconstructs what it has seen in order to reconstitute, or even lay bare, the reality presented before it.

The more recent cultural critic Slavoj Žižek cites Hegel to argue that the act of symbolizing an idea is rooted in a death drive: a desire to replace the object and transmute a piece of the author’s own life force into the symbol. The death drive draws on an inner psychological impulse to reject the stagnant cultural traditions surrounding the author in favor of forms of expression heretofore nonexistent–an act so transgressive that it is perhaps necessarily self-destructive.

Moreover, philosophers such as David Hume have long argued not only that the artist is predisposed toward self-destruction, but also that audiences tend to prefer tragic art–the paradox of tragedy.

Miranda Sings: Alter Egos and Women in Comedy

In case you hadn’t noticed, I love Jimmy Fallon, and during midterms week I may have slightly overdosed on YouTube videos during study breaks…or instead of study breaks. Oops.

But no, I’m not going to talk about Jimmy Fallon yet again; he was merely the mechanism for how I found out about my current topic.

Sasha Fierce. Lemony Snicket. Gorillaz.

What do all three of these things have in common? It’s not music, because Lemony Snicket isn’t a musician; he’s an author. At first glance it may not be obvious, but when you think about it, they all do have something in common.

They are all alter egos. Think back to when you were a kid, reading A Series of Unfortunate Events (or, if you’re like me, you were probably reading them in the recent rather than distant past). Do you remember how the mystery of who Lemony Snicket actually was intrigued you? Do you remember wondering if this was actually a true story because the narrator was so convincing?

I don’t know what it is about alter egos, but they always seem to fascinate me, especially when they reach a certain level of dedication. When I met “Lemony Snicket,” or rather Daniel Handler, I was fascinated by his willingness to play with this alter ego to entertain all of the kids sitting in front of him on the carpet of the library we were in. And I was thrilled when I walked up to have my book signed by him, only to get witty sarcasm and a note in my book that said “Jeannie! Hi! How are you? Me, too.” Alter egos are simply fascinating to me.

Which is why, when I first saw Miranda Sings playing Pictionary on Jimmy Fallon, I became mildly obsessed with her.

The skit is hilarious, but where Jerry Seinfeld and Martin Short were obviously making jokes, Miranda was not. She was withdrawn, and yet I found her the best part of the skit. Instantly I looked her up on YouTube, where most of her audience comes from. I scrolled through the videos, and though I didn’t realize it right away, I intuitively knew that this wasn’t a real girl–this was a character, and there was a “real” Miranda somewhere.

But I couldn’t find her real YouTube. If you’re familiar with the way YouTube-famous people promote themselves, you’ll know that typically the YouTuber will have the “famous” channel, the channel for skits and parodies and music videos, and then will have a separate channel for behind-the-scenes content as well as personal vlogs for those who are interested. This is meant to separate the two “lives” of the YouTuber in a way that TV and film rarely do – it separates the creator from the creation, pulling the curtain back and showing the audience that yes, these are real people rather than just funny script writers/actors. So as I scrolled through Miranda’s videos, I tried to find a link in the description for the real Miranda channel, the one that isn’t playing to the camera. There was none.

I tried the website, figuring in some small part there had to be a note that said “Miranda Sings is the creation of Miranda Smith, an actress from Atlanta, Georgia” or whatever. There was none. Her entire YouTube channel was completely in character, and her bio was simply her character talking about herself (like she does on YouTube). There wasn’t even a hint for who she was.

This intrigued me further. It’s one thing to have an alter ego, like Sasha Fierce. But there wasn’t a whole lot of mystery there; Beyoncé was still Beyoncé, and she just became Sasha for a short time. Miranda, on the other hand, seemed to do everything in character, purposefully keeping her true identity a secret.


Unfortunately, after about five more minutes of searching, I typed “Miranda Sings” into Google and one of the suggestions read “Miranda Sings real name” and the first result that came back was a video by Colleen Ballinger entitled “Becoming Miranda Sings.”

As you can probably guess, this cracked the code, although I still found her video hilarious, as she keeps the character a mystery even there. At the beginning, Colleen claims she and Miranda are “good friends,” and once she “becomes” Miranda Sings, she says, “Colleen who was in the beginning of this video with me will be in my shows with me,” referring to the Colleen/Miranda comedy tours she takes.

The mystery was solved, and I began watching Colleen’s videos, finding her to be a lot more tolerable than the…um…special Miranda.

And yet, I’m still willing to believe in the mystery behind the ego. I know who she is now, but that doesn’t ruin Miranda’s videos for me. In fact…it makes me like her more.

As I was watching Miranda videos, looking at comments on the Jimmy Fallon video (Miranda’s first big television debut), and thinking about her “acting” with Jerry Seinfeld, I not only gained respect for her as an actress/comedian, but also started thinking more about comedy than I ever had before.

I knew that comedians like Tina Fey and Amy Poehler often talked about the gender inequality in television and media as a whole, but I never stopped to think about women in comedy because I never wanted to be in comedy. But as I thought about it, I realized that the majority of famous stand-up comedians are male, and here I’m talking about stand-up as a genre rather than stand-up as a gateway to acting in comedy. When Amanda Seales went on CNN to slam some dude about catcalling, I looked up her YouTube channel and watched her hilarious stand-up. And that’s the only female stand-up comedian I think I’ve ever watched. Ever. Maybe this isn’t telling because I don’t really watch stand-up ever, but when I think about stand-up, Brian Regan, Louis C.K., and Dane Cook come to mind, rather than Margaret Cho (bless her) or Sarah Silverman.

I know I talk about female equality a lot in my blogs, but it’s only because I’m passionate about it and because I see women underrepresented in the arts. Like I said, I’ve never wanted to be a comedian, but I have huge respect for them, especially the ladies of SNL (you kill it, Leslie Jones), so seeing a young comedian like Colleen makes me so incredibly happy. It’s also interesting that she isn’t doing stand-up (though that could be part of her live show lineup), and to me, her character work would shine somewhere like SNL. However, for now, I think she’s happy with YouTube.

 

Epeolatry

Babblative adj. tending to babble, prattle; loquacious.

Words are so fascinating. I think we often take for granted the sheer number of words that exist in the world – there are over 1 million words in the English language alone, and an estimated 7,000 languages in the world. Many are oddly specific – if you’re ever looking for a word to describe something relating to or resembling a hedgehog, just slip the word erinaceous into your sentence. Ever feel so sick that you have a manic urge to dance? Me neither, but apparently it’s called tarantism, and it was very popular in the 15th century.

How about words in other languages that get even more specific than English words? The Georgian word shemomedjamo describes that phenomenon of accidentally eating a whole dish (i.e., that pie your Aunt Jan brought to Thanksgiving dinner that you polished off all by yourself while your sister started in on the dishes). It’s that experience when you’re so full, but the food is so good, and before you know it there is nothing left.

If you need any more convincing that words are pretty weird but also incredibly interesting, here is a list of some of the words I found scanning the internet. I challenge you to use one of these in a sentence, and take note of the bewildered looks you get when you do.

Abecedarian: of or relating to the alphabet, alphabetically arranged

Sobriquet: a descriptive name or epithet, a nickname

Foofaraw: frills and fancy finery; a disturbance or to-do over a trifle

Embrangle: to mix up in confusion; to make complicated; to bewilder

Prolegomenon: an introductory discourse, especially a formal essay introducing a work of considerable length or complexity

Kaelling (Danish): a woman who stands at her doorstep yelling obscenities at kids

Jung (Korean): a feeling stronger than love that is only proven through surviving a difficult argument

Ohrwurm (German): literally “earworm”; a catchy tune that gets stuck in your head

Verbivore: lover of words

Epeolatry: the worship of words

If you want to read more about cool words, check out this article about cool words, this article about words about words, and this blog entirely dedicated to language!

The Peter Pan Complex

As some of my previous posts have alluded to, I have a really hard time letting go of more “childish things.” I love the bright colors and simplistic shapes of cartoons, I love the games on the backs of cereal boxes, I still eat pancakes with my hands (I swear the rip/dunk method provides the most precise pancake-to-syrup ratio), I still value many of my stuffed animals as thinking/talking friends, I mourned the day when the people at the dentist’s office stopped letting me pick out a toy, and I’ll never lose my fascination and awe over train sets and bubble wrap. In a lot of ways, I’ve always felt the pressure to give these things up and embrace the rationality and convention of adulthood. I’d hide my stuffed animals under my pillow in my dorm, and I’d pick up my fork and knife to eat my pancakes when out to breakfast like a “normal” person.

There’s just one problem: I’m not normal. None of us are really “normal.” We all look back on some of the crazy things we did as kids and laugh at our naivety without allowing ourselves to recognize that that kind of play was some of the most fun we’ve ever had. Letting go of childish things is necessary in many aspects of adult life, but in many ways it stifles the imagination and hinders creativity. When I was a kid, I made up a game with my best friend where we collected different colored beads and took care of them like pets. I mean, come on, what adult person do you know who would ever see a bead as anything other than a bead?

I think we need to stop stigmatizing child-likeness in adults. Adults need to learn to play and dream and love like children, and the only way they can do that is by allowing themselves to act like children from time to time. Peter Pan was way ahead of his time when he warned us of a time when we’d stop seeing the magic. However, I don’t think we have to stay young forever to do this; we just have to allow ourselves to see the world the way we used to — boundless, wonderful, and full of possibilities.

Homo Ludens

Johan Huizinga coined the term “homo ludens” in his book of the same title in 1938. Homo ludens, the “playing man,” frames play as a necessary element of culture, tracing humanity’s instinct for play and its applications throughout human (and pre-human) history. Ludic design–the creation of playful systems and interactions–draws its origins from Huizinga’s work, and the recent gamification craze in digital culture has made this approach widespread. While learning games, such as those developed by LeapFrog Technologies, have long been popular among children, ludic design has also been incorporated into many adult-centric systems. Since all humans have a natural affinity for play, this design strategy can have a broad impact.

Dozens of industries have adopted gamification to achieve their goals–from fitness programs to the U.S. Army. Jillian Michaels, for instance, incorporates goal-tracking and motivation to give people a fun means of getting in shape. To recruit young men, the U.S. Army uses video game interfaces similar to those in popular first-person shooters. Countless systems award digital badges to highlight accomplishments or provide rewards, both within the game system and in real life. Most of these use cases have succeeded by capitalizing on man’s playful nature. But ludic design isn’t bulletproof.

What if nobody wants to play your game?

Gamification has its problems, like anything else, but its widespread fame has led many people to use it incorrectly. When games get too complicated, they lose their appeal to a general audience. When they carry too many ulterior motives, they are no longer fun. Countless implementations of ludic design have gone down this path. A good ludic design must stay true to its core principle: playfulness. When too many items are added to the agenda, the game becomes heavy and bloated. Heavy and bloated things don’t play well–usually they’re too tired to move. Light and playful things, though, are plenty active.

When it keeps the play in games, ludic design can be successful. The gamut of web and mobile applications demonstrates this. But homo ludens should not be limited to screens, either. For the ascreenual pixelphobes out there, gamified physical or social systems can transform mundane tasks into enjoyable playtime. Looked at broadly, much of gamification’s scope has yet to be explored. Countless industries are in need of fun and artistic innovation. Providing outlets for people to indulge their inner child can create value on multiple fronts: economy, happiness, and discovery. Homo ludens can be engaged more–we just have to bring him out to play.