November 23, 2015

Unexpected parallels between famously troubled musicians and 19th century Russian novelists (Part 2/4)


Aleksander Pushkin (1799-1837); Lana Del Rey (1985-)


Lana Del Rey’s attitude towards her own death and the association between ‘living fast’ and ‘dying young’ is well documented in both her song lyrics and personal interviews. In an interview with The Guardian in June 2014 she stated, “I wish I was dead already,” and then confirmed that she thought of early death as glamorous. She somehow associated this foreboding feeling with her strained but inextricable marriage to her career. “I do [wish I was dead]. I don’t want to have to keep doing this but I am,” she said. In short, Del Rey has intimated that she sees an early death in her own future, largely in light of the empathy she feels toward other musicians who tragically died young. This dismal attitude is central to her personal brand and fits the lifestyle she advertises, which is all about paradoxically seeking freedom through self-destructive enslavement. And yet she finds inadequate release in the music she cannot help but make, and so wishes for a greater extrication from reality through some aspect of the reckless lifestyle she apparently leads, the type of lifestyle that also tends to accompany young, glamorised deaths.

Pushkin, in a way, also prophesied his own death in a textbook example of how ‘life imitates art.’ In Pushkin’s novel, Yevgeniy Onegin, Lensky, a romantic young poet, is killed in the duel that he challenges Yevgeniy to after Yevgeniy’s public flirtation with Lensky’s beloved Olga, a woman whose feelings for Lensky run much shallower than his do for her. Under eerily similar circumstances, Pushkin was also mortally wounded in a duel over his straying wife, Natalya Goncharova, a great beauty who became a figure of scandal due to her alleged affair with Georges d’Anthès, who would later become her brother-in-law. Ridiculed by high society as a cuckold, Pushkin challenged d’Anthès to a duel, which he lost. In the aftermath, while Pushkin lay dying for two days, he reportedly recalled a premonition he had had about the number six in relation to his death: the tragic duel that bore so much resemblance to his own takes place in chapter six of Yevgeniy Onegin. And while the tendency to conflate fact and fiction, self-mythology included, runs deep through the veins of Russian literary studies, the circumstances surrounding Lensky’s and Pushkin’s deaths are indeed uncannily alike, regardless of whether the report of Pushkin’s last words is true.

Next (part 3/4): Amy Winehouse and Mikhail Lermontov



SHARE THIS POST ON: Twitter | Facebook | Google + | Pinterest
November 20, 2015

Unexpected parallels between famously troubled musicians and 19th century Russian novelists (Part 1/4)


Lana Del Rey


What makes a martyr? According to Merriam-Webster, it’s “a person who sacrifices something of great value and especially life itself for the sake of principle.” And while religious martyrs are perhaps the first kind that comes to mind when we think of this word, significant consideration can also be lent to artists of the past: the ones famously labelled as tortured geniuses of practically mythic proportions, extraordinarily talented libertines fuelled by a greater cause, who suffered both from bearing the burden of their talent and from the inability to save themselves from the destruction that their genius pulled them towards.

In the next three parts of this series, I’ll explore the various similarities between three Russian literary giants and three musicians who could be construed, in some ways, as their modern-day counterparts: Lana Del Rey, Amy Winehouse, and Layne Staley, lead singer of the band Alice in Chains.

The three aforementioned writers are considered the founding fathers of Russian literature: Aleksander Pushkin (1799-1837), Mikhail Lermontov (1814-1841), and Nikolai Gogol (1809-1852). To understand how mentally fraught these writers were, one must first understand the history of Russian literature. It’s remarkably brief: it only entered its own national and global consciousness about two hundred years ago. Before that, Russian intellectuals wrote in French, German, English, or other European languages, in large part because they were schooled in those languages, spoke several of them fluently, and could connect with other great thinkers of the Western world that way. However, after the expulsion of Napoleon (1812-1815) and the concomitant emergence of Russia’s great poet, Pushkin, the rise of Russian literature was unprecedentedly rapid and intense.

The burden that therefore fell on the first writers of that new literary tradition was a heavy one. They somehow had to articulate that which distinguished their literary heritage as rooted in Russian songs, prayers, and folklore, and combine it with borrowings from their up-to-date knowledge of other European literary traditions, colloquial Russian, and Church Slavonic. In short, they were responsible for the creation of their national literature. The great weight of this immense undertaking, in conjunction with their intense personalities, gradually manifested itself in the personal struggles that Pushkin, Lermontov, and Gogol each faced, which eventually led to their deaths: both Pushkin and Lermontov were killed in dramatic duels, while Gogol starved himself to death.

The hauntingly strange part, however, is that the ways in which they died bear striking resemblances to the deaths and attitudes towards death of Del Rey, Winehouse, and Staley. And while the term ‘genius’ is a relative one that may not apply to all the figures discussed here, the duty of carrying on the sacred flame of literature or music is what unifies them and provides the lens through which we will conduct our examination.

Next (part 2/4): Lana Del Rey and Aleksander Pushkin



November 4, 2015

Book Review


After reading Ann Reavis’ Italian Food Rules, a book that aims to explain many of the differences between Italian-American cooking and authentic Italian cuisine, I was curious to see whether the ‘rules’ presented in the book matched up with those of an actual Italian person who’s been well-exposed to Italian-American food. Enter Mario.

Mario Merone is originally from Ariano Irpino in the southern Italian region of Campania and has worked as a playwright, actor, director, and drama teacher all over Italy and other parts of Europe. In 2013 he moved to New York to pursue his dream of joining an American theatre company. Like many actors, Mario works part-time as a waiter—in his case at Piccolo Café, an Italian-American café and restaurant with a location near my apartment, which I like to frequent especially when I’m trying to get some reading done. Its tiny, narrow interior limits the number of customers, which makes for a quiet and intimate atmosphere, and the wait staff has always been kind enough to charge my phone and seat me next to the heater in the winter.

With his first-hand knowledge of culinary interpretations in both Italy and the U.S., Mario very kindly accepted my request to evaluate some of the ‘rules’ listed in Reavis’ book and either confirm or correct them. He credits his grandmother for first introducing him to Italian cooking and inciting his culinary interests thereafter:

I love food and I love to cook. My culinary experience began at my grandmother’s house in 1980. Even though I was only a year old, I started to taste the delicious food made by Nonna Carolina, and day by day, she taught me the secrets of what tasted good.

1.) Italians only drink tea when they are sick (they much prefer coffee). 

False. Coffee is the number one choice, but a lot of people also drink tea even when they’re not sick.

2.) Don’t dip bread in olive oil. Bread with olive oil and balsamic vinegar is not served automatically in Italian restaurants. 

Partially false. In Italy, bread alone is served automatically in restaurants, but without the olive oil and balsamic vinegar. However, they can provide those condiments upon request, and then you can dip your bread into them.

3.) No pizza for lunch. Pizza is meant to be eaten for dinner, with friends (as opposed to a family outing), after 9 PM, and at a pizzeria. One should also eat pizza with a knife and fork, unless you are a guy and in Naples.

False. You can have pizza for lunch and/or for dinner with your family or friends, at any time (not just after 9 PM). Usually, pizza is served already cut into slices so that you can use your hands and enjoy the delicious taste. But you are also welcome to use a knife and fork if you’d like!

4.) Don’t put ice cubes in beverages. This is because Italians believe that icy cold beverages are bad for your digestion. They can even cause congestione, an abdominal cramp, that can kill you. Italy is a land of simple drinks–wine, beer, water, none of which require ice.

False, mainly for the wrong reason. Putting ice in beverages isn’t common in Italy, but the reason is because ice changes the real flavour of the beverages. People like to keep the taste undiluted.

5.) Spaghetti is not served with meatballs. That is an American thing.

True. “Spaghetti with meatballs??? Are you kidding me??” That was my first reaction…

6.) Don’t eat eggs in the morning. Eggs are not part of an Italian breakfast, as usually people opt for a cappuccino and a pastry. They might even stretch to some fruit or yoghurt. But no eggs.


7.) Italians don’t drink orange juice in the morning because they believe it is too acidic and bad for your digestion. 

False. We drink fresh orange juice in the morning. I’m one of them.

8.) Bistecca alla Fiorentina should only be eaten rare.

Following the tradition, yes, but it’s up to you. There is no prohibition to ask for Bistecca alla Fiorentina cooked ‘medium.’

9.) Different foods are meant to be served on different dishes. For instance, a dish of potatoes is served separately from a dish of meat. The different foods therefore don’t touch each other. 

False, because it depends on the food combination. For example, you are not supposed to serve something like tomato sauce with potatoes on the same dish.

10.) Don’t use bottled “Italian salad dressing.” Real Italians dress their salad simply: olive oil, red wine vinegar or lemon juice, salt, and pepper are added sequentially to the salad. These ingredients are not shaken into a vinaigrette and added all at once. Vinaigrette is a French invention.


11.) Don’t dip biscotti in coffee. The only drink biscotti should be dipped into is Vin Santo or vino santo.

True. But you can dip biscotti into a cappuccino, latte, or caffè latte. By the way, I saw someone once who dipped biscotti into coffee, and he’s still alive…

12.) An actual panino is a small roll (not sliced bread) containing two to three ingredients. Most panini are not usually heated under a panino press, as they are in the U.S., and butter is never used on them either. 

False. We serve them using sliced bread, too. We also heat them under a panino press. But it’s true that we don’t use butter. Ever.

13.) Eating melons without prosciutto is considered somewhat dangerous to Italians. The logic is that if a ‘cold’ food like melon is eaten without a ‘hot,’ balancing food, like a salty meat, the body will be ‘chilled,’ which leads to the dreaded congestione.



For a complete list of Piccolo Café locations, see here


October 31, 2015

The Charm of the Grotesque (and Egregiously Tacky)



A few years ago, I had a bedazzled phone cover that looked a lot like the one above. Certain friends of mine with a predilection for frilly, fluffy, sparkly things would call it cute upon seeing it, but others found it both amusing and ugly, as I myself did. So why did I like it? Eventually, I came to the conclusion that its appeal lay in the very fact that it was so overtly tacky it actually managed to achieve some kind of unconventional charm.

It sounds strange at first, perhaps because it seems totally counter-intuitive to our aesthetic sensibilities, which we usually assume are drawn to the beautiful, symmetrical, and clean. But it is not so strange when we think about the appeal found in the art of the grotesque, a term coined in the 16th century that refers to styles of art incorporating elements of the bizarre, fantastic, ugly, uncomfortable, and horrific. First revived during the Renaissance by the school of Raphael in Rome after a landmark excavation of grottoes in which such decorations were found, the grotesque style quickly became popular in decorative art and architecture throughout Europe and remained so until the 19th century.

But how did those who found it appealing explain its appeal? Like several terms that originated in the visual arts before migrating to other branches of the humanities, ‘grotesque’ received one of its first critical treatments in literature: specifically, in the Essays of Michel de Montaigne (1533-1592), who used it to denote a burgeoning genre. Montaigne described the genre as closely related to satire and tragicomedy because of its ability to effectively communicate grief, pain, and comedy all at once. Thomas Mann (1875-1955) later concurred with this feature of simultaneous discomfort and delight, calling the art of the grotesque a “genuine anti-bourgeois style” in Past Masters and Other Papers (tr. H.T. Lowe-Porter). Mann’s view was embedded in his broader opinions on modern art, the key characteristic of which he regarded as the refusal to acknowledge tragedy and comedy any further because “[Modern art] sees life as tragicomedy, with the result that the grotesque is its most genuine style.”

The funny thing is, though, that because of this tragicomic trait, the grotesque could very well relate to what we deem cute. As tragicomedy tends to elicit a feeling of sympathy towards the pathetic, it creates an imbalance of power in which we viewers have the upper hand, and according to cultural theorist Sianne Ngai in Our Aesthetic Categories: Cute, Zany, Interesting (2012), so do cute things, which evoke “helplessness, pitifulness, and even despondency” out of the fact that they are “small, helpless, [and] deformed object[s].”

Arguably, then, the reason why things like my egregiously tacky phone case are somehow charming is indeed because they are cute— oddly enough by way of the pity they inspire in us, much like the art of the grotesque. Who would have thought that my giggly friends with a penchant for stuffed animals were right all along?


October 19, 2015

Jelly/Jam Regulations in Germany


As I am writing this I’m on the way to southwest Germany, where I hope to get my fill of Baroque architecture and scrumptious desserts. So, in anticipation of the gastronomic delights that I fully expect to encounter, here is an interesting piece of remote German food trivia that I came across during my German culinary ‘research.’

In Germany, jellied spreads are subject to regulations that define exactly what those spreads may be called. The rules stem from the Konfitürenverordnung (Jam/Jelly Regulation) of 1982, the German regulation implementing an EU council directive. This, along with its later amendments, sets standards for jams, jellies, marmalades, and sweetened chestnut purées.

To expound: jams in Germany largely fall into two categories: Marmelade* (commonly translated as ‘jam/jelly’) and Konfitüre. The difference is that a spread must contain at least 20% citrus fruit in order for it to be considered ‘Marmelade.’ Everything else must be called a ‘Konfitüre,’ which must contain a minimum of 35% non-citrus fruit. If the non-citrus fruit content exceeds 45% of the spread, it may then be labelled ‘Konfitüre extra.’
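For illustration, those naming thresholds can be sketched as a tiny classifier. This is a hypothetical simplification of my own: the function name and inputs are invented, and the actual regulation contains many more categories and fruit-specific exceptions.

```python
def classify_spread(citrus_pct: float, non_citrus_pct: float) -> str:
    """Rough sketch of the German spread-naming thresholds described above.

    Hypothetical simplification; the real Konfitürenverordnung has
    many more cases. ("Marmelade" is the German spelling.)
    """
    if citrus_pct >= 20:
        return "Marmelade"        # at least 20% citrus fruit
    if non_citrus_pct > 45:
        return "Konfitüre extra"  # more than 45% non-citrus fruit
    if non_citrus_pct >= 35:
        return "Konfitüre"        # minimum 35% non-citrus fruit
    return "unclassified"         # below the minimum fruit content

print(classify_spread(25, 0))   # Marmelade
print(classify_spread(0, 50))   # Konfitüre extra
print(classify_spread(0, 40))   # Konfitüre
```

The point of the sketch is simply that the legal name follows mechanically from the fruit percentages, citrus first.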

And yet there are even more rules than that. In the case of certain fruits, there are limitations on how much fruit can comprise the content of the spread. Quince or currant Konfitüren, for example, must contain no more than 35% and 25% fruit, respectively. I wonder what dire consequences of adding more fruit the EU council had in mind when they set that restriction.

The funny thing is that even though ‘Marmelade’ may seem like the more foreign word in German to English speakers, because a cognate exists in English while ‘Konfitüre’ does not, ‘Marmelade’ is arguably the more ‘German’ word, having existed in German for about 200 years longer than ‘Konfitüre.’ Indeed, prior to 1982, ‘Marmelade’ was much more prevalently used than it is now, as it was taken to mean any jam with whole or large pieces of fruit suspended in jelly. After the regulation was introduced, however, ‘Konfitüre,’ the word adopted from French, took over in prevalence.

Essentially, this regulation changed the meaning of the German ‘Marmelade’ to match that of the British ‘marmalade,’ for which citrus fruit is integral. However, it is not uncommon to find Marmelade containing non-citrus fruit in German grocery stores, especially at farmers’ markets and in organic food sections. This is because the Konfitürenverordnung only applies to large manufacturers; everyone else is free to produce as much non-citrus fruit Marmelade as they please.

I wonder if the definitions of ‘jam’ and ‘jelly’ in English ever received such scrutiny from government organizations…


*Note: all nouns are capitalized in German.


October 15, 2015

play/movie review: Macbeth

Marta Spendowska. Abstract Lands: Rivers.

A very well-read friend of mine, who lives in London, went with me the other day to see the new Macbeth movie at a charming little Everyman Cinema on Baker Street (incidentally where Sherlock Holmes was supposed to live). It was a short, powerful movie on the whole, with beautiful cinematography, costumes, settings, and score. I also enjoyed the acting, although I think it could have done with slightly less strained whispering.

The movie made me think about certain ideas that hadn’t occurred to me when I last read the play in high school. Two, in particular, stand out:

  • Differing attitudes towards the concept of masculinity

While I had studied Lady Macbeth’s purposeful rejection of conventional femininity and her belief that it breeds weakness, I hadn’t yet thought about attitudes toward masculine gender roles. Not only does Lady Macbeth pressure her husband to kill King Duncan by questioning his manhood (“When you durst do it, then you were a man; / And to be more than what you were, you would / Be so much more the man.”), but she also wishes for the abandonment of her own womanhood in her quest for power (“Come, you spirits / That tend on mortal thoughts, unsex me here, / And fill me from the crown to the toe top-full / Of direst cruelty…”). She goes so far as to say that she would even kill her own child, against any innate maternal instincts, if it meant procuring a more masculine and therefore powerful position, because, to her, masculinity and capability are one and the same. Both husband and wife desperately cling to each other while drowning in a sea of their insecurities before, finally, Lady Macbeth’s suicide and Macbeth’s death at the hands of Macduff highlight the ultimate Shakespearean triumph of ‘true’ morality and masculinity. This is because Macduff, by contrast, is self-assured in the full knowledge of himself and his limits. When told to take the brutal murder of his wife and children “like a man,” Macduff replies, “I shall do so, / But I must also feel it as a man.” In stating this, he asserts that sensitivity of feeling is not necessarily linked to weakness of character. Even though he feels deeply, these emotions do not undermine his ability as a successful warrior and mentally strong hero of the play. If anything, they strengthen his resolve.

  • Was Macbeth suffering from PTSD (Post-Traumatic Stress Disorder)?

Never before had it occurred to me that Macbeth might have been especially impressionable at the beginning of the play because he was psychologically unsound at the time when his wife pressured him to kill King Duncan—the first murder that snowballed toward others and, eventually, Macbeth’s own undoing. In fact, the reason he is so easily able to assassinate Duncan in the first place is that Duncan has come to stay at Macbeth’s castle to personally thank him for his bravery in the recent battle, which was fought on Duncan’s behalf. Still freshly traumatized by all the carnage, Macbeth exhibits a mentally unstable disposition throughout the play, even at times when he should theoretically be at ease, with no murder urgently preying on his mind. The banquet hall scene following his coronation is one such example. Although he has just been given the worrying news that Fleance, Banquo’s young son, managed to escape while Banquo was being killed, Macbeth’s ensuing frenzy at the sight of Banquo’s ghost, in front of all his guests, can hardly be seen as a normal reaction. After all, the last thing a king would want to do is show his subjects how unhinged and unfit to rule he is. Lady Macbeth, in response, tries to quell the situation by explaining to their guests that her husband often has these fits and that this is completely typical of him: “Sit, worthy friends: my lord is often thus…Feed, and regard him not.”

It was only when I saw this scene acted out that I started to wonder whether Macbeth’s madness can really be considered the same type as his wife’s—namely, the lunacy that comes from being unable to overcome vast guilt. While there is ample evidence to show how he, like his wife, cannot reconcile himself with having so much blood on his hands, I think that his madness is exacerbated by an untreated, underlying layer of trauma, which shows in the behavioural symptoms of PTSD that he exhibits but is absent from his wife’s cold-blooded attitude towards murder. Symptoms shown by war veterans often include paranoia and irrational panic, as though they are constantly in danger, as well as depressed moments in which they seek isolation. Alongside his outburst in the banquet hall, Macbeth also has times when he withdraws from everyone, thereby showing both symptoms just mentioned. In Act 3, Scene 2, for example, Lady Macbeth finds him brooding by himself and asks, “How now, my lord! Why do you keep alone[?]”

All in all, I would see the movie again and highly recommend you see it, too!


For more on the locations of Everyman Cinemas, see


October 11, 2015

A history of the colour purple

“Infanta Style.” Paolo Roversi for Vogue Italia (1997).

Having been in the land of the well-loved British royal family for a few days now, I’ve been thinking lately about the history of the colour purple and its widely known association with royalty. Rulers ranging from Roman emperors to Queen Elizabeth I apparently passed sumptuary laws prohibiting anyone but royalty from wearing the colour. But why was it considered so special?

The reason had much to do with its exclusivity. For centuries, the ancient Phoenician city of Tyre held a reputation as the sole producer of “Tyrian purple,” owing to a locally found species of sea snail (now known as Bolinus brandaris) that was used to make the precious dye. The process of harvesting the dye was extremely laborious: manufacturers first had to crack the snail shells open and extract a purple-pigmented mucus before exposing it to sunlight for a carefully set amount of time. Yielding just one ounce of usable dye required about 250,000 snails.

That’s a lot of snails. In fact, if you consider that a single standard serving of Escargots à la Bourguignonne is commonly served on a plate with six grooves for six snails, the number of snails needed to produce one ounce of purple dye would also have been enough to feed about 41,667 people a restaurant serving of escargots each.
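That estimate is just a simple division, which can be checked in a couple of lines (the 250,000-snail figure is the commonly cited estimate from above, not a precise measurement):

```python
# Snails needed for one ounce of Tyrian purple dye,
# converted into six-snail servings of escargots.
snails_per_ounce = 250_000
snails_per_serving = 6  # grooves on a standard escargot plate

servings = snails_per_ounce / snails_per_serving
print(round(servings))  # 41667
```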

In words often attributed to Plato, “The measure of a man is what he does with power,” and here, the extent of that power even affected the realm of molluscs. Imagine that.


October 2, 2015

Technology-assisted dating in the 1800s and 2010s

Recently, I read two books that address mankind’s experience of finding romance through new technologies, one instance from the 1800s and another from today: Tom Standage’s The Victorian Internet (2014) and Aziz Ansari’s Modern Romance (2015).

The bulk of Ansari’s book is focused squarely on how dating in today’s digital age is markedly different from the past, when one was forced to date in person and/or through communication formats that required a more physical, personal presence. Texting has become such an essential communication tool in dating that we have now developed two selves: a “phone self” and a “real-world self,” whereby our phone selves are defined by the perceptions that others form of us based on the texts we send them.

His main critique of this medium is that “texting facilitates flakiness and rudeness and many other personality traits that would not be expressed in a phone call or an in-person interaction.” And because of this ability to communicate with very little accountability or linkage to our real-world selves, people capitalize on the fact that they can get away with bad behaviour so easily. As such, they become much more uninhibited about the way that they express themselves, which often leads to results that come off as inconsiderate or hurtful. This is especially true when paired with the fact that conciseness as a form of casualness is highly prized in texting and therefore an added pressure that encourages us to exclude explanations and present ourselves in ways that we otherwise might not, verbally or in-person, in order to preserve our self-images.

However, in light of Standage’s The Victorian Internet, bad behaviour exacerbated by a protective layer of impersonality is not an entirely novel happenstance in the history of human interaction through technology. The technology he refers to is the electric telegraph, the precursor to today’s Internet, which connected the globe and “unleashed the greatest revolution in communications since the development of the printing press.” Throughout his book, Standage explores the range of wrongdoings that accompanied the advent of the telegraph and cumulatively views them as a demonstration of society’s tendency to blame problems that are facilitated by new technologies on the technologies themselves.

His chapter entitled, “Love over the Wires,” is devoted to examples of romances that were made possible by the telegraph. These romances ranged from inconsequential flirtations between people who had never met, to long-distance dating, to marriages. As Standage writes, “Spies and criminals are invariably among the first to take advantage of new modes of communication. But lovers are never far behind.” Within the first few months of the telegraph being opened to the public, “on-line wedding[s]” took place, and strict regulations between female and male operators were enforced to keep flirtations at bay. “Romances of the Telegraph,” an article published in Western Electrician in 1891, detailed numerous examples of couples who met each other over telegraph lines and sustained their relationships that way. Predictably enough, the varied results of such unions were strikingly similar to the varied results of relationships that happen over text today. Both flaky and considerate people who dated using fast communication tools existed in the Victorian age, just as they do now.

It makes me wonder to what degree the human experience of courtship has changed, irrespective of what new technologies we’ve employed to assist it. Although the world is infinitely more connected now, and there are many more options as a result, the concept of ‘The Game’ has always existed: the vacillation between expressing our true feelings and hiding them in order to uphold the self-images that we’ve created and would like others to see. We humans are proud. We do not like being vulnerable or losing, so we play games with each other. The only difference is that now we are able to play as many games as we want, with as many people as we want, whenever we want. This was not the case in the Victorian age. It seems that the dating process is far more exhausting today, largely because of the burden of immense choice.

On a more positive note, here is some pertinent advice from the Dalai Lama, whose insightful wisdom even on popular culture has impressed me ever since he remarked that Hollywood is “very bad for [his] eyes and a waste of time.” Concerning technology and our livelihoods’ dependence on it, he recommends the following: “I think technology has really increased human ability. But technology cannot produce compassion. We are the controller of the technology. If we become a slave of technology, then [that’s] not good.”

And that’s why he’s the Dalai Lama.


October 1, 2015

Isn’t it kind of narcissistic to write a blog about yourself?

Jean-Marie Bonnassieux. La Modestie (1846).


Isn’t it kind of narcissistic to write a blog about yourself?
Such was the question I confronted over and over before starting this blog. Like many people, I was conditioned from an early age to believe in an intrinsic bond between modesty, morality, and professionalism. Now, however, with the widely documented success of professional bloggers who earn a living off of ‘just being themselves,’ it appears that the notion of self-promotion, especially with regard to one’s professional image, has changed. Being your own brand is now encouraged. Whereas members of my grandfather’s generation believed that ‘what you do defines who you are,’ the commonly encouraged belief now is that ‘who you are defines what you do.’ And while it’s easy to dismiss this as the result of an ever-growing cult of individuality, I wonder to what extent the public presentation of one’s personal self can be rationalised as morally praiseworthy, acceptable, or questionable.

The Booker Prize-winning novelist and art critic John Berger observed a similar phenomenon of disseminating private opinions for public consumption in Ways of Seeing, his seminal 1972 documentary and companion text of critical essays on Western cultural aesthetics, albeit largely in the context of the mass reproduction of images rather than the mass distribution of personal information via the internet. In the final paragraph of his last essay, which concerns issues of glamour and publicity, he writes: “Publicity is the life of this culture—in so far as without publicity capitalism could not survive…Capitalism survives by forcing the majority, whom it exploits, to define their own interests as narrowly as possible.”

A key characteristic of capitalism has always been the emphasis on individual gain rather than communal progress, and here, Berger presents the idea of using publicity to gain an advantage over the rest of society by editing ourselves into exclusive forms for public consumption. But beyond the issue of capitalism, what actually catches my attention most is how Berger believes that people are now forced to “define their interests as narrowly as possible.”

It’s an interesting thought because it resonates with ideas presented in a book published forty-three years later, in May 2015, by Aliza Licht, about using self-promotion to build one’s own brand and, in doing so, creating a niche market for which there is only one product available: ourselves. In Leave Your Mark, she encourages readers to reflect on the key characteristics that make them unique in order to present that edited image to the world. This advice operates on the assumption that people are “innately judgemental” and view each other in “one-sentence description[s]” anyway, so we might as well play an active role in creating that image for our own benefit. In short, she promotes the reduction of our complex selves into their best, most essential forms in order to be comprehended faster and more easily by the public. Exclusivity and its usage for upward mobility—in this regard, doesn’t building one’s own brand seem in line with the selfish mentality that Berger is wary of after all?

Perhaps. And perhaps I am playing into that game, whether I like it or not. But the desire to share our thoughts with others comprises an innate part of our sociality as humans, and it is hard for me to see anything wrong with that. As Kurt Vonnegut said, “Practice any art…fiction, essays, reportage, no matter how well or badly, not to get money or fame, but to experience becoming, to find out what’s inside you, to make your soul grow.” And as the internet is but one tool to share our thoughts with others, I see this endeavour as a way to practice collecting my thoughts and writing—something I do professionally, but here on a more casual platform and with a wider audience given the wider range of content.

And that is all I have to say about the conception of this blog. I hope you enjoy it!


