January 17, 2011
Warning: The preamble is quite a few paragraphs. I will address technology and learning, but you have to survive my personal anecdote first.
For the second year running, I have returned home for the holiday season, this particular visit a good deal longer than the last. During last year’s visit, I commented both through this blog and in-person to my friends on how stress-relieving the experience of returning home was. From a postgraduate program’s (probably fairly light) workload to more difficult issues of personal relationships, I utilized my time at home as a period of safe exploration and evaluation. Without worry of money, without need to patronize friendships (if you know what I mean), without distraction of having to do something: every element was aligned to provide a calm, introspective environment to think within. Here’s what I said then:
“I appeased my eyelids and sated my deprivation, sleeping in my balmy bed ten hours at a time. Brewing cups of coffee for Ida while poring over the Register Star’s crossword on Tuesdays and Thursdays retained the summertime’s sublimity. Sitting on the back porch and watching the snow fall with stars still visible, cigarette or tea in hand, sedated the disquiet of my mind or otherwise. I adore home (despite rumor of the opposite). I claim this quotidian registry, this traipsing through time, as my own. [What a sexy phrase!]
“The sense of a coming home feels so strong to me. The universe placates me when I am here and enacts a masterful performance in which it corrects itself. I am calm at home. I am rational at home. I am carefree at home.”
Oh, how things have changed this year on this trip home. That “coming home” feels strong still, but I am not calm. Nor am I rational. Nor am I carefree.
I am anxious and aggressive, occasionally depressed(ish). This juncture is not familiar, so I must understand. And I never thought I would say it, but a lot of these feelings I blame on the currently-in-the-spotlight rhetoric seen on American television. I really do, though my full explanation will not attempt to defame further the media/rhetoric/Palin/Beck/etc.
Perhaps I can lay blame with the actual duration of my visit, which is more than double its predecessor’s length at nearly two months. Prior to coming home, I was abroad for eleven months. Eleven months: where international politics inspire a lesser interest due to unfamiliarity, American glitz and flashbulbs seem far away, and newscasts are all controlled experiences (within a frame or internet window I pursue). My trip last Christmas lasted too short a time and involved too much catching up with friends to allow me to become entrenched in this “media storm.” With this trip spread across nearly two months, the time in front of or in proximity to this rhetoric is extended. Even before last Christmas, I was at NYU and happily out of touch with current politics (or, rather, as involved as I chose to be). My point being that I have not encountered this brand of American culture in four years now.
And, to be fair, “media” has become something of a derogatory term, and I have no intention of making it seem more so. Especially since I consider my real issue here to be the way we collect, distribute, and (are forced to) perceive information, news, etc. Everything feels newly aggressive, similar to the way a recently famous celebrity is pursued by tabloid fodder and E! channel reporting, I would imagine. Content is too quickly changed, stories are too soon forgotten, and my life suddenly seems crushingly crowded. It is all so invasive. I cannot find a moment where I can disengage the machine of images, stories, ads, and opinions: I swear, Ray Bradbury is likely giggling to himself right now, while Orson Welles cracks a smile in his grave! Technology is killing my mood. There, I said it.
But, and here’s what I really wanted to get to, there may be something to the thought that we are being punished by our own media technology and format. In a thought-provoking BBC News op-ed by Alain de Botton, entitled A Point of View: Does more information mean we know less?, we are thrown into the debate right at the start: “We pay a price for all the information we consume these days – and it’s knowing less.” What an eristically beautiful start! The problem is twofold, de Botton implies, or at least I think he does. First, the epidermis: the vast supply of readily available information. Second, the dermis: our psychological struggle to cope with our perceived lack of knowledge.
By addressing the epidermis, I simply point to those oft-criticized-but-always-utilized social media outlets, our manufactured need to possess our friends through their activities. I never needed to know that you were very hungover after the office party so much so that you couldn’t brush your teeth without vomiting. But now I do, and I actively can be a better friend with that knowledge. As de Botton writes:
“There is a deeper issue at stake – the feeling, so rife in modern secular culture, that we must constantly keep up with what is new. The obsession with current events is relentless. We are made to feel that at any point, somewhere on the globe, something may occur to sweep away old certainties. Something that if we failed to learn about it instantaneously, could leave us wholly unable to comprehend ourselves or our fellow human beings.”
Even more entertaining sites, such as the #5 website in the whole world, Wikipedia, provide so much information that suddenly seems relevant to my existence. Wiki-chaining or wiki-crack may sound unfamiliar, but they are activities we take part in weekly, but probably daily. Both terms describe, depending on the nature of your habits, the process of becoming lost in Wikipedia, going incidentally from one intriguing entry to another almost by accident. And yet, with every entry I pass by, I remember very little.
I would be the first to admit how much I adore my (wiki-)crack, which makes me more sure that de Botton may have been right on the money. In a stunningly astute observation, he writes, “In the end, all modern artists share something of the bathetic condition of chefs, for whereas their works may not themselves erode, the responses of their audiences will. We honour the power of culture, but rarely admit with what scandalous ease we forget its individual monuments.” That anticlimax, in which the mind feels it possesses some kind of advanced knowledge that quickly dissolves into a sense of lack as time elapses, is my (probably not) unique feeling of the past few months. A part of my mind says that any time I have spent in heavy academic work has produced the same feelings of stupidity or ignorance. But there is something more sinister and aggressive, more intrusive in the manner of this occasion’s ignorance. As I mentioned, I do not control the functions but am led toward the guillotine on a path laid in fiber-optic cables.
And that, no doubt, is founded upon the dermis, our psychological struggle. Yet this is where de Botton and I disagree. Comparing our information age to the experience of being a landowning medieval citizen, de Botton waxes philosophic on a time of quiet reflection and hermetically-learned universal truths. He writes:
“We occasionally sense the nature of our loss at the end of an evening, as we finally silence the TV after watching a report on the opening of a new railway or the tetchy conclusion to a debate over immigration.
“It is then we might realise that – in attempting to follow the narrative of man’s ambitious progress towards a state of technological and political perfection – we have sacrificed an opportunity to remind ourselves of eternal, quieter truths which we know about in theory, and forget to live by in practice.”
Rightly, one commentator called Allectus has noted that “In the age of Jesus genuinely new things could be decades apart. Life was harsh, full of suffering, superstition, ignorance, and often curtailed. By comparison to suggest our safe contemporary lives are diminished because we have the “option” of living with less reflective time is laughable.” Allectus uses de Botton’s appropriation and romanticizing to draw his own (more correct) conclusion.
de Botton drowns his belief in the price we pay for “promiscuous involvement with novelty” with sentimental, inappropriate commentary on the ineffable role of religious order, but does little more than confound his argument. “Matins have here been transubstantiated into the breakfast bulletin and Vespers into the evening report,” he says, suggesting that information blasts have eliminated the purposeful reflection and introspection of prayer and community. So too much information to get at those tricky little gems of hidden truth, right? He then goes on to say (and ruin his argument), “The news occupies in the secular sphere much the same position of authority that the liturgical calendar has in the religious one.” To me, this undercuts his explanation. If news has the authority of holy days and a resonance with secular people, it must be (as he argues) sufficiently drawn out and unchanging with the years. It would not oversaturate, therefore, the collective consciousness of the population; these news events would be purposeful, but intentionally lay.
Sometimes, I worry more about newscasts than I do about the full onslaught of .coms, newspapers, and magazines. Newscasts, and the other media outlets to some extent, fetishize particular events and imply that these are the things you should care about. There is little control in how newscasts function on a personal level: I cannot choose what news interests me, what I would like to hear more about, or how far-reaching the journalists’ coverage goes. Instead, I am given my daily dose of medicine as prescribed by media heads as to what will make me a cultured and informed citizen. Paring down my individual experience, the news emphasizes things I normally suppress, like shocking tragedies and random I-can’t-believe-they-even-use-Facebook-on-a-newscast Facebook opinions, which I often disagree with and which just make me angry.
In any case, de Botton founds his ideas in the psychology of rarity, which seems out of touch and faulty. “We are often urged to celebrate not only that there are so many books to hand, but also that they are so inexpensive. Yet neither of these circumstances should necessarily be deemed unambiguous advantages.” Really? Do tell. “We feel guilty for all that we have not yet read, but overlook how much better read we already are than St Augustine or Dante, thereby ignoring that our problem lies squarely with our manner of absorption rather than with the extent of our consumption.” With what I have said above regarding my wiki-crack, I think his point is somewhat valid on this matter. As such, I read up a bit on intelligence and our achievements in this age, weighed down though we are by information…apparently. In another BBC News article by David Shenk, called Is there a genius in all of us?, he notes a University of Otago study showing how “IQ scores themselves have steadily risen over the century – which, after careful analysis, he ascribes to increased cultural sophistication. In other words, we’ve all gotten smarter as our culture has sharpened us.” Genius, the article finds, or any other form of talent is not innate, but learned. Fascinating! Shenk writes:
“A century ago, geneticists saw genes as robot actors, always uttering the same lines in exactly the same way, and much of the public is still stuck with this old idea. In recent years, though, scientists have seen a dramatic upgrade in their understanding of heredity.
“They now know that genes interact with their surroundings, getting turned on and off all the time. In effect, the same genes have different effects depending on who they are talking to.
“All of these abilities are dependent on a slow, incremental process which various micro-cultures have figured out how to improve. Until recently, the nature of this improvement was merely intuitive and all but invisible to scientists and other observers.”
These facts put together stand to topple de Botton’s assertion (and my half-belief) that we are overpowered by our access to information. The study shows how surprisingly adaptable the post-post-modern man can be, suggesting we can absorb information successfully, even more successfully than our revered predecessors with all that time on their hands. As Robert Sternberg states, “Intelligence represents a set of competencies in development.” We can cope. Science says so!
What I think de Botton really tries to get at is our great need to appreciate the little things again, to be able to recognize quality when it is there. And I fully agree with this. “We should stand to swap a few of our swiftly disintegrating paperbacks for volumes that would proclaim, through the weight and heft of their materials, the grace of their typography and the beauty of their illustrations, our desire for their contents to assume a permanent place in our hearts.” It is the recognition and esteem of things that must be adjusted, he pleads with us to comprehend: “The need to diet, well accepted in relation to food, should be brought to bear on our relation to knowledge, people, and ideas. Our minds, no less than our bodies, require periods of fasting.”
July 3, 2010
The chicken or the egg?: Are blondes born aggressive or did they have aggression thrust upon them?
Society’s worship of the all-powerful blonde bombshell seems to have manifested itself as a connected super-ego developed by all participants, blonde or bottle, throughout the world. In a study which (honestly) doesn’t seem entirely conclusive or far-reaching, researchers found that women with fair hair tended to display more hostility and a “warlike streak” when fighting battles to get their own way. Many believe that this bitchy type of behavior results from a feeling of deserving more, an expectation to be treated a certain way, leading some to argue that we place too great an emphasis on the beauty of blonde.
What is it about the blonde that so excites us?
In the 6th century, Pope Gregory the Great saw fair-haired/skinned slave boys in the markets of Rome. When told they were Angles, he replied “They are not Angles, but angels.” Connecting this appearance to divinity and virginal cleanliness no doubt caused some of the peroxide adoration that was to follow. It is interesting to me how, at that time, blonde represented the exotic other.
Blonde as foreign has some genetic credence, too. It is a rare gift to be blessed by the flaxen fairy. Incredibly so, it would seem, leading German scientists (of course) to speculate that natural blonde hair will be extinct by the year 2202. The blonde gene is a recessive trait; thus, a child may only have blonde hair if the gene is present on “both sides of the family in the grandparents’ generation.” And that is becoming tricky indeed.
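The recessive arithmetic behind that “both sides of the family” requirement can be sketched in a few lines. This is my own toy model, assuming a single gene with one dominant brown allele and one recessive blonde allele (real hair color is polygenic, so treat it as illustration, not genetics homework):

```python
import itertools

# Single-gene toy model (an assumption for illustration; real hair
# color is polygenic): 'B' = dominant brown allele, 'b' = recessive blonde.
# A child is blonde only with genotype 'bb' -- one 'b' inherited from each
# parent, which is why the gene must lurk on both sides of the family.

def offspring(parent1, parent2):
    """All equally likely child genotypes from two parent genotypes."""
    return [a + b for a, b in itertools.product(parent1, parent2)]

carriers = offspring("Bb", "Bb")               # two brown-haired carriers
blonde = carriers.count("bb") / len(carriers)
print(blonde)  # 0.25 -- a 1-in-4 chance per child
```

Two brown-haired carriers thus have a one-in-four chance per child, and if either grandparental line lacks the recessive allele entirely, the chance drops to zero. Hence the trickiness.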
But is that the appeal of the hair color: does the color’s rareness trigger our more primitive natures to network-mate-diversify and improve our genetic makeup? It sounds more like a Wall Street portfolio than natural processes. The blondes themselves have reason to feel empowered, then. They are the juiciest apples on the Darwinian tree.
But let me conclude with an interesting point. Reading up on Prader-Willi Syndrome, I find that PWS sufferers often have unusually blonde hair and blue eyes. The Syndrome itself, based on a chromosomal disorder, is noted for its insatiable obsessions with food and unpredictable rages. You see my assumption here: that a forceful compulsion to get what one wants may not be blamed on nurture and the way we view and treat blondes. Instead, I wonder where such behaviors are really rooted: genes or jeans?
Perhaps on the 10th anniversary of the sequencing of the first human genome, we can hope the science will reveal something enlightening and longer lasting than a semi-permanent dye.
July 3, 2010
Shall we ever be satisfied in our search for the Bard?
My search for the exact phrase “real Shakespeare” produced more than 18,000 results on Google. 18,000. While I recognize that the academic field of Elizabethan drama is both crowded and fraught, there are not that many bits of scholarship roaming around, begging to reexamine the author’s identity for, let’s be honest, a rather (wondrous, yet still very) small body of work. I suppose tracing these thousands of leads would be a revelation in comprehending the necessity of identification. But I have neither the patience nor the nerd-power required of such a task. Instead, and wisely I might add, I found that a great many of the source pages of these words (“real”/”Shakespeare”) shared a great deal in common. Ready for it?
Amazon.com; Half.com; Ebay.com. Authorial agency (or perhaps conspiracy would be better) is big business!
Coerced pen names, the scandal of noble obligations, the intriguing figures assigned authorship of the entirety of The Western (English) Canon. It has all the intrigue you’d expect of daytime soaps, sounding more like a lesbianic lovechild of Jackie Collins’ and Philippa Gregory’s novels than the conspicuously-coiffed image WE ALL KNOW TO BE THE TRUE BARD! [Those who know the field of Shakespeare portraiture I hope will appreciate that my carelessness in that last sentence is not out of malice or unfamiliarity with them. To be clear, I am entranced thoroughly by the Cobbe, hate the Chandos, and have mixed feelings about the engraved Droeshout, in spite of Jonson’s claim that it looked about right.]
Though it seems the Shakespeare we (actually do seem to) know wasn’t always so plain. That image-ideal we have stocked in our head has shifted since the turn of the 19th century, when perhaps the distinctions between the Bard and the bogus were slightly blurred. Smithsonian Magazine (which I have previously praised here heavily and will continue to do) has recently released a piece on a full-fledged Shakespeare hoax, which began in 1795. Quite an interesting story actually and one that I had never before heard. [Read the full article here].
Oddly enough, the story surrounding this centuries-old canard appears similarly soaked in soap-opera melodrama. Apparently, a young William-Henry Ireland, seeing his father’s obsession with the Bard’s masterpieces and history, set out to “discover” some previously undocumented pieces of Shakespeare memorabilia. In the playwright’s own hand. Let’s try to understand the thought process happening here with some examples.
- His father, Samuel, reportedly possessed a “silver-trimmed goblet carved from the wood of a mulberry tree that Shakespeare was said to have planted in Stratford-upon-Avon.”
- His father and his mistress (William-Henry’s mother, who lived in the household as a housekeeper under the name Mrs. Freeman) denied that William-Henry was their child.
- William-Henry was not the sharpest quill in the drawer.
Hence the daddy-drama begins.
“In a burst of manic energy in 1795, the young law clerk produced a torrent of Shakespearean fabrications: letters, poetry, drawings and, most daring of all, a play longer than most of the Bard’s known works. The forgeries were hastily done and forensically implausible, but most of the people who inspected them were blind to their flaws. Francis Webb, secretary of the College of Heralds—an organization known for its expertise in old documents—declared that the newly discovered play was obviously the work of William Shakespeare. “It either comes from his pen,” he wrote, “or from Heaven.”
Aside from the scant physical evidence, what could convince these experts of the authenticity of the documents? A good discovery story, of course. “He said he had found the deed while rummaging in an old trunk belonging to a Mr. H., a wealthy gentleman friend who wished to remain anonymous. Mr. H., he added, had no interest in old documents and told him to keep whatever he fancied.” How, um, convincing, I think.
Compiling more and more documents in Shakespeare’s own hand was time consuming, so he ambitiously promised his father a new play never before seen but recently found in the trunk. So, naturally, he saved time:
“The young man wrote the play on ordinary paper in his own handwriting, explaining that it was a transcript of what Shakespeare had written. The supposed original document he produced later on, when he had time to inscribe it on antique paper in a flowery hand.”
Really, late 18th century antiquarians? Really? One man, whose theatre would eventually mount the newly rummaged play, wisely observed that no one could doubt the authenticity of the items when they looked so ancient, despite his doubts about the play’s writing style.
You know, the combination of academia and pop-culture (think: Harold Bloom) is repulsive…unless it is the History & Discovery Channels, which I adore. I think this story functions more as the latter, thankfully.
January 10, 2010
Threats to western embassies in Yemen, the collapse of national security policies, Iran resisting enforced outsourcing for uranium enrichment: 2010 will be a feather in the cap of shock-and-awe journalists, I suspect. Those other stories, my little non-abrasive gems, will likely get lost in the bombardment. Hey, when the world is facing Armageddon (for the fourth or fifth time this millennium at last count), there is little time for pansy stuff like history or culture. Fair-trade gems shine nicely enough, but have you ever seen a 32-carat blood diamond dazzle? Don’t let GQ fool you, Porsche-pushing ponces, bigger is better. Shaw said, “We learn from history that we learn nothing from history.” God knows what else the old cad wrote, but this quote fits my context. Thank goodness for short-attention-spanned journalism.
And then you find some wonderful link tucked away at the bottom of the BBC’s main news page, connected to a deserving, under-read ditty. Yes, I forgot: reports can be interesting and (seemingly) insignificant to our world’s political health. The categories are not mutually exclusive. In any case, I found one such link today, when I discovered the article “Neanderthal ‘make-up’ containers discovered”. Calm down. What did you expect them to be armed with…pistol cartridges? chocolates? This is a late Paleolithic species, and Maybelline has been flogging their goods since back when homo first got erectus. Of course the Neanderthals had makeup.
I know you are giddy at the moment, but let us think through this story a bit and remain unbiased by time and our conception of the Neanderthal. [After all, carpet-kept Cleopatra used loads of mascara (reportedly beneficial to her eyes when loaded with silt derivatives), the Picts only have their name because of their (unconfirmed) bodily markings and/or modifications, and the stars of True Blood are covered in gobs of (tacky) white makeup. Why should the Neanderthals have done differently?] Previously, evidence produced in Africa revealed that Neanderthals used a black pigment, manganese, as body paint. New artifacts from Spain indicate that additional pigments and powders were utilized as well. Residual “lumps of a yellow pigment” and of “red powder mixed up with flecks of a reflective brilliant black mineral” were discovered on two shells, apparently the cases for such mixtures. Shells could also be worn as jewellery. It is unclear what purpose this makeup may have served, though scientists guess that it was likely related to ceremonial practices, not necessarily Friday night, let’s-go-out-and-get-laid decoration.
I take your point, though. A Neanderthal face painted in makeup to the point of obscuring the inevitably protruding bone structure would do little to tempt a bitch in heat no less than Candida. But the point that researchers are attempting to make is that the Neanderthal is not the barbarian of our forefathers. They are trying to rescue and reinterpret our cultural understanding of the species, the homo sapiens’ nearest cousin (though they share 99.5-99.9% of our genome, they are not our ancestor!).
These are not the Ice Age cave dwellers of Geico commercials. One study finds that Neanderthals possessed the FoxP2 gene, shared with modern humans but not chimpanzees, suggesting that they may have spoken much as we do. According to other reports, several Neanderthal specimens have been found to have the mutated gene responsible for red hair. Yes, ginger bison-beaters. Which may explain, if you believe another study, part of the reason the species so inexplicably died off. Research suggests that Neanderthal extinction occurred in response to climatic changes. Despite surviving several ice ages, the destruction of forest woodlands seems to have killed this species that hated the cold. So they moved to Spain, the site of the latest known skeletal remains, to soak up some sun. These fair-skinned gingers not only suffered from their inability to deal with the changing, colder climate but probably had major sunburn issues as a result.
It’s just funny to me how much we have in common with our older selves, the earliest homo sapiens. Reports indicate very little (if any) interbreeding with Neanderthals. [Homo sapiens racism: if it looks different, smells different, and eats big-ass mammals, it’s subhuman.] And by cutting down the trees and finding better means of butchering animals, we turned the funny-looking hairy people into pariahs of the ancient world. Even ancient homo sapiens (the tree killers of the Paleolithic Age) contributed to global warming. Ah, trends…
Darwin would argue that, without the appropriate means of adapting to a changing environment, this is nature’s Dear John letter to Neanderthals. Nature itself becomes Hexxus, and the Neanderthal is a FernGully with no fairies (though giants, if you believe Geoffrey of Monmouth) and no happy endings. Going back to Methuselah, we may assume that Nature is our benign life force, testing out species by trial and error to find the perfection. Sorry, that doesn’t offer any comfort to you (secret, hiding) Neanderthals. And it does make you wonder whether we are a success or too-late-to-be-realized error. Damn.
In any case, Neanderthal articles cast off sticky situations with the Middle East and impending elections of 2010. I appreciate them all the more for it. And unless articles like this continue, a law must be enacted to force journalists to choose a new profession. Perhaps Mrs. Warren’s?
[Points if you managed to spot the four G.B.S. allusions.]
January 4, 2010
And here’s why. Elephants (and possibly all creatures, if a new study is representative of greater bestial activities) have been found to possess a structured system of communication that can be interpreted in human terms. This should not be surprising (it’s been done with dolphins before). But since the refined physical rituals performed by the animals seem so similar to our own, whatever we can understand from elephants will certainly be revealing. To be able to hear and to comprehend these animals through our own language, our own understanding: what an amazing thought.
I see the flaw as well as you. We inhabit a certain time and a particular ideology not necessarily shared among the entire population. We are the “west” after all, a strain of humanity different from the “rest,” we are told. So, yes, it is problematic to apply our own customs and emotional tethers to these animals, cute though they are. What gives us the right to categorize and enclose an elephant’s voice, which is the inextricable essence of being this animal?
I don’t know. And I don’t care. At least we are studying them, learning about them, and not killing them. [Researchers estimate that ten percent of Dzanga’s elephants are killed each year for their ivory.] The conclusions drawn may be meaningless. But, as the advert says, every little bit (of knowledge) helps.
Leave that for a second. Let me give you a little background provided by a lovely 60 Minutes report given tonight.
Cornell University sponsors this study, which will eventually produce a dictionary of the language of elephants. You heard me correctly: researchers have interpreted, and continue to interpret, the vocalizations of elephants based on years (decades even) of physical observation. Nearly twenty years (and an expectation of at least fifteen more from the main researcher) have produced astounding results in understanding pachyderm communication. Scientists can now distinguish the various calls of the animals: the cries of protest, the loud rumble that means hey, etc.
Until recently, it was unknown that elephants had a secret infrasonic language. Much of their communication is at a frequency too low for humans to detect. Now the cleverly-titled Elephant Listening Project at Cornell uses computer-based spectrograms to record and analyze these sounds and interpret their meanings as well. It is believed that these low-level rumbles can be heard by elephants over a mile away. Impressive.
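To make the spectrogram idea concrete, here is a minimal sketch of my own (a toy example, emphatically not the Elephant Listening Project’s actual pipeline): synthesize a 14 Hz “rumble,” below the roughly 20 Hz floor of human hearing, bury it in noise, and show that a crude spectrogram still picks it out.

```python
import numpy as np

fs = 1000                                        # sample rate, Hz
t = np.arange(0, 10, 1 / fs)                     # ten seconds of "recording"
call = np.sin(2 * np.pi * 14 * t)                # infrasonic rumble at 14 Hz
call += 0.3 * np.random.randn(t.size)            # background forest noise

# Crude spectrogram: slice into one-second frames and FFT each frame.
frames = call.reshape(-1, fs)                    # rows = frames, columns = samples
spec = np.abs(np.fft.rfft(frames, axis=1))       # magnitude per frequency bin
freqs = np.fft.rfftfreq(fs, d=1 / fs)            # 0..500 Hz in 1 Hz steps
peak = freqs[spec.mean(axis=0).argmax()]         # strongest frequency overall
print(peak)                                      # 14.0 -- inaudible to humans
```

Averaging across frames makes the buried tone stand out plainly in the analysis, even though a human listener standing next to the speaker would hear nothing at all.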
But what I am most intrigued by is the elephant death ritual. After filming the natural death of a baby elephant in 2000, scientists observed that the elder elephants would approach the body and touch it again and again, trying to make the elephant arise. After a long period, they would line up and pass the dead body, a funeral procession for pachyderms. When close to the body, each elephant would touch or smell it and then cry or roar aloud. This period lasted for several days. It is tragic and beautiful. I really think you should see it.
Bob Simon’s report on these magnificent creatures can be found on the 60 Minutes website. Click here to watch.