As someone who has been a gamer since I first laid hands on the original IBM Personal Computer in primary school, I often ponder the question of what actually makes a game “fun.” The topic recently popped up on QT3 as a loosely related tangent in a discussion on whether or not gaming can be an addiction. One particular poster made an observation so insightful that I simply need to put it here:
What games actually do, imho, is give you sheer, unadulterated happiness.
How? The reason is simple. A psychologist called Mihaly Czikzhentmihalyi (sp?) discovered (I think in the 60s and 70s) through extensive questionnaires with statistically quite large samples, the secret of ordinary human happiness, and it’s laughably simple – basically, if you go through life setting measurable goals that are just outside your comfort zone to attain, and then attain those goals, and then move on to pick a new, slightly higher-level goal, etc., etc., etc., you will be happy.
It’s exactly this progression of increasing powers and ever-increasingly-difficult goals that games give you in a miniature, abstract form, and that make them so addictive – little jags of happiness as you set and attain mini-goals, constantly excelling yourself in skill, the attainment of lewt, the discovery of new stuff, etc., etc.
Of course, theoretically, we should all be getting that kind of happiness in real life, through our careers, family, etc., and most of us probably do, but sometimes life isn’t so forthcoming, things are difficult, and it’s nice to have a happiness-producing substitute.
In response to a question from me, he posted a link to the Wikipedia article about the research he cited, which is here. Pretty interesting stuff, no? I hope to expand on this later.
This post is an expansion of comments that I’ve made in response to posts made in De Minimis. In a way, it seems odd that I would need to make this post at all. After all, everyone instinctively feels that becoming wealthier is a good thing, right? So what possible arguments might one advance to claim the opposite? There are many levels to the critique made in De Minimis, and in his defense, he appears to acknowledge that this is a train of thought that is still in the making. Still, as I understand it, the argument against economic growth falls largely into the following two groups:
- Economic growth is bad for the environment and depletes the Earth’s finite resources in an unsustainable manner.
- Striving for material wealth may not necessarily bring about the desired happiness, and the stress and conflict this causes may actually turn out not to be worth the struggle.
Continue reading Economic growth is good
A while back, I blogged about how philosophy is embracing empirical experiments. A pair of experiments, one by the University of Minnesota and the other by the University of British Columbia, makes for a great example of this. Both experiments had the same aim, to examine what effects belief in free will has on human morality, and were similarly structured. The experimental subjects, mostly college students, were separated into two groups. One group was given a text to read that expressed skepticism on the subject of free will, arguing that human actions and decisions were mechanistically determined by a variety of genetic and environmental factors. The other group was given either a neutral text, in the case of the first experiment, or a text that explicitly endorsed and defended free will, in the second experiment.
After reading the texts, the students were given the task of completing a test. In both experiments, the students were given the opportunity to cheat on the tests, while being erroneously led to believe that their cheating would not be detectable. The results were that students who were given texts skeptical of free will were more likely than the others to cheat on their tests. The researchers wisely caution against reading too much into these results, but at first glance, they appear to confirm concerns that advances in our understanding of how our minds work have far greater long-term ethical implications than the more publicly known worries over genetic engineering and nanotechnology.
Quartertothree regular and EA producer Jim Preston tackled this very question recently in a thoughtful essay on Gamasutra that’s worth reading for anyone seriously interested in video games or in the question of what constitutes art. He claims to have been inspired by two things: freelance game reviewer Tom Chick’s review of Bioshock, which answered the question simply by saying, “Games are this,” and renowned film critic Roger Ebert’s review of the recent Hitman film (based on the video game series of the same name), in which he boldly claims that video games will never become an art form.
You really do need to read the full essay to appreciate it, but Preston basically argues that it’s pointless for video games to aspire, through reasoned debate, to the status of Great Art as it is popularly conceived. Instead, he argues that things become art by gradually sublimating into the consciousness of the mainstream and acquiring a revered status in the minds of the people who like them. Eventually, those people will place the work in a context, as in a museum or a concert hall, in which it becomes publicly acknowledged as art.
The importance of context in interpreting whether or not something is art is reinforced in an intriguing story that Preston references. On the morning of January 12, 2007, the Washington Post organized a little experiment. They arranged for Joshua Bell, one of the world’s greatest living violinists, to play six classical pieces representing perhaps the greatest musical achievements in Western culture on his priceless 1714 Stradivarius violin in the L’Enfant Plaza metro station in Washington for 43 minutes. Hidden cameras and reporters for the Post carefully recorded the reactions of the passersby of the morning rush hour. Out of the 1,097 people who walked by during that time, only two truly recognized the quality of what they heard, and only a handful of others stopped what they were doing for a few moments to listen. Bell earned a total of $32.17 in tips, excluding another $20.00 from the one person who recognized him. The irony, of course, as Preston points out, is that Bell is the kind of performer who can earn $1,000 a minute by playing in the right context to the right audience.
The Telegraph recently published an extract of an interesting new book by Damian Thompson on what he calls counterknowledge: wacky ideas and theories that are unsupported by empirical evidence, but are believed by many and, thanks to modern telecommunications technology and the internet, are flourishing as never before. Some of the examples cited include: the conspiracy theory that the Bush administration planned and executed the 9/11 terrorist attacks; the claim that the plot of the popular novel The Da Vinci Code, in which Jesus and Mary Magdalene sired a dynasty of Merovingian kings, is true, and that the Catholic Church knows about this and is covering it up; and the fatwa issued by Islamic leaders in northern Nigeria stating that the polio vaccine is really part of a U.S. plot to sterilize Muslims.
In many cases, you might think that the theories spouted by these cranks are harmless enough, except that sometimes they’re so widely believed that they cause serious harm, such as the spread of polio caused by parents who now refuse to vaccinate their infants. And when even a minister in President Sarkozy’s new French government, Christine Boutin, remarks that it is possible that Bush might have had something to do with the 9/11 attacks, you realize that this isn’t a problem confined to poor countries or uneducated people. I can think of plenty of other examples, such as South African president Thabo Mbeki’s reluctance to believe that the HIV virus is the cause of AIDS; the formerly popular idea that the 1969 moon landing was faked in a movie studio; and, yes, even Chinese beliefs in qi, feng shui and any number of other superstitions.
As the extract notes, there are a number of reasons why such beliefs can take hold, including encouragement by Left-wing multiculturalists, who insist that minorities have the right to believe in things that are patently untrue, and by postmodern philosophers, again usually Left-wing, who insist that science and technology are products of Western culture and are not objectively true. In the developing world, opposition to Western science is seen as opposition to the political, intellectual and scientific elite of the Western world and as a way of upholding the dignity and validity of their respective cultures.
In response to the conspiracy-minded, I refer to the now familiar quote by Robert J. Hanlon: “Never attribute to malice that which can be adequately explained by stupidity.” As my recent viewing of a particular South Park episode indicated, believing that the Bush administration orchestrated the 9/11 attacks means ascribing an almost superhuman level of competence to President Bush and his officials. As for those who argue that science is a cultural phenomenon that is inherently Western and therefore not objective, I say that reason, logic and the scientific method are the best tools that humanity has to discover the truth and they belong to all mankind. As Ayn Rand would say, our reason is the very thing that makes all of us human. Only a fool would deprive himself of science’s benefits just because someone else got it right first.
What is the nature of the guilt that your teachers call his Original Sin? What are the evils man acquired when he fell from a state they consider perfection? Their myth declares that he ate the fruit of the tree of knowledge – he acquired a mind and became a rational being. It was the knowledge of good and evil – he became a moral being. He was sentenced to earn his bread by his labor – he became a productive being. He was sentenced to experience desire – he acquired the capacity of sexual enjoyment. The evils for which they damn him are reason, morality, creativeness, joy – all the cardinal values of his existence. It is not his vices that their myth of man’s fall is designed to explain and condemn, it is not his errors that they hold as his guilt, but the essence of his nature as man. Whatever he was – that robot in the Garden of Eden, who existed without mind, without values, without labor, without love – he was not man.
– Ayn Rand in Atlas Shrugged
[This is Part 2 of a planned 3 part series on Ayn Rand and her philosophy and its influence on my life. You can read Part 1 here. This part covers some of Ayn Rand’s early life and details more of her philosophy and how it directly influenced my personal development.]
In many ways, Ayn Rand’s life showed a determination, even an obsession, as strong as that of any of her fictional characters. Born in 1905 to a middle-class family in St. Petersburg, Russia, she witnessed firsthand the horrors of communism when her family’s pharmacy was seized by the Soviets in the revolution of 1917. At the University of Petrograd (the city’s new name, given by the Soviets in place of St. Petersburg), she studied history, including American history, and became an admirer of American ideals. In 1925, she finally received permission to travel to America, on the pretext of visiting relatives, but by then she had already decided never to return to Russia.
Continue reading Ayn Rand and Me (Part 2)
A thought-provoking article entitled “The New New Philosophy” published recently in the New York Times Magazine covers a recent trend among philosophers to embrace empirical experiments. As the article notes, philosophers have traditionally tended to be rather snooty about doing actual empirical work, preferring to think of themselves as pure workers of the mind who need nothing but pencil and paper and a comfortable armchair.
Of course, philosophy has tried to be more scientific before, notably in postmodernism’s ridiculous efforts to dress up its nonsense in scientism, in order to steal back some of the glory and respect that philosophy has lost since natural philosophy became science. But in my opinion at least, this new acceptance of empiricism is much more likely to yield interesting results. As I alluded to in my review of Irrational Man, philosophers have tended to assume that mere reflection is sufficient to reveal the secrets of the human psyche, while ignoring how discoveries in neurology have allowed us to examine in ever more intimate detail how the brain really works.
The article offers the opinion that while empirical work may raise interesting new questions for philosophers, it would not be able to settle them. I think that this is somewhat over-simplified. At the very least, objective knowledge of the processes that drive emotions, intentions and thought itself would seem to invalidate many lines of philosophical inquiry. For example, Ayn Rand believed that reason should precede emotions, such that we should use reason to determine how we should feel and then adjust our feelings accordingly, while neurologists now believe that feeling itself is part of the reasoning process. Similarly, many philosophers believe that existential angst is indicative of a great gap in human existence that must be filled either by religion or by some other ideal. Perhaps it is only indicative of those philosophers’ lack of access to antidepressant medication.