Presented without comment #15

‘Fear and Loading in Game Journalism’ by Christian McCrea at The Escapist (circa 2007).

Chuck Klosterman’s notorious piece for Esquire wants to know where gaming’s Lester Bangs is hiding, and many people have cited the piece for its assertion that most game writing is stuck describing technology. In response, some other manifestos and declarations bemoan the New Games Journalists for not describing technology enough. However, it is right at the end of Klosterman’s piece that he hits the nail on the head: “If nobody ever thinks about these games in a manner that’s human and metaphorical and contextual, they’ll all become strictly commodities, and then they’ll all become boring. They’ll only be games. … This generation’s single most meaningful artistic idiom will be – ultimately – meaningless.”

When game writing is at its best, it puts play before the game. Gaming doesn’t need a Lester Bangs. It doesn’t need a Hunter S. Thompson. It needs anybody who has the inclination to make simple, human connections between technology and human truth.

‘A planet without Square-Enix’ by Tim Rogers at Kotaku.

In case you’re looking to me for an explanation of what happened, here it is: a man had a brand new video game in his hands, still shrink-wrapped and in a double-taped plastic bag, and he already didn’t care about it anymore. He was already thinking about something else — about The Next Big Thing, which was more or less The Thing That Hooked Him All Those Years Ago, Only Shinier. This is the type of human being corporations like Square-Enix are manufacturing.

‘digital authorship, computers and writing at #cwcon’ by Alex Reid at Alex Reid dot net.

What you could see really happening at C&W was an explosion of Twitter. Twitter was certainly a presence last year, but this year it seemed like there were multiple people tweeting nearly every panel. People chimed in from a distance. Conversations cross-pollinated across panels. What we can see with Twitter at C&W is the possibility for highly productive, real-time digital collaboration. Of course the final product won’t be on Twitter, but Twitter can provide a kind of rhetorical lubricant.

Meanwhile if 10 of us write a book together, maybe it sells 5-10,000 copies rather than each of us writing one that sells a couple hundred. Not as an essay collection of course, because those don’t sell, but as a collective author, some kind of institute or think tank perhaps. I’m not sure. What I am sure of though is that we need a new model of scholarly work and dissemination.

‘It thinks: Some Reflections on Blogging’ by Levi Bryant at Larval Subjects.

The mediums we use are not mere props or tools that we deploy for ends that we already possessed or intended on our own, but rather change us. For this reason, it is better to say it thinks rather than I think. This can be dimly glimpsed in the case of blogging or of comment sections on blogs. It is not that I share my thoughts, and then that others share their thoughts. To be sure, something like that is, of course, going on. But there is also a much more diffuse, distributed mind at work on a blog and across blogs. The others that speak and participate are a part of the thinking. The mind is not so much something in each of these speakers, but rather is that assemblage of participants.

Stereotypical Update Post

Every so often I look over my blog – just skimming the posts – and as I was doing so lately I realised that it’s turned into a whole lot of ‘presented without comment’. So let’s do some good old fashioned blogging. What have I been up to recently?

I haven’t really written a whole lot lately for the blog, partly because I’ve been kept busy – I just finished a draft of a paper I’m presenting in July at the Videogame Cultures and the Future of Interactive Entertainment conference (in Oxford, UK), and I’m actually rather happy with it (P.S. if you’re in the UK and wanna hang out in the first week of July, get in touch!). When I finished writing it I realised that the ideas contained in it are kind of exploding out into a shimmering array of glistening new ideas, waiting to be plucked from the air and placed into a big ol’ PhD chapter. So that’s nice. I now have more ideas about what to write about, hooray!

I’ve also been doing a lot of reading and thinking about philosophy, in particular philosophy of mind, cognition, reality, etc. Mostly this has been for the paper mentioned above (Andy Clark and David Chalmers’ theory of the ‘extended mind’ is super interesting; Paul and Patricia Churchland’s ‘eliminative materialism’ is stellar, even if I do have reservations), but I’m also sinking my teeth into Quentin Meillassoux’s After Finitude, which I’ll spare you the evangelism of, but in my view it’s the most amazing philosophical proof of… reality? Proof of something like that. At any rate, if “proving” reality via accessing an “absolute” (that is, accessing the in-itself as it exists when unafflicted by being given to-us) sounds like rubbish then it’s probably not your thing. But if you get off on thinking about the nature of thought’s relation to the absolute then this amazing text will become your bible.

I bought an iPad recently! It’s pretty neat, no denying. Is it living up to the purposes I bought it for? Absolutely. It’s a fantastic device to hold in your hand while reading medium-to-large amounts of text – it beats the pants off a laptop screen. It’s also been interesting to see how its single-purpose design eases something like the distraction burden on my mind/concentration. When I’m on my iPad I know that I won’t/don’t have to concentrate on subordinating those instinctive impulses to check Twitter or Facebook or my email when they arise (usually at the most inopportune time – right when I’ve hit a wall), because it’s a much harder (or at least, much different) thing to swap out to those things when you do want to check Twitter, etc. Perhaps it’s not even so much that it’s harder, but that the switch is more complete when you do jump over to a different app.

I’ve got a book chapter out! Halo and Philosophy is apparently shipping to some pre-orderers already, and you can get it through your favourite online book retailer. My chapter is called (imaginatively) ‘Halo and Music’ and is a refinement of my undergraduate research. Don’t think that you have to be a Masters or PhD student to get published, ’cause clearly some collections will take just about anyone! So that’s a thing. Still, I’m no Trent Polack.

Lastly, I’ve also been re-thinking my commitment to the ‘no comments’ policy. Do I still think they’re on-balance not worth it for this space? Yeah, I do. But… as ben abraham dot net changes (hello, continuous stream of ‘presented without comment’ posts!) so too should my conception of what this place is for. So I’m thinking (and I’m by no means committed) of opening up comments on the blog section, and making it more ‘blog-like’. The irregular, long-form academic and pseudo-academic writing will remain comment free; however, I may move it over to individual ‘pages’ so that each one can exist on its own as a standalone piece. This is (essentially) what Ian Bogost already does with his website. So what do you think of that idea, dear silenced audience? Let me know on Twitter or Facebook, or email me, if you feel strongly either way about commenting (contact details are on the sidebar). I wonder if I’ve won over any serious fans of my no-comment policy who would object to the change.

 

Presented without comment #14

‘Where The Hell Am I? Some Remarks on the History of SR and OOO’ by Levi Bryant at Larval Subjects.

I think, even where it doesn’t know it, OOO and SR are responding to these ontological transformations where we’re no longer quite sure where we are, what we are, and what’s calling the shots. On the one hand, there’s the looming ecological catastrophe hovering over our collective heads that has the effect of diminishing attention to the signifier, ideology, meaning, and representation and of drawing attention to cane toads, weather events, bees, farming techniques, lawn, energy production, etc. On the other hand, there are all these new strange technologies that we are increasingly melded with in ways that aren’t entirely within our control. These things call forth, beg, plead for a non-correlationist ontology that overcomes the narcissism of humans wishing to treat all of being as externalized spirit. This, I think, is part of why OOO/SR came into being.

‘ALL REAL ATEMPORAL SHIT. NO AUTHENTICITY.’ by Adam Rothstein at POSZU.

Any particular atemporal trend will end up named, stamped into a commodity, and sold, until stretched into a thin veneer of shiny, zombified goo. But that’s okay, because we already have a friend that we met in a comment thread, that can get us that real shit. The Real Shit, because it is the stuff we want and nothing else, and because we’re getting it from the source that we know and trust. That is the network, and that is atemporality. All real shit. No authenticity.

‘Social Media is Ruining Everything’ by Leigh Alexander at Thought Catalog.

All people are defined by the approval, response and input of others in their society, but thanks to social media, individuals can beg to be defined by the digital screams of strangers, of nobodies. They do; they want to. Developing an ‘internet presence’ is part of teenage self-actualization and independence-assertion now. It’s fucked.

Why do I write?

Why do I write?

I haven’t stopped to think about it lately, and I probably should. So I sat down for an afternoon and tried to come up with all the reasons why I write. Here, in no particular order, are the main reasons I write:

 

1. Because I’m reasonably good at it. I started blogging because I’d learnt I had the knack for turning words into sentences and sentences into paragraphs; paragraphs into chapters; chapters into theses. That’s the essence of what writing is. Turning words into something larger.

Words on their own mean something, but the relationship between words when they are placed in order is vastly more important. Much like binary code, in which the significance of any individual ‘1’ or ‘0’ is simultaneously and paradoxically nil and ultimate (the significance coming from its relation to all the 1s and 0s that precede and follow it), so too every word means simultaneously almost-nothing and almost-anything. Their individual significance is minor to the point of being generally interchangeable. Like any binary ‘1’ on a spinning magnetic disk, swap it for any other ‘1’ and the meaning remains the same. Just so, words together can cumulatively enlarge and grow and warp and twist and crackle across the page with such fire and power that it seems as though the very world were enveloped by words!
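
The binary half of that analogy can even be made concrete. Here’s a minimal, purely illustrative Python sketch (the bit string is made up for the example):

```python
# An individual bit means almost nothing on its own; its significance
# comes from its relation to the bits that precede and follow it.
bits = "1101"  # a made-up bit string, worth 13 in decimal

# Swap one '1' for another '1' (first and last positions): any '1' is
# interchangeable with any other '1', so the value is unchanged.
swapped = bits[3] + bits[1:3] + bits[0]
assert int(swapped, 2) == int(bits, 2)  # still 13

# But move a bit relative to its neighbours and the meaning changes.
assert int("1110", 2) != int(bits, 2)  # 14, not 13
```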

The world is not enveloped by words, but one can at least better understand the attraction to philosophy’s near-all-consuming ‘linguistic turn’.

2. Because the act and process of writing helps expose me to my own thinking, and develop my own ideas. When I’m writing a piece and connecting logical dots, and I come to one or more seemingly contradictory conclusions (or, more commonly, am overtaken by a growing realisation of contradiction or confusion), I have to re-examine my premises, or the terms that I am using, or some other aspect of my approach entirely. I have to wonder: what do I really mean here? I have to comprehend my own unarticulated intimations and somehow untangle the mess of connections, as though unpicking a snarl of many twisted wires.

It’s kind of like the Socratic Method for the solo cogito: a lone dialogue with yourself by way of externalising thoughts through words.

3. Because I like the way a particular turn of phrase or a particular use of words can make me think in a completely new direction. Take, for example, the following completely functional sentence:

Leaving food in my bedroom attracts rats and cockroaches.

There’s absolutely nothing wrong with that sentence whatsoever. It contains four nouns – ‘food’, ‘bedroom’, ‘rats’ and ‘cockroaches’ – along with the gerund ‘leaving’ and the transitive verb ‘attracts’. It carries the meaning efficiently and effectively, warning readers not to leave food lying around in my bedroom unless they want to encourage rats and cockroaches. Now take the following modification of that sentence.

Leaving food in my bedroom attracts vermin.

All I have done is substitute the word ‘vermin’ for ‘rats and cockroaches’. The difference from the first sentence is both subtle and profound. For starters, it has at once simplified the sentence, reducing the word count and cutting the nouns to three – ‘food’, ‘bedroom’ and ‘vermin’. It has also changed the scope of the statement, increasing the range of the implied admonishment to encompass the entire category of creatures that are considered pestilent. Even further, the word ‘vermin’ brings with it connotations of disease. Now, instead of our imaginary food attracting merely two species of pest, it attracts a whole lot more. Wikipedia’s page for vermin discusses the word’s scope:

Disease-carrying rodents and insects are the usual case, but the term is also applied to larger animals—especially small predators—on the basis that they exist out of balance with a human-defined (desired) environment… Pigeons, which have been widely introduced in urban environments, may be considered vermin

There is so much more possible meaning to be drawn from the second sentence than the first: now a reader’s mental image of the consequences of leaving food in my bedroom includes a virtual menagerie of all types of vermin; alongside the rat on the side-table and the cockroach on the plate there is now the pigeon that flies in my window to nibble on leftover crumbs, the mouse nibbling on some mince, and any other ‘vermin’ the reader’s imagination might conjure up. All this from using one word instead of two (well, three if you count the conjunction ‘and’).

4. Because words are the things that grant me access to ‘things’. Using new words gives me access to new things: everything from thoughts and emotions to new words for composite activities and entire processes. As a process of discovery it’s exciting to be able to attach a word to something that was previously indescribable, held only in the mind as a vague miasma of thoughts, actions or emotions. Try to conceive of something that has no word (or group of words) for it, or something that you don’t know the word for, and what results? A vague sense of wrongness, uneasiness, a sense of indeterminacy, and a reliance on broad, childlike strokes in attempting to describe it in an inevitably not-quite-right way.

Take a word like ‘thanatosis’, which roughly means the act of feigning death in an animal, usually as a reflex action. Sure, you could always just describe that as “the act of feigning death in an animal, usually as a reflex action”, but having a word-tool available gives it a sense of coherence, a unity. This is a phenomenon, it exists, whereas before all we had was a compound series of words/sentences. It’s a relatively powerful aid to thought.

5. Because writing is non-literal (or doesn’t have to be literal). It can be allusive as well as functional; persuasive as well as descriptive; figurative as well as useful. Computer code is functional in that it does things, and this makes understanding what a piece of code does inseparable from understanding what it is. An IF/THEN statement is exactly what it does, quite unlike languages and writing, which hold a non-linear, indeterminate relationship between what a unit of writing is and what it does.

Supple.
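
To see the contrast in miniature, here’s a purely illustrative Python sketch (the variable and condition are made up for the example). The conditional has no meaning beyond what it does when run, while the sentence it prints – borrowed from point 3 above – can be read literally, figuratively, or as a warning, depending on the reader:

```python
# An IF/THEN statement is exactly what it does: there is no gap
# between reading this code and knowing its behaviour.
food_left_in_bedroom = True  # a made-up condition for the example

if food_left_in_bedroom:
    # The code's meaning is exhausted by this action...
    print("Leaving food in my bedroom attracts vermin.")
    # ...but the sentence it prints remains open to interpretation.
```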

6. Because writing can be its own reward! Thus, if my words change the world, so be it. If they do not, so be it.

7. Because the end result of writing (having a piece of writing, contrasted with not having a piece of writing) is something that I can point to and say ‘That is something; something that I have made and that reflects something about me, be it my character, my prejudices, my perspective, my limitations and boundaries, my insights, my vocabulary, my speech-thought patterns, my philosophical predisposition, my proclivities and peccadilloes, or my command over my very own thoughts.’

8. Because writing is communication and I am hungry to communicate – to reach out and touch other people.

To understand and to be understood is a deeply powerful, even sacred, relationship. Comprehension is both skill and choice; as a skill, it’s one that many people seem to lack, but one that can be developed.

If writing is practice at comprehending myself, then reading back over my own writing can be practice at comprehending myself as comprehended by someone else.

9. Because writing is technical in that there is a right way and a wrong way to do it. Words have correct spellings (leaving aside differences between regions) and grammar is essentially a semi-rigid system of rules. Oftentimes there are good and better ways of writing (particularly when writing with a purpose or audience in mind), but there are also right and wrong ways. That is a comfort.

10. Because writing can do amazing things, as well as be amazing. It can do art as well as be art. Out of the same ‘stuff’ is fashioned the most withering critique of the vapid artist and the most fantastic exploration of the character of 1920s Parisian expatriates.

 

Further reading: Lyndon Warren’s ‘Wittgenstein, Games, and Language’, and Philosophy Bro’s excellent ‘Wittgenstein’s “On Certainty”: A Summary’.

Presented without comment #13

‘Faulty Towers: The Crisis in Higher Education’ by William Deresiewicz at The Nation.com.

What we have in academia, in other words, is a microcosm of the American economy as a whole: a self-enriching aristocracy, a swelling and increasingly immiserated proletariat, and a shrinking middle class. The same devil’s bargain stabilizes the system: the middle, or at least the upper middle, the tenured professoriate, is allowed to retain its prerogatives—its comfortable compensation packages, its workplace autonomy and its job security—in return for acquiescing to the exploitation of the bottom by the top, and indirectly, the betrayal of the future of the entire enterprise.

‘Amateur vs Indie’ by Andrew Doull at Game Set Watch, circa 2008.

The gaming press is conflating two trends in game development into a single category that they label the Independent Game. The first is commercial oriented, casual, independently produced games by people attempting to make a living from writing and designing games without committing to a publisher. These I’m happy to call Indie Games, and they operate much in the same way that the independent labels in the music industry, or independent studios in Hollywood.

The second is subversive, modded, copycat, patched together from pre-built parts, non-commercial or anti-commercial. Amateur game development is done by people who are scratching an itch, who can’t not write computer games, who want to see their ideas in pixel form ahead of trying to generate a return.

‘On Object Orientation: An Antapologia for Brian Moriarty’ by Abe Stein at MIT’s GAMBIT lab.

Designers have grown attached to the perception that they are creators of artifacts. In truth the act of game design is more like composing a musical score or choreographing a dance; the “object” of the creation is not fully realized until it is engaged through performance.

I have a robot’s voice

Because Skype + my only computer microphone are an exceptionally awkward combination.

Listen to my verbal wanderings over at the Pop Matters Moving Pixels Podcast. It’s easy to forget not everyone else knows all the “history” I do. Do I even know history? I know something like history. That’s another question for another day.

Thanks to Chris Williams and the Moving Pixels crew for having Eric and me, and a special thanks for asking such interesting questions.

Presented without comment #12

‘How the Internet Gets Inside Us’ by Adam Gopnik at The New Yorker.

…if you stretch out the time scale enough, and are sufficiently casual about causes, you can give the printing press credit for anything you like. But all the media of modern consciousness—from the printing press to radio and the movies—were used just as readily by authoritarian reactionaries, and then by modern totalitarians, to reduce liberty and enforce conformity as they ever were by libertarians to expand it. As Andrew Pettegree shows in his fine new study, “The Book in the Renaissance,” the mainstay of the printing revolution in seventeenth-century Europe was not dissident pamphlets but royal edicts, printed by the thousand: almost all the new media of that day were working, in essence, for kinglouis.gov.

‘Corporate Rule of Cyberspace’ by Slavoj Žižek at InsideHigherEd.com.

…whether it be Apple or Microsoft, global access is increasingly grounded in the virtually monopolistic privatization of the cloud which provides this access. The more an individual user is given access to universal public space, the more that space is privatized.

…Partisans of openness like to criticize China for its attempt to control internet access — but are we not all becoming involved in something comparable, insofar as our “cloud” functions in a way not dissimilar to the Chinese state?

‘Fool Me Once’ by Agent Orange at FutureBook.net.

If there is any organization on the planet with a real understanding of the value of the e-book market it is Amazon and the fact that they are honing in fast on publishers’ territory is the clearest possible indicator there could possibly be of the viability and potential buoyancy of the publishing business.

The conglomerates’ dog-in-the-manger approach to the issue of e-books is doing them no favours. The day is fast approaching when a truly major international author will realize they are going to be greatly better rewarded by being published by Amazon because they will offer them a sensible share of the revenues they generate.

A response to Dan Cook’s “blunt critique of game criticism”

NB: Since posting this piece the original post in question has been edited to remove many of the phrases I initially took aim at. As such I now feel like the fool flailing away at thin air, so thanks for doing that to me, Dan. It is, as always, his prerogative, but it’s also exactly what I was talking about in the final post-script. I leave my original response as-is.

NB 2: My attention has been drawn to this copy of the original version of Dan Cook’s essay, which is important context for reading this response.

This is a response of sorts to Dan Cook’s self-described “blunt critique of game criticism”. If you haven’t read it yet, the rest of this post is likely to make little sense, so please go read it first to get the full context of my remarks.

What I’m going to attempt here is to exercise what Wayne C. Booth termed ‘Listening Rhetoric’. In his book The Rhetoric of RHETORIC, Booth described listening rhetoric like this:

When LR is pushed to its fullest possibilities, opponents in any controversy listen to each other not only to persuade better but also to find the common ground behind the conflict. They pursue the shared assumptions (beliefs, faiths, warrants, commonplaces) that both sides depend on as they pursue their attacks and disagreements. (The Rhetoric of RHETORIC, p.10.)

So, to help make sure I’m not misunderstanding it, here’s a summary – my take on what his piece is about, laid out in the simplest fashion possible:

  • Dan Cook feels passionately about game criticism (as do I)
  • But he has a very particular view of what does and does not constitute good criticism (one that I do not and cannot share)
  • He doesn’t see enough of what he considers good criticism, and is understandably frustrated

But frustration does not excuse the fact that he’s said a lot of things in an offensive and condescending manner – things that are only true if one shares Cook’s underlying premise. Disagree with Cook on the point that criticism has to be useful for “[improving] future games” and his critique falls down like a house of cards.

He’s not helped by the fact that there is the stench of the troll around the piece. I have tried extremely hard not to take offense at Cook’s post – to exercise ‘listening rhetoric’, as it were – particularly given how civil he’s been to me on Twitter. His comments on the post are practically conciliatory, but his position remains narrow-minded and dogmatic. So confident is he in this position that he seems unwilling to extend the courtesy of listening rhetoric to his critics.

I think it is an offensive post. Whether you want to call it a rant or ‘hyperbole’ or a ‘revisable draft’ or whatever else, it retains a number of assumptions and cranks them out to their logically unsound conclusion. So here’s my “response” or “feedback”, for what it’s worth: Cook’s article is condescending, inaccurate and unfair. His response to others’ reactions leaves me with the strong sense of both a hypocrite trying to wriggle out of it and a dogmatic priest trying vainly to hold onto a dying faith.

 

It’s condescending and he should be ashamed.

According to Cook, the vast majority of games criticism is “a waste of [his] time as a game developer”. Similarly, the experience of games had by critics (i.e. anyone not like Cook, who is both writer and developer) is “impoverished”, because those of us who have never correctly loaded a C++ header file “know little to nothing about the philosophy and process of making games”. The charge of wasting his time is patently absurd because, frankly, this is the internet.

The second charge is equally absurd but harder to demonstrate. In the comments section, numerous examples have been given that highlight why knowledge of the process of creation should not be made into a standard to measure critics against (leaving aside the equally absurd notion that the ‘philosophy of making games’, whatever that is, needs to be known to make any kind of critique). Cook has been presented with arguments for this, notably by Fraser Allison, to which his response is extremely telling.

We’ve already established that Cook’s piece assumes incorrect things about the nature and purpose of game criticism (specifically: that its purpose is to help developers produce better games), and yet, having had that pointed out to him, he still wants to assert that he’s not “putting game criticism in a box” (which is precisely what he’s doing – that’s what a claim about the nature of a thing does), and that he’s just “asking [us] to be better at [our] job” (which is another offensive assumption, since he clearly doesn’t understand what our jobs are).

Consider an imaginary line of text taken from a manifesto I wrote in a parallel universe. In this manifesto I am heatedly entreating game developers to be better at their jobs, which is (naturally) making more games like Far Cry 2. Far Cry 2 is clearly the pinnacle of game design (for me), so anything else you could possibly do is a waste of my time. Please developers, stop wasting my time. Imagine how cross Dan Cook (and hundreds of other developers!) would rightly be if I wrote that and sent it around the developer circles in search of “feedback”.

 

It’s inaccurate and he should be ashamed.

While calling for “clarion clarity” in game criticism, he relies upon a straw-man argument. While railing against the fact that much criticism is “useless” to him as a developer, he goes on to concede that “most writing is by gamers for gamers”.

I wish there were a better way of putting this, but: no-fucking-duh, Mr Cook, they’re not writing for you as a developer. Cook knows this; he clearly understands it (otherwise he wouldn’t be lamenting it!), yet for whatever reason he still decides to turn these innocents into an enemy. How does that aid clarity? Is this not just wilfully misconstruing the point of a whole field of work that you find wastes your time? Is this anything less than bordering on trolling? I turn his own words against him: “you can do better.”

 

It’s unfair and he should be ashamed.

Cook attempts to diagnose the malaise currently afflicting game criticism – and it is genuinely afflicted; part of the reason I was so ecstatic upon first sight and skim of the piece was that I thought it was going to be a sister piece to my ‘Rhetorical Questions’. But on the way to his diagnosis Cook gets lazy, and elides some of the logical leaps he makes. Observe the following passage:

…most game criticism suffers from an immense lack of hands-on knowledge about what it takes to make a competent game. In the past week of essays on Critical Distance, I found 1 writer of 12 had any declared experience making games.

Cook is not stupid: he’s added the caveat “declared”, as he knows very well that it’s entirely possible for a person to possess the very “hands-on knowledge about what it takes to make a competent game” and still fail to produce useful criticism. He says as much himself: “I have a friend who makes games, but publicly writes gamer-esque drivel.” But hold on a second there, Cook – where’s the fairness in directing a screed like this only at critics, given that there are numerically more developers well placed to become critics (according to your standards) who simply aren’t doing so?

Furthermore, given that ‘being a developer’ is certainly no guarantee of the ability to write useful criticism, what then are we to make of the following statement?

If you are writing about games in language that suggest intelligent analysis, state upfront in your bio or perhaps even at the start of the article your perspective and experience.

Forgetting for the moment the problematic notion of possessing a never-changing ‘perspective’, why should we do this? What would be the point of outlining one’s developer credentials if that is still not enough to guarantee good criticism? Why even bother? Surely Cook knows that good criticism is either evident within a piece or it’s not, in which case, what does it even matter what experience a critic has as a developer? Could you, reader, even guarantee 100% that the writer of a “good” piece of criticism has had game development experience? Of course not – statements of “authentic experience” are empty and pointless, contributing to little more than the pointless goal of one-upmanship. “Oh yeah, well I have even more experience developing games than you, so my critiques are even more valid.” Cf. this Monty Python sketch.

So we’ve seen that Cook’s position can be summarised as holding to the following two points:

  • Being a developer is necessary to write good criticism;
  • Being a developer is no guarantee of being able to write good criticism.

 

Cook gives some examples of what he’s advocating at the end of his article (vetted for candidates with real game design experience, one presumes!). Amongst them is one AJ Glasser, former journalist/critic. From what I gather, Glasser left journalism for development sometime after 2009, and the article Cook is citing is from 2011. So in the space of a year (I’ve been informed it’s even less), and with who-knows-how-much hands-on actual game design experience (has she shipped a game? Has Cook checked?), she’s passed from being an utterly useless and time-wasting critic (Cook’s assertion! Remember, no development experience means no good criticism!) to a useful developer-critic. It’s a stretch to believe.

I’m using a definition of critic as distinct from over-enthusiastic commenter, and one that Cook seems to share: a ‘critic’ in this sense is not any old Tom, Dick, or Harry that comments on an IGN post or has a Destructoid community blog. A critic is someone that gets linked on Critical Distance. This type of critic may be interested in thinking and writing about games from any number of perspectives, not just a technical or design perspective, and should be able to do so without fear of being labelled ‘a waste of time’.

But Cook protests! “Games have a functional heart that resists being reduced to the softest of sciences in the same way there is little room [for] ‘rock criticism’ in the practice of geology”, and further adds: “Games have more in common with functional works involving mathematics, psychology, governments, economics or other complex systems.” My favourite thinker at the moment is Bruno Latour, and in his book We Have Never Been Modern he describes the very movement Cook is attempting to make: it’s none other than the attempt made by the (failed) project of modernity.

The problem with Cook’s assertions is that games are no more pure “science” than they are pure “human construct” (in the sociological sense), and no more than they are mere “text” (in the humanities sense). Latour calls things like games ‘quasi-objects’ – not quite objective enough to be entirely the purview of science, not quite relative enough to be the mere products of human perception. Here’s Latour:

Quasi-objects are much more social, much more fabricated, much more collective than the ‘hard’ parts of nature, but they are in no way the arbitrary receptacles of a full-fledged society. On the other hand they are much more real, nonhuman and objective than those shapeless screens on which society needed to be ‘projected’. (We Have Never Been Modern, p.55)

I’m sorry to say, but you cannot merely wish away the ‘soft’ elements of games, whether you want to or not (I suspect Cook, if presented with the question directly, would answer ‘not’, despite actually arguing for it in the piece). There is and will always be an element of games that studying and critiquing “the object in itself” will never reveal, and in one wave of the hand Cook seems to both acknowledge this and dismiss it as irrelevant to his position as a developer.

As has been pointed out in the comments section (by developer-critic Darius Kazemi no less), outlets for criticism like The Border House are vital. Injunctions against criticism targeting anything but the ‘hard’ parts of a game will always marginalise legitimate and worthy concerns about things like discrimination, racism, sexism, etc, etc. And that’s without getting into the importance of reception and reappropriation.

Yes, Cook rightly asserts that designers have a lot of knowledge of how games are received (probably even more than we often give them credit for), but they can never have complete knowledge either. In all the years of development that went into Far Cry 2, Clint Hocking never dreamed someone would turn his game into an experimental exercise in player-imposed permadeath, with a machinima novel documenting it.

Reception and audiences matter, and I respectfully disagree in the strongest terms with Cook’s following comment:

I would also like more people to write about games in a way that moves game development as opposed to game playing forward. That’s me being selfish.

That’s certainly his prerogative, but as a player and critic I think there is still not enough being done to actually change the way players play. Taken together with the tone of the rest of the article (think “waste of time”, etc.), the impression one gets is that Cook holds the (unstated) assumption that criticism useful to development should be privileged over criticism of play. Developers are working every day to push game design in new directions (or I assume they do? Most are not aiming to cynically cash in on rehashed and recycled ideas, surely?), but who or what do players have to challenge their assumptions about play, the ways they play, and the purposes their play can have? They have criticism; that’s what they have. Cook can bemoan the lack of discussion amongst the game developer fraternity till the cows come home, but ultimately game design will keep chugging along so long as there is money to be made. Players and critics do not have to be unwitting, ignorant, and slavishly thankful accepters of whatever received piece of game design developers want to dish out. But neither are players and critics the mightily empowered, all-powerful adjutants Cook seems to be afraid they are.

The balance is not right (and I have said as much myself), but my answer to the imbalance is not to advocate doing away with the kind of criticism that has no direct ‘utility’ for a developer; rather, it is to redress the imbalance directly. Dan has been told this by others, so I won’t labour the point. I can only hope, however, that he has taken it to heart.

Dan asked for feedback and I have given it. My advice to him would be to reconsider whether he’s actually interested in ‘criticism’ at all: a practice more like art, one that does not depend on a utilitarian purpose. If he is, he should cut his losses and start again. Rethink his premise. What variables are actually involved in the production of ‘criticism’? Don’t write in anger or out of frustration.

P.S. I’m calling bullshit on the positioning of Cook’s piece as a “draft”. Drafts don’t get released to the public. In my view, he’s merely trying to push buttons and, judging by the reactions, it’s working. If he’s going to put his words out there, he should have the courage to damn well stick by them. Rants don’t get revisions.

Presented without comment #11

‘Is the squeeze worth the juice?’ by Glen Fuller at New Matilda.

The hyperbolic vitriol of some of the commentators about the case currently underway is a bit disconcerting. Chief among them is Mike Masnick, TechDirt.com CEO and editor, who has consistently represented himself as an evangelical apologist for the worst excesses of digital capitalism in his attempts to understand new business models. He does, however, isolate one crucial point:

“Can you imagine the impact on the internet as a whole if Tasini actually won? It would basically uproot the entire concept of the internet. Any site that involved user contributions would have a massive liability.”

I am sure I wasn’t the only person who laughed a little at Masnick’s suggestion that the “entire concept of the internet” was at stake (is the dream of the ARPANET creators or Tim Berners-Lee under threat?). What is actually under threat is the ability to use the internet to induce users into surrendering their labour for free.

‘Chapter 6: In which Ralph is called Ralph’ by Mark Dapin, in “Sex and Money” (2004), Allen and Unwin, p.103.

Howard counterfeited political capital by characterising the whole Keating political culture as ‘PC’, and smeared his attempts at reconciliation with the noxious grease of commonsense racism. The subtext to the sneers was always, ‘Nobody really likes Abos/slopes/queers – so why pretend?’ The campaign against political correctness became a One Nation policy platform.

‘Dark Clouds’ by Scott Juster at Experience Points.

As more and more of our lives move into the cloud, it’s likely that outages and problems will become more crippling despite likely being less frequent. In other words…the ubiquity and reliability of on-line storage will continue to grow, but with so many more people, devices, and services relying on the Internet, any problems that do crop up will be extremely inconvenient, if not devastating.