The everyday blog of Richard Bartle.
9:49am on Saturday, 9th June, 2018:
Thursday and Friday this week I was at the post-GGC workshop organised by the Higher Education Video Game Alliance. From listening to people talk about how computer games are taught at their universities, and chatting to them about the way forward, it strikes me that there's a lot yet to do in this area.
So, the usual scenario is that computer games are not considered a natural fit by most universities, and in order to set up degree schemes in the subject it has to be obvious that they will attract money (in the form of undergraduates). The way this idea is typically sold to skeptical university high-ups is that the video games industry is rapidly-growing and in need of qualified personnel (which is indeed true). This argument, in combination with the availability of a ready-made IGDA course template and the recruitment of practitioners from games companies to provide subject matter expertise, means that the degrees that are developed end up being very industry-focused.
They're more than that, though. Compared to other subjects taught at university, they're ridiculously industry-focused. Creative Writing courses are not designed around the skill sets that book publishers demand of authors. Screenwriting courses are not designed around the skills that the film industry said it needed in a consultation exercise. There's a relationship between what's taught and the circumstances of the industry, sure, but these remain degrees which are more about education than training. Games degrees seem to be mainly about training.
They're very specialised, too. Computer Science departments take input from industrial advisory panels, but they don't have people spending three years learning how to program in one language, using one set of tools, so they can go for that one kind of programming job. It would be over-specialisation. Games degrees do pretty well just that, though.
There's also a big thing made of portfolios. Students should have a portfolio of work that can demonstrate their skills and creativity to prospective employers. Now there's no problem with this if the work is the product of one person's endeavours (as it might be for an artist, say). Most universities have their students design and develop games as a team, though. This leads to more impressive, sophisticated and complex games, but it also makes it hard to unpick the contributions of individual team members. If you're one of three programmers on a team, a prospective employer is going to have to interview you to discover what you did. The very point of a portfolio is to find out who's actually worth interviewing, not to provide subject matter for an interview (although it can lead to that). Also, employers aren't going to look at more than one, possibly two, portfolio entries anyway; they don't have the time. They'd rather play one game that a student made on their own than half a dozen games they made with other people. The portfolio requirement seems to be locked in, unquestioned, though.
This is at undergraduate level. At postgraduate level, the tables turn completely.
To set up a games degree, you need someone to teach it. As I said, some of these can be recruited from industry. These individuals are not going to have research experience. They're not going to have teaching experience either, of course, but that's less of a problem as they're subject experts and will at least have been on the receiving end of teaching at some point in their lives. They're going to have to learn to do research, which is possible if their university allows it but it's not what they're paid to do.
Researchers with a more academic background will also do teaching, and it could be very technical in nature; they are unlikely to research what they teach, though. This is because they got their PhD studying something not industry-facing, supervised by a professor with no industry experience. It's entirely possible that no-one in a department researches anything that the department teaches at undergraduate level. OK, so there may be a Games Studies module or a Serious Games module, but they're peripheral. Most of the modules will be skills-based and concern topics of little research interest to researchers.
Scandinavian universities do have a component known as a Bachelor's thesis. This involves doing some actual research, which might be sort of technical (comparing various game engines with each other) but is often not (looking at how some favoured minority group is represented in games). Some of this does align with the research interests of the faculty members, and it can act as a way to identify potential Masters and PhD candidates for those universities that have programmes for them.
This isn't research that people in industry would say they wanted if you asked them. They'd want better AI or faster ways to push polygons or better project management methods or some way to do fast procedural generation. The academic response is that industry doesn't have the breadth of knowledge to know what it wants, and anyway it does take on board research that it might not have asked for but found valuable anyway (it turns out that making your game characters more diverse really does sell more games).
Wait a moment, though. How come universities are willing to turn themselves into the training arm of the games industry when it comes to undergraduate work, but have next to no interest in what industry's requirements are for research? Shouldn't they be consistent? Either do research that industry needs, or teach what is academically more interesting? There's a disjunction here.
Other academic departments also have such disjunctions. If no-one is taking Biology any more, you launch a Sports Science degree and watch the undergraduates flood in. Some will be taught by professional sports people employed as adjuncts and some will be taught by academics. It doesn't matter that this particular academic's main interest may be plants from the Amazon: the aim of Sports Science is to act as a cash cow to support the "real" research. Chemistry departments that have a Pharmacy degree often do the same thing: you could be taught by someone interested in inorganic chemistry or motor racing fuel.
With Computer Games, it looks as if the industry-facing undergraduate degrees are usually acting as cash cows to support research in Games Studies or Computer Science or Communications Studies or whatever. There is little research being done on actual game design and development. This is dangerous: if researchers in a department look on Computer Games as a cash cow, the rest of the university is entitled to regard the whole department the same way. There does need to be some research that is directly related to what is being taught to undergraduates (and I do mean directly, not "game designers need to know this because I think it's important").
Clearly, this isn't going to be true for all parts of a degree. Undergraduates do need to be taught the basics. No-one researches anatomy any more, but medical schools nevertheless have to teach it because so much else depends on knowledge of it. It's the same with games. Some of it is going to be bookwork. However, it shouldn't be the case that the entire degree is bookwork. The students need to know what it is they're doing, why they're doing it, and how this can help them make better games. Diving into a Deleuzean reading of a stealth game week after week isn't going to achieve that.
My own view is that this focus on industry is going to have to change as undergraduate games degrees mature. At the moment, it's putting the cart before the horse. If you ask industry what skills it wants people to have, you'll end up training students to have those skills; if you ask industry what people it wants, you'll end up educating students to be those people. The latter is much better for industry, students and researchers alike.
With research, well if it aligns, great! I'm not complaining that both AI research and the games industry are benefitting from looking at AI for games (or even games for AI). My concern isn't that research doesn't serve the needs of industry, it's that it doesn't serve the needs of undergraduate teaching. Students at research universities should be being taught the latest techniques because that's what the person teaching them is researching. Instead, they're taught the latest techniques that there's a book for, written by someone teaching other students elsewhere.
I want better games. The current way that Computer Games degrees are generally set up is not going to lead to that. This is why we games academics all need to talk to one another.
Copyright © 2018 Richard Bartle (firstname.lastname@example.org).