I wanted to use my first blog post as an opportunity to talk about something close to my heart: computer science in my home country of England (part of the United Kingdom of Great Britain and Northern Ireland). This article may explain some feelings you’ve had in recent years about the lack of programmers in Britain.
I: Wooden Computers Be Good
Great Britain has been very influential in the video games industry over the past 30 years. I was lucky enough to have lived through it, ever since I first heard of Apple (& Tangerine), Texas Instruments, Acorn Computers, Sinclair Research and, for my sins, the EACA Video Genie (a clone of the Tandy TRS-80). In the 1980s we had a massive push from government to get computers into schools and into homes. Cheap hardware such as the BBC Micro, instructive television programmes and a global fascination with movies about a future in outer space ensured Great Britain would be a leader in entertainment technology. Let’s jump forward 15 years, or so.
II: Computational theory
Over the past 10 years I have interviewed many people for programmer jobs at the games development companies I have worked for. I’ve hired some great men and women into those roles and I’m happy to say they’ve all worked out really well. However, about 5 years ago I found that applicants’ skills were changing and becoming harder to match to those I expected. The knowledge programmers come out of college and university with is so different to what it was 10 years ago.
Back then, programmers knew all the low level stuff: assembler, bit twiddling, fixed point vs. floating point arithmetic, memory cache usage, branch prediction, sorting… to name a few. I can’t ask these questions anymore as I always get blank looks (try it!).
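To give a flavour of the kind of question I mean, here is a classic bit-twiddling exercise, sketched in Python purely for illustration (in an interview it would more likely be asked in C or assembler): count the set bits in an integer without a library call, using Kernighan’s trick.

```python
def popcount(x: int) -> int:
    """Count set bits by repeatedly clearing the lowest set bit.

    Kernighan's trick: x & (x - 1) clears the lowest 1 bit,
    so the loop runs once per set bit rather than once per bit.
    """
    count = 0
    while x:
        x &= x - 1
        count += 1
    return count

print(popcount(0b1011))  # 3
```

A candidate who knows this sort of thing usually also knows why it matters: the loop cost tracks the data, not the word size.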
III: The abandonment of theory for practice
About 4 years ago I moved jobs. I went from a small independent company to a much larger one which had a lot of links with education. This link was one of the factors that drew me to the company. Thankfully it is not the only such company, but back then there weren’t many.
It became evident after hearing their experiences that games development courses were poorly structured. They were focusing on new media, i.e. the web, and they were lumping game development into that group. As a result we have been pushing these courses to teach C++ if they want to attract students aiming for jobs in console game development [SkillSet].
Unfortunately it doesn’t end there. About 2 years ago there was a noticeable drop in the number of students coming out of university with computer science or similar degrees (a 57% fall in A Level Computing between 2001 and 2009). It wasn’t that universities were teaching the wrong material any more, but that there weren’t the students going into computer science degrees. This was partly the reason that universities had started dumbing down their courses in the preceding years: the intake was waning and they had to attract more students. “Bums on seats == £££”
So what was happening? Unfortunately for the British education system, our government had listened to big business and corporations when setting education policy. Computer Studies (the study of how computers work, and a course that I took at school in the 1980s) was replaced with ICT (the study of how computers can be used). ICT should never have been seen as a replacement, as it teaches children how to use a word processor, a spreadsheet, a database, a web browser, an art package, etc. (Of course, you can all guess which particular brand of word processor, etc. they were being taught.) The teachers could follow a curriculum and they could mark how well children understood these tools. Everything seemed good.
But there was no study of computer science. From my investigations, the study of computer science for British children ends at about age 11. They do a bit of turtle programming, instruction sequences, and traffic lights, then it’s on to Microsoft Office. They could have moved on to type theory, the model-view-controller pattern, or more deeply into sequence-selection-iteration, with the aim of finding those interested in the science. Instead, children spend 6 years clicking boxes and dragging rectangles and are then asked at 17: “So do you fancy doing more of that at university on a computer science course?” What do you think they say?
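To make the missed opportunity concrete: the sequence-selection-iteration ideas children meet at 11 are exactly the building blocks of the traffic-light exercise, and extending them costs very little. A minimal sketch (my own, not from any curriculum), in Python:

```python
# Sequence: statements and data defined in order.
COLOURS = ["red", "red+amber", "green", "amber"]

def next_colour(current: str) -> str:
    # Selection: choose behaviour based on a condition.
    if current not in COLOURS:
        raise ValueError("unknown light state")
    # The cycle wraps around using modular arithmetic.
    return COLOURS[(COLOURS.index(current) + 1) % len(COLOURS)]

# Iteration: repeat the step to walk the full light cycle.
light = "red"
for _ in range(4):
    print(light)
    light = next_colour(light)
```

A child who can follow this is one small step from state machines, and from there the real science opens up; instead the subject stops dead.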
IV: A New Hope
The Livingstone-Hope report (Next-Gen.) of last winter made recommendations on the skills we need to encourage to maintain our creative and technical excellence. Google’s chairman Eric Schmidt has criticised how the British, who invented many of the advances in computer software and hardware, have failed to inspire their future technology innovators. The Computing at School working group is supporting and promoting a positive vision of computing in UK schools.
I’ve been following all of these groups, initiatives and reviews over the last couple of years and, as an active member of CAS, I can see a definite change in how children between the ages of 11 and 18 are going to be taught. I am very excited about the prospect of this and what it will mean for computer science. After a hiatus in growing a new generation of Britain’s programmers, I hope the next decade will be a very exciting one for computer science across the country.
Even tonight (Monday 10 October 2011) I hear that the BBC Newsnight programme has a segment on exactly the topic of this blog post. David Braben will be showing the Raspberry Pi, which could be a new BBC Micro for this decade, although I’ve also just read about a project for a BBC Micro 2.0! Finally, there is the Behind the Screen project, which is running a trial computer science curriculum in schools.
We can increase the number of talented programmers in Britain but it clearly won’t happen with our current system. And the worst thing is,
we had the answer back in 1982. Wooden computers be good.
/end of line