Roughly every decade, two years' worth of students get invited back to my college for an afternoon tea, a service in the chapel, and a drinks reception followed by a formal dinner in hall. In a few days' time it's my year's turn, and it looks to be a good turnout, with around 130 of a possible 200 signed up. (I know this because there's now a web page showing who's coming - a sure sign of the ubiquity of the internet!) The last one was great fun, and it was fascinating to see how people had changed (not much), and to catch up on home and work life. This year won't be quite as surprising, as many people are in touch via LinkedIn or Facebook (which now has an amazing 26m UK users [BBC] - around 40% of the population). But anticipating seeing everyone (and rehearsing my answers to the inevitable 'what are you doing now' questions) got me thinking about how well my education had prepared me for my working life.
The first thing is to give some background about the computer industry and my education. I'd always been brought up with computers in the background, as my father worked as a computer engineer and later a trainer. Visiting the training facilities on the way home from school led to my first programming experience when I was around 10-11, using teletypes and VDUs attached to large VAX clusters. This interest led to me getting one of the first ZX Spectrums as part of the exploding home computing craze. Sure, I played lots of games, but I also did a lot of programming, not just in the built-in BASIC but also in assembler, C, Forth and Pascal. I eventually sold it just before the peak, and didn't get another computer of my own until around 2000.
So how was I using computers during my education? Well, the answer is 'hardly at all'. When I was choosing which O levels to do, I was advised that I'd be able to pass Computing tomorrow, so I didn't take it. I did use one a bit for my Control Technology project, which used a computer-controlled gadget to read multiple-choice question papers. (Amusingly, on the morning of the assessment one of the photo receptors failed, but realising that with my test sheets I could infer its value from the others, I managed to tweak the program to work correctly - an early lesson in data redundancy and error correction.) For my A levels we didn't need computers at all. And in my Maths degree, we had a couple of projects using BBC Micros to solve Schrödinger's equation, and to do numerical analysis, finding iterative solutions while keeping track of errors. While I produced some really nice user interfaces and code structuring, the projects were marked on the mathematical conclusions you came to rather than on the code. And that was it! The main things I took away were the ability to think logically, some idea of how to understand orders of magnitude (useful when it came to understanding the STL's complexity guarantees), and how to estimate numerical errors.
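That trick of inferring a failed reading from the others is the same idea that underlies simple parity-based error correction. The sketch below is purely illustrative (nothing to do with the original project's code, and the sensor layout is invented): if you record a parity bit while all the sensors are working, you can later reconstruct any single failed reading.

```python
def parity(bits):
    """XOR of all bits: 0 if an even number are set, 1 otherwise."""
    result = 0
    for b in bits:
        result ^= b
    return result

def recover(readings, parity_bit):
    """Given readings where exactly one is unknown (None),
    reconstruct the missing value from the stored parity bit."""
    missing = readings.index(None)
    known = [b for b in readings if b is not None]
    fixed = list(readings)
    fixed[missing] = parity(known) ^ parity_bit
    return fixed

# A row of five sensor readings, with parity recorded while all worked:
original = [1, 0, 1, 1, 0]
p = parity(original)

# One sensor fails; its value can still be inferred from the rest:
damaged = [1, 0, None, 1, 0]
print(recover(damaged, p))  # [1, 0, 1, 1, 0]
```

The same principle, scaled up, gives you RAID parity and Hamming codes - redundancy lets you survive the loss of any one piece of data.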
Thankfully I was getting some proper experience elsewhere - to top up my finances I'd got a sponsorship from the Ministry of Defence, and spent the summers in Malvern at the Royal Signals and Radar Establishment (now part of QinetiQ) doing statistical analysis of aircraft trajectories to better understand the consequences of a collision avoidance system [TCAS]. This was on VAXes, using Pascal and various maths libraries, with LaTeX for creating the reports, and I learnt a lot about optimisation and code structuring. It was probably this experience that allowed me to get my first job in software.
So would I have been better prepared if I'd taken a Computing degree? Very likely, although at the time the course was much more theoretical, as it was harder to get access to computing resources. Also, the whole idea of Software Engineering was relatively young. Nowadays it's pretty much a given that everyone has their own laptop, so courses are more hands-on and relevant (as an example, here's Cambridge's lecture and project list [Cambridge]), although it's still hard for people to get 'real-world' experience on projects of a decent size. This is inevitable and must be taken into account when hiring graduates: make sure you understand their strengths and weaknesses (basically, they've done a bit of everything but in not much depth, and will not have much idea of the software life cycle, nor of projects of any size) so you can design their early projects and experiences to get them up to speed, perhaps using some mentoring/apprentice model.
So what sort of education, formal or otherwise, have I done over the years while I've been working? I've never had any formal training that led to a qualification I could put on a CV. Some do exist, and some sectors are keener on seeing such things than others. One of the problems is that computing evolves so quickly that putting together a formal syllabus can lead to it being out of date almost immediately. Quite often such qualifications are pushed by the larger companies as a way of promoting their technologies, such as MCSE [Microsoft] or SCP [Sun]. These tend to be more up to date, and can be useful if you need to specialise in that area. I have attended some training courses, and you do get some sort of certificate for attending, or even for taking some sort of exam, but I'm a bit more dubious about the worth of many of them - a few days' study will get you up to speed on the basics, but doesn't tell you much about how well you can put things into practice, or whether you've progressed to becoming proficient, or even expert. Those things take time, experience, and learning from the experiences of others.
It is in this last area that organisations like ACCU really shine. By bringing together a disparate group of people all eager to learn, you can get great insights that would otherwise have taken you years to gain (if you gained them at all). Even if it's just planting the seed of an idea, when a future problem needs tackling quite often these ideas will pop up and you have a possible avenue to explore. Quite often, though, you learn something of immediate use, which will improve your ability to produce good quality, good value software.
Of course this is just my experience, and I'm very aware that my career is very different from other people's. In particular, I've tended to work mainly for small companies, often start-ups with small customer bases and a small amount of largely new code that has to be produced quickly without letting quality fall. In such an environment, broad experience and adaptability tend to be prized. It would not surprise me if, in other situations (perhaps something like financial modelling for a large bank), long-term qualifications showing real depth in a specialised area were favoured instead. And yet both extremes have in common that continuous learning, whether via books, magazines, conferences, or formal courses, is the only way to improve and keep your skills current.
The wisdom of crowds?
A few issues ago I mentioned this book by James Surowiecki [Wisdom], which discusses how large groups of people can be better at certain tasks than individuals, even experts. He wisely also discusses when such ideas are not applicable, and can lead to sub-optimal solutions. Interestingly, there have been some great examples recently, which quite often rely on computing, in particular social media. One odd one was BP asking for ideas to help cap the deep-water oil leak in the Gulf of Mexico. However, this wasn't as crazy an idea as it first seemed: they were dealing with a problem that no one had had to face before, so casting around for ideas could inspire alternative approaches in case the main ones failed. The main problem with it was more political - it made it look like they didn't have any ideas of their own, and were desperate.
This is also a risk with more recent examples launched by the new government, asking people to suggest which laws could be repealed [YourFreedom] and areas for spending cuts [SpendingChallenge]. A neat idea in theory, not dissimilar to an online suggestions box, but implementing it as a Web 2.0-style social forum meant that the loudest voices with a grudge tended to dominate the discussions and suggestions, which has the danger of drowning out the more interesting ideas. This is almost inevitable - by making it so open, public, and interactive (for perfectly laudable reasons), it breaks some of the criteria for getting a Wise Crowd. Hopefully someone will take the time to sift through all the ideas to find the interesting, unexpected ones, rather than the obvious ones with a large populist backing. It does show that social webs are excellent at allowing people to seek out and interact with like-minded people, but are not as good at getting a balance of opinion, as they are all too easily susceptible to group-think and self-selection. While such uses of technology can be powerful, you do need to understand whether the dynamics of the resulting system match your needs.
References
[BBC] http://www.bbc.co.uk/news/technology-10713199
[Cambridge] http://www.cl.cam.ac.uk/teaching/0910/CST/
[Microsoft] http://www.microsoft.com/learning/en/us/certification/mcse.aspx
[SpendingChallenge] http://spendingchallenge.hm-treasury.gov.uk/
[Sun] http://en.wikipedia.org/wiki/Sun_Certified_Professional
[TCAS] http://en.wikipedia.org/wiki/Traffic_collision_avoidance_system
[Wisdom] http://en.wikipedia.org/wiki/The_Wisdom_of_Crowds
[YourFreedom] http://yourfreedom.hmg.gov.uk/
Overload Journal #98 - August 2010 - Journal Editorial