Title: Thoughts on ‘Computational Thinking’
Author: Bob Schmidt
Date: Sat, 02 May 2020 18:17:19 +01:00
Summary: Silas S. Brown considers the drawbacks of skill assessments.
Body:
I met a PhD student from Cambridge University’s Faculty of Education who is engaged in an ‘educational psychology’ project in collaboration with Cambridge Assessment (a university-affiliated company that runs rather a lot of school examinations), and has been gathering the viewpoints of computing teachers and anyone with a computing background, with a view to developing a method of assessing a skill called ‘computational thinking’, that is, the ability to think about a problem in such a way as to be able to program a computer to solve it.
Frankly, I am concerned about what Cambridge Assessment is going to do when they think they have a method of assessing computational thinking. Will they believe that this abstract skill called ‘computational thinking’ can be measured in a child before they learn any computer science, to find out whether it’s likely to be worth teaching them or not? If so, the worst that can happen is that it will become like the ‘Eleven Plus’ examination, feared nationwide throughout the mid-twentieth century, segregating children into streams of apparent ability before they have had the chance to demonstrate how they will really develop, setting up late developers for further failure and generally increasing the gap between the haves and the have-nots. (I do have a slight bias here as my grandmother taught children who failed their Eleven Plus.) If the assessment ends up measuring ‘people who are good at passing exams’ instead of anything else, then it has failed, and I’d hate to see it used to screen children out of the opportunity to learn how to program.
In the early 1990s, my parents saw an advertisement for a ‘high IQ’ club called Mensa and thought I might be clever enough to join. For this, I had to take a test to measure my ‘IQ’, and the school kindly arranged to supervise it. The exact nature of how Mensa tests are graded has probably changed over the years, so I don’t know if it’s the same today, but, at the time, I recall they said anyone who scored an IQ of 140 or above can join a special elite inner circle of the club, and I only got 139. (I might now be misremembering the exact figures, but I do recall that I very narrowly missed out on the elite membership. I could still get a normal membership, and I did participate for a year, but I didn’t feel it was delivering enough value to justify the renewal cost to my low-income family.) But the reason why I’m telling the story now is this: Two or three weeks after I took that Mensa test, I happened to learn in a maths class how to solve simple simultaneous equations, and I immediately realised that this was the tool I was missing for one of the Mensa questions I couldn’t answer. If only I’d taken the test after that maths class instead of before it, I might have scored that extra mark and got the elite membership. And then I realised the IQ test was flawed. It was supposed to be measuring something called ‘intelligence’, which we are told is an intrinsic quality that does not depend on what you know or what you’ve been taught, but here I had evidence that the result can indeed be influenced by what the test-taker has or hasn’t been taught in the run-up to the test. Perhaps a child with an IQ of 200 who did not know simultaneous equations could re-invent them on the spot and get the mark anyway, but down here in the 130s it clearly did make a difference what I had or hadn’t been taught, so at the very least I had discovered the IQ mark has error bars we hadn’t been told about, which makes it less valuable than the advertising suggested, and if you set sharp cut-offs then the fate of borderline cases might be more random than you think.
And this shows what I am afraid of with regard to the idea of measuring computational thinking. I won’t say ‘there is no such thing as computational thinking’ any more than I will say ‘there is no such thing as intelligence’, but in both cases I don’t think we can reasonably expect to reduce it to a single number and measure it to three significant figures, nor do I think it possible for an assessment to separate off skills from knowledge. If there is indeed such a thing as an intrinsic level of skill you have before you learn anything, then I wouldn’t expect any test to detect the difference between someone who’s had a ‘head start’ by virtue of having more of this skill, versus someone who had less to begin with but worked harder to make up the difference.
How do we know the thing we’re trying to measure has only one dimension? Someone might be better at some aspects but not so good at other aspects. Imagine trying to assess every tool in a carpenter’s toolbox: does a hammer score more than a drill? Perhaps it does if we specify that the job involves banging in nails, but if we don’t yet know about the job then the comparison starts to lose meaning. Or, as it’s useful for all programmers to have at least some level of skill outside their main speciality, perhaps the comparison should be between two different Swiss army knives, one of which happens to have a really good blade, versus another that happens to have a really good corkscrew. Yes, you could give them overall scores, but a higher score given to the one with the good blade might not help you so much when you have to open bottles. My personal weak points include so-called ‘front-end’ programming: yes I can just about do it, but I very much prefer it if somebody else does that while I fill in the back-end code that does the actual processing. What does it mean to rank me higher or lower than a front-end programmer? True, all good tools have something in common (they’re well-made, they don’t fall apart as soon as you start to use them) and it might be possible to measure the programmer’s equivalent of that (for example, the ability to act professionally in our craft), but in trying to measure this, we have to be very careful not to confuse it with the particular strong points that will manifest differently in different cases.
With regard to learning and training, we all know there exist different teaching methods, and, while some learners might be so good that they’re going to learn anyway no matter how badly it’s taught, there are other cases where the quality of teaching is going to make a difference, and there might even be different methods that are suitable for different people. I have seen this even at university level, when I have met a student who hasn’t understood what a lecturer said but finally ‘gets it’ when I explain it differently (am I better than the lecturer? No, I’m just different, which is what that particular student needed), so students’ performance at assessment might at least partly depend on whether there happened to be a good match between their best learning style and the teaching styles they happened to encounter. That might be OK as long as we realise that’s what we’re getting and we don’t fool ourselves into thinking we’re measuring some kind of innate ability that’s completely removed from experience.
If there is a good predictor of whether a child will do well in computing, then if my experience is anything to go by I would suggest it may have a lot to do with how good that child is at patience and at seeing value in small things. I’ve lost count of the number of times someone has asked me to ‘teach them programming’ and then backed off when they realised they wouldn’t be able to write the next blockbuster video game on a one-hour crash course. I, on the other hand, got left in the public library as a child (to save on heating bills at home), found a book that showed me the workings of a ‘half-adder’ (an essential part of the arithmetic/logic unit in processors), and drew out on paper the circuit diagram for a full adder (I think I even made it a 32-bit one although it was still the 8-bit era), saying one day I’d figure out how to build this so I could have a computer to use (just the adder by itself wouldn’t have been very useful, but I saw it as a good start). I salvaged circuit boards from the rubbish skip at the local telephone exchange and hoarded them, thinking I’d eventually find out how to get the transistors off and build a processor (although family members didn’t think so and threw them away). I would spend hours writing out simple programs on paper, which my family tended to throw away as rubbish until a grandparent showed one to a local computer repairman who said it made sense. At one point, I could hardly look at any printed page without thinking about how the word-processor had wrapped the lines and how the print-head had moved over the paper, responding to the print-driver’s commands to put down the dots one small group at a time (if you looked carefully you might occasionally have seen me trace how I thought the print-head was programmed to move). Others called me crazy, but I saw value in such details, not taking them for granted.
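For readers who haven’t met the circuit, here is a minimal sketch of that standard construction, written in C++ rather than drawn in gates, and not taken from any particular book: a half-adder combines two bits, a full adder is two half-adders plus an OR for the carries, and chaining one full adder per bit gives a ripple-carry adder.

    #include <cstdio>

    // A half-adder combines two bits into a sum bit and a carry bit.
    void half_adder(bool a, bool b, bool &sum, bool &carry) {
        sum   = a ^ b;   // XOR: 1 when exactly one input is 1
        carry = a && b;  // AND: 1 only when both inputs are 1
    }

    // A full adder chains two half-adders (plus an OR for the carries)
    // so it can also accept a carry coming in from the previous bit.
    void full_adder(bool a, bool b, bool carry_in, bool &sum, bool &carry_out) {
        bool s1, c1, c2;
        half_adder(a, b, s1, c1);
        half_adder(s1, carry_in, sum, c2);
        carry_out = c1 || c2;
    }

    int main() {
        // Ripple-carry addition of two 8-bit numbers, one full adder per bit.
        unsigned char x = 100, y = 27, result = 0;
        bool carry = false;
        for (int bit = 0; bit < 8; ++bit) {
            bool s;
            full_adder((x >> bit) & 1, (y >> bit) & 1, carry, s, carry);
            result |= (unsigned char)(s) << bit;
        }
        std::printf("%d + %d = %d\n", x, y, result);  // prints 100 + 27 = 127
    }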
When I was about 8 years old, I went on a school trip to the Alum Bay glass workshop on the Isle of Wight, and after demonstrating various aspects of glass-making, they pulled off part of the molten bulb, stretched it into a long thread, let it solidify and gave it to the teacher who broke it up and gave pieces to the children (just a few centimetres each) to see what we’d make of it. I, of course, had read the right library books and knew it was an optical fibre, and I felt like I was holding the very future in my hands. Just make it 3,000 miles longer and put it in the ocean, and we could exchange huge quantities of data with America (this was before the Web had been invented, but I knew something like that would come), or perhaps one day I’d be able to make a fully-optical processor and this will be one of the links. That is, until the child next to me was cruel enough to shatter it and say, “Look how upset he gets about only a bit of glass.” He saw a bit of glass; I saw the future it represented, and I found his lack of respect, his lack of interest, his unwillingness to even hear me tell him why it was special, to be even more upsetting than the loss of my fibre.

I don’t know who’s going to be good at computational thinking, but I’m afraid people who only live in the here-and-now wanting instant gratification, people who don’t see the value in starting with small things, people who break the glass without thinking how it can run an Internet, are going to be less likely candidates unless they mend the error of their ways.
Who hath despised the day of small things?
~ Zechariah 4:10
It is the responsibility of the student to be interested. No one can be interested for you, and no one can increase your interest unless you so will.
~ William H. Armstrong, Study Is Hard Work, 1956
Sadly, children nowadays are less likely to stumble across those books in the junior section of the local library, partly because fewer libraries stock such things and partly because children don’t spend time in libraries. Some of them get given smartphones, each with more computing power than I could shake a stick at, but with no apparent need to program. One rather active child recently asked if I could show him how to ‘make an app’, and I asked if he had the patience to sit still and watch a timer count down from five minutes to zero, which seems a bit mean, but he’ll need more patience than that if I start saying, “Well, you need to learn how to use a text editor, and you need to set up a compiler, and to spend time learning about some basic language constructs, and learn how to browse the class-library documentation, and some principles about adaptive display layouts, and top-down design, and principles of user interaction, not to mention meeting the acceptance criteria for the Store”, etc.
When I first got access to a real computer (as opposed to paper ones), it was a BBC Micro at school, which booted into a BASIC interpreter: there are better languages to learn with, but at least that environment had a reasonable balance of letting you get into programming quickly while also showing that you can’t expect to build Rome in a day. Nowadays, computers often don’t come with a programming language at all, and, while Rome can be downloaded in seconds, the effort required to build it is not apparent and potential learners are ill-prepared for the shock of what they might be up against. Not that I want to discourage anyone, but they should realise this mountain means serious climbing, not just riding to the top in the tourist train. One organisation that is doing something about it is Cambridge’s very own Raspberry Pi Foundation (which seems to have succeeded more than MIT’s One Laptop Per Child project did a few years previously), putting ‘properly programmable’ computers into the hands of children and schools, although the take-up still has some way to go. MIT’s Scratch language is also encouraging, as despite its limitations it is being used in schools to introduce many children to the idea of programming (probably not the way I would have done it, but it’s something). Those who get somewhere with such things might be good candidates for further instruction, although I wouldn’t like to categorically rule out those who don’t.
A more advanced skill, which might need some practice, is that of looking at a set of instructions and figuring out how to ‘break’ them. Many instances of what we call ‘bugs’ are due to some programmer not completely thinking through all the possible branches their code could take. Back when I was a child in that public library, I also found a book that used an imaginary toy robot as a tool to introduce the idea of flowcharts (although I wouldn’t recommend using flowcharts in learning these days), and it didn’t take me long to realise there was a certain set of inputs the book’s authors probably weren’t expecting, which would send their robot into an infinite loop. The ability to think of such things can be a valuable asset, because you can then go back and fix your code to avoid that problem (it’s especially important in security, when you’re up against somebody who’s not merely being ‘stupid’ but is deliberately looking for ways to break what you’ve done), and the software ecosystem in general could be better if more programmers thought ‘what if this silly thing happens’ before it does.
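As a small illustration of the kind of oversight I mean (my own sketch, not the book’s robot), here is a read-a-menu-choice loop in C++: a naive version would assume every read succeeds, so non-numeric input or end-of-file would leave it spinning forever; the extra checks are exactly the ‘what if this silly thing happens’ handling that such a hypothetical author forgot.

    #include <iostream>

    int main() {
        // Intent: keep asking until the user types a number from 1 to 3.
        // If std::cin >> choice fails (bad text, or end-of-file), choice is
        // never updated, so without the recovery code below the loop would
        // never terminate.
        int choice = 0;
        while (choice < 1 || choice > 3) {
            std::cout << "Pick 1, 2 or 3: ";
            if (!(std::cin >> choice)) {          // the case the naive version forgot
                std::cin.clear();                 // reset the failed stream state
                std::cin.ignore(1024, '\n');      // discard the offending input
                if (std::cin.eof()) return 1;     // give up cleanly on end-of-file
            }
        }
        std::cout << "You picked " << choice << "\n";
    }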
Another important skill that often gets overlooked (but which again might need some practice) is the simple realisation that a symbol or a name might carry a meaning that’s different from the one you’ve used before. Take, for example, the humble assignment operator, which in most programming languages is an equals sign (=). In algebra, if you see x = 5 then it’s reasonable to take that as a statement of truth: for the purposes of this question or discussion, x stands for 5, and it always has done and always will do. Then we move over to coding, and we find x = 5 is now an instruction, performed by a computer at a certain point in time, meaning ‘make x equal to 5’. So at times before the instruction takes place, x may or may not have been equal to 5, but at times after the instruction, x will be equal to 5, until some other instruction comes along and changes the value of x to something else. The ability of a student to get that flow of time into their head, instead of viewing x = 5 as a statement of truth as they’ve been taught to do in algebra, is a surprisingly good predictor of how well they’re likely to get on with the rest of programming, but I certainly hope it is a skill that can be taught rather than an innate tendency. But it gets worse: what if we see code that says if x=5? In some programming languages, this changes the meaning of ‘x=5’ yet again, this time to mean ‘test the value of x at this point in time and give me TRUE if it’s equal to 5 or FALSE if it is not’. In other programming languages, if x = 5 is an error (which may or may not be pointed out by the computer) and if you wanted the ‘test’ meaning, you should have written x==5 with two equals signs. You can argue the notation is badly designed, but coping with badly-designed systems is unfortunately one of the things we have to do. And what about variable names? Ideally, they should be well thought-out, but how often have you seen code using variable names that do not (at least to you) convey accurately the meaning of what is really stored in them? At some point, every learner has to grapple with the ‘rose by any other name’ idea and realise that the thing another programmer called X might be different from what you would call X yourself. Context is important, and although I wouldn’t call the understanding of this a fundamental part of computational thinking, it’s certainly a necessary hurdle to overcome unless you have the luxury of designing the language yourself from the ground up.
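For the sake of illustration, here is how those different meanings of the equals sign play out in C++, one of the languages where a lone = inside a condition is accepted but is almost never what you meant:

    #include <iostream>

    int main() {
        int x = 3;          // assignment: from this point on, x holds 3
        x = 5;              // another assignment: x now holds 5; the 3 is gone

        if (x == 5)         // comparison: test x's value *now*, yielding true or false
            std::cout << "x is 5\n";

        // The classic slip: a single '=' inside the condition is still legal C++,
        // but it assigns 7 to x and then tests the assigned value (7, i.e. true),
        // so this branch is always taken.  Many compilers will warn about it.
        if (x = 7)
            std::cout << "oops: this always runs, and x is now " << x << "\n";
    }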
In the long term, the skills that make for good computational thinking might change. Right now, practically all computers you will program are based on the von Neumann architecture (processor, memory, etc.), and, at its most fundamental level, programming is the ability to ask yourself the question ‘If I were the processor, what should I do next?’ and to write down each step. (The ability to write out instructions for another human who is determined to follow them as ‘stupidly’ as possible might be a good start, since it gets you used to specifying everything and not leaving anything to ‘common sense’, which we can’t expect computers to have.) Nowadays we have abstractions, like function libraries, which can make the steps bigger, meaning you won’t have to write out the instructions in as much detail as you would if you didn’t have some common sub-tasks pre-defined by the library, but on some level every programmer is going to have to think ‘What should I do next if I were the processor?’ and write it down. This will probably be the case for years to come. But there are two areas of research that just might have the potential to change it a bit: massively parallel processors, and quantum computation (although I’m not so sure about the latter). Traditional computer programmers find the programming of parallel processors to be more difficult: it’s far easier to think ‘What should I do if I were the one-step-at-a-time processor?’ and come up with a series of steps in order, than it is to think ‘What should I do if I were this huge collection of networked processing nodes?’; much research in parallelism goes into making it easier for old-style programmers to cope without having to hold too much awareness of all the parallelism that’s going on underneath. But could there be some child out there somewhere with an altogether different way of thinking that just happens to suit parallel processing better? If so, they just might be the future, and I wouldn’t want to mark them down too much just because they find our old-fashioned sequential stuff too hard. If an assessment is introduced then I would like it to take an approach that is as broad as possible, not limiting itself to today’s programming languages or design methods.
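As a very rough sketch of that contrast (using only the standard C++ library, and glossing over everything that makes real parallel programming hard), here is the same sum written first as a one-step-at-a-time loop and then carved up across a few threads, each worker summing its own slice before the partial results are combined:

    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        std::vector<long long> data(1000000);
        std::iota(data.begin(), data.end(), 1LL);   // 1, 2, 3, ... 1000000

        // Sequential thinking: "if I were the processor, what do I do next?"
        long long serial = 0;
        for (long long v : data) serial += v;

        // Parallel thinking: split the work into independent pieces, let each
        // worker sum its own slice, then combine the partial results.
        const std::size_t workers = 4;
        std::vector<long long> partial(workers, 0);
        std::vector<std::thread> pool;
        const std::size_t chunk = data.size() / workers;
        for (std::size_t w = 0; w < workers; ++w) {
            pool.emplace_back([&, w] {
                std::size_t begin = w * chunk;
                std::size_t end   = (w + 1 == workers) ? data.size() : begin + chunk;
                for (std::size_t i = begin; i < end; ++i) partial[w] += data[i];
            });
        }
        for (auto &t : pool) t.join();
        long long parallel = std::accumulate(partial.begin(), partial.end(), 0LL);

        std::cout << serial << " == " << parallel << "\n";   // both 500000500000
    }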
And regarding the idea of innate skills that are separate from what we learn, I am concerned that genetic traits in general are overrated. (It was the idea behind Nazi eugenics after all.) Suppose we turn it around and imagine a bad trait: suppose somebody comes to you and says ‘I can’t help being a serial killer, my doctor says it’s in my genes’. Although there probably isn’t a ‘serial killer’ gene, there may well be inheritable tendencies to assume a certain (mis-)balance of emotional states that make somebody more likely to become a killer. But that doesn’t mean such a person ‘can’t help it’. It just means they might have to put in more effort to control themselves than is required by the rest of us. I personally do not get an urge to kill somebody every time they do something I don’t like (which is just as well, otherwise I’d be going for all the smokers and Radio 1 listeners for a start), but if somebody out there does get such an urge yet successfully controls it, I admire them for having overcome what genetics threw at them and refusing to be a victim of their predispositions. Conversely, if someone’s genes predispose them to do well at a certain skill, that does not mean others don’t stand a chance: the one with the gene may have a head start, but it’s surprising how quickly you can lose a head start if others practise and you don’t. If we could measure just the predisposition, that measurement wouldn’t tell us anything, because a person’s final skills and characteristics will be made up not only of their inherited abilities but also of how they have learned to control and use them – and the second is far more important than the first.
Silas S. Brown is a partially-sighted Computer Science post-doc in Cambridge who currently works in part-time assistant tuition and part-time for Oracle. He has been an ACCU member since 1994.