Journal Articles: CVu 121
Title: Is there a "best language"?
Author: Administrator
Date: Mon, 03 January 2000 13:15:34 +00:00
"If you think C++ is so good, how come it can't do graphics? Pascal can do graphics. Does C++ need to catch up with Pascal?" - anonymous C++ beginner who knew some Pascal.
I see this view as the wrong way to compare modern programming languages. In the early days of home computers, different machines had different capabilities and so did their built-in languages (usually various flavours of BASIC), so it made sense to speak of, for example, BBC BASIC being better at graphics than some other form of BASIC. Nowadays, however (and even in those early days outside the home computer market), there is a distinction between a language and a library. Any modern general-purpose language can "do" graphics (and sound and anything else) if the appropriate libraries are added. In fact, if C++ can't do graphics then neither can Pascal: that student had been taught Borland Turbo Pascal for DOS, which included the Borland Graphics Interface (BGI) as a library, and the teacher had not made clear the distinction between the Pascal language and the add-on library. The student was therefore led to believe that the graphics functions were part of the Pascal language itself.
Why does the difference matter? To understand this, consider taking a Pascal program written on Borland Turbo Pascal for DOS and compiling it under Windows. Since the BGI only works in DOS, none of the graphics will work anymore. (Actually there is a port of the BGI for Windows, but it is expensive.) And yet the Pascal program will still be a Pascal program. If you know the difference between the language Pascal and the graphics library, you will know what is wrong and be able to change it. But if you think of them all as part of the same thing, you may not.
You may think you only ever want to write programs for one specific platform (operating system), but what happens to your skills when that platform becomes obsolete? Windows will one day go the same way as DOS, and the major "portable" languages will almost certainly outlive it. Also, commercial libraries sometimes become obsolete as their sellers go out of business. There is also a positive reason why you should understand the difference between a language and a library, and that is that you can choose between different libraries to use in the same language. If you don't like something about a library or it is inadequate, you may be able to use a different one that suits your purposes. If you prefer to learn the language and the library as one thing without worrying about the distinction, you will not be able to do this so easily.
The usual way of ensuring that your program will work on more than one library is to isolate the library-specific parts of the program from the rest. Unfortunately, this is not always possible with GUI libraries, since many of them demand that the whole structure of your program revolve around the library's model of the user interface. It is as though, instead of the user interface being added to the program, the program is added to the user interface. Cross-platform development is still possible with cross-platform GUI libraries (such as V, Qt and GDK) and by using libraries that work on a similar basis, but it remains difficult to add completely different types of interface to your program. Furthermore, having to learn details of GUI code, such as event loops, callbacks and message dispatching (to which not every application is suited), may cause the novice to unwittingly spend longer learning about GUIs than learning about programming. Programming will almost certainly last longer than GUIs.
However, people learning to program need gratification, and nowadays they are often not satisfied with being limited to text. I have one friend who wants to start with a simple animation controlled by a real-time dialogue box. Unless somebody writes a much simpler GUI library that does not place demands on the structure of your program, it is inevitable that using such an example will lead to being constrained by whatever GUI library is in use and not getting a chance to experiment with different approaches to design. Instead of learning the language through use of the library, one would be learning the library through use of the language. If that's what she wants, fine, but she asked me to teach her C++, not Qt, MFC or whatever.
The comparison of languages by their commonly available libraries is not new. It was responsible for the longevity of FORTRAN before the Numerical Algorithms Group (NAG) ported their library to C++. People learnt and used FORTRAN simply because it was the only language for which the NAG library was available; their goal was not so much to use FORTRAN as to use the NAG library. Such a comparison would not have been valid if libraries like NAG had been available for several other languages, in the way that today's user-interface libraries are available for many languages. Indeed, some libraries have language-independent APIs (application programming interfaces), which means it doesn't matter which language you use as long as you can call the API from it.
Java is unusual in that its GUI libraries are included as standard, so it theoretically guarantees portability. However, the standard is not static, and some have commented "write once, run anywhere today and nowhere tomorrow", so you have been warned!
If comparing languages by what their runtime libraries offer is so limited, what, then, is the "right" way to compare languages? Is there one?
You have probably heard the view that there is no single "best language", but rather that different languages are suited to different applications. But why? Are there things that some languages can "do" but others cannot?
There is a piece of computation theory that says the answer to that last question is no. It originates with the concept of the Turing machine, a simple theoretical computer that moves along an unbounded tape, reading and writing symbols according to a simple set of rules. Amazingly, the humble Turing machine can emulate (albeit very slowly) every kind of CPU ever invented, and probably every kind that ever will be. The theory behind this (the Church-Turing thesis) is not the kind of statement that can be formally proved, but it is widely accepted, especially given that every other attempt to model our intuitive concept of "computability" has turned out to be provably equivalent to the Turing machine.
One result of this is the equivalence of Turing-powerful languages. A language is Turing-powerful if it can emulate a Turing machine, and can therefore emulate any other such language. Since a Turing machine can be simulated in just about any programming language (and even in some special-purpose description languages like PostScript), it follows that any program written in any language can be emulated in almost any other. You may have to do a lot more work to make the emulation happen, but it can still be done.
Why, then, choose one language over another? Why not simply pick the first language you ever came across and program in it for the rest of your life?
Apart from practical considerations, such as repositories of existing code and the availability of compilers for embedded systems, the main answer to this question is that different things are idiomatic in different languages. Object-oriented design, for example, is more idiomatic in a language that was developed with object-orientation in mind than in one that was not, even though it can be done in both. Functional programming is more idiomatic in a functional programming language, backtracking and clause resolution are more idiomatic in a logic programming language, low-level memory manipulation is considerably more idiomatic in a language with pointers, and so on. If something is more idiomatic in a particular language, then it is easier and less error-prone to write it in that language (assuming you are familiar with the language), and, as with natural languages, the more idioms you know, the easier it will be to express yourself.
Why, then, shouldn't there be a language in which everything is idiomatic? Unfortunately, such a language could not fit all of those idioms into a single coherent framework without compromising some of its features; it would probably amount to several modules written in different languages with APIs between them, and that can be done already.
Unfortunately, in the commercial world the choice of a language is often affected by its marketability, or by the expertise available in a particular company. My friend who wanted to start with animation later implied that she was doing it for auxiliary CV points and was therefore not interested in perfecting her skills so much as in writing eye-catching programs in marketable languages with as little effort as possible. I don't really know what to say to that.