Programming Language



In order for computers to accept commands from humans and perform tasks vital to productivity and e-commerce, a means of communication must exist. Programming languages provide this necessary link between human and machine. Because they are quite simple compared to human languages, rarely containing more than a few hundred distinct words, programming languages must contain very specific instructions. More than 2,000 different programming languages exist, although most programs are written in one of a handful of popular languages, such as BASIC, COBOL, C++, or Java. Each programming language has its own strengths and weaknesses, and the suitability of one language over another varies with the kind of program being written, the computer it will run on, the experience of the programmer, and the way in which the program will be used.

At the most basic level, computer hardware is controlled through machine language, consisting of numbers (mainly zeros and ones). Immediately above machine languages are assembly languages, which use mnemonic names instead of numbers to represent instructions. This level of language is the lowest a programmer is likely to see. Special programs known as assemblers take assembly code and translate it into the machine language used by a computer's hardware. Although they are not numeric, assembly languages have several disadvantages: they are hard to understand, and each kind of central processing unit (CPU) has its own form of assembly language, so assembly code written for one machine rarely runs on another. In the early 2000s, programs usually were not written directly in assembly language. Rather, it was used by experienced programmers to work on critical parts of computer programs.
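The translation an assembler performs can be illustrated with a toy sketch. The instruction set below is entirely hypothetical, not any real CPU's, and real assemblers handle labels, addressing modes, and binary output formats; the point is only that each mnemonic maps to a numeric opcode.

```python
# Toy "assembler": maps mnemonic instruction names to the numeric
# opcodes a (hypothetical) CPU would actually execute.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(source):
    """Translate lines like 'LOAD 10' into a flat list of numbers."""
    machine_code = []
    for line in source.strip().splitlines():
        parts = line.split()
        mnemonic, operands = parts[0], parts[1:]
        machine_code.append(OPCODES[mnemonic])          # opcode byte
        machine_code.extend(int(op) for op in operands)  # operand bytes
    return machine_code

program = """
LOAD 10
ADD 32
STORE 10
HALT
"""
print(assemble(program))  # [1, 10, 2, 32, 3, 10, 255]
```

Because the `OPCODES` table is specific to one imaginary CPU, the same sketch also shows why assembly language is machine-specific: a different processor would need a different table.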

When people refer to programming languages, they normally mean one of many different kinds of high-level languages or fourth-generation languages that reside above the level of assembly language. Unlike machine and assembly languages, high-level languages resemble human grammar and syntax more closely, and are often portable across different operating systems and machines. Three programming languages were instrumental in opening the lines of communication between programmers and computers. FORTRAN, COBOL, and ALGOL were created in the 1950s, and many variations of these languages were still in use during the early 2000s.

Once a program is written in a high-level language, a program called a compiler or an interpreter is used to convert it to a computer's specific machine language, much like an assembler converts assembly code into machine language. As explained by Daniel Appleman in his book How Computer Programming Works, "A compiler translates an entire program into machine language. Once translated, a program can execute by itself. An interpreter reads your program source code and performs the operations specified without actually translating the code into machine language. The interpreter program executes the program you create, so your program always requires the interpreter."
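Appleman's distinction can be made concrete with a minimal interpreter sketch. The three-word language below is invented for illustration; what matters is that each statement is performed immediately as it is read, and nothing is ever translated into machine code, so the program cannot run without the interpreter.

```python
# Minimal interpreter for a tiny hypothetical language with two
# statements: "SET name value" and "ADD name value". Each line is
# executed directly as it is read; no machine code is produced.
def interpret(source):
    variables = {}
    for line in source.strip().splitlines():
        op, name, value = line.split()
        if op == "SET":        # SET x 5 -> assign a literal
            variables[name] = int(value)
        elif op == "ADD":      # ADD x 3 -> add to an existing variable
            variables[name] += int(value)
    return variables

print(interpret("SET x 5\nADD x 3"))  # {'x': 8}
```

A compiler for the same toy language would instead emit an equivalent stand-alone program (for example, a sequence of machine instructions), which could then run on its own without the translator present.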


According to Computer Languages, Konrad Zuse created the very first high-level language in 1945. During World War II, Zuse, who had previously constructed several basic, general-purpose computers in his parents' apartment, fled Berlin for the Bavarian Alps, where he lived as a refugee. The programming language Zuse created, Plankalkül, translates from German as "plan calculus." In theory, this computer language could be applied to a wide variety of computing problems. Unlike the computers that existed at the time, Zuse's language relied not on decimal but on binary notation.

Zuse's programming language was never adopted for widespread use on actual computers. FORTRAN and COBOL were the first high-level programming languages to make an impact on the field of computer science. Along with assembly language, these two high-level languages have led to or influenced the development of many modern programming languages, including Java, C++, and BASIC.

FORTRAN (FORmula TRANslation), released in 1957 after a three-year development period, is well suited to math, science, and engineering programs because of its ability to perform numeric computations. The language was developed in New York by IBM's John Backus. At the time, IBM was trying to make computers more user-friendly in an effort to increase sales. FORTRAN achieved this goal because it could be learned in a short period of time and required no previous computer knowledge. It eliminated the need for engineers, scientists, and other users to rely on assembly programmers in order to communicate with computers. Although FORTRAN is often referred to as a language of the past, computer science students were still taught the language in the early 2000s for historical reasons, and because FORTRAN code still exists in some applications.

COBOL (COmmon Business Oriented Language) was released in April of 1959 and has been updated several times since then. Shortly after the introduction of FORTRAN, users from different fields, including academia and manufacturing, convened at the University of Pennsylvania to discuss the need for a standardized business language that could be used on a wide variety of computers. The eventual result was COBOL, a programming language well suited to creating business applications. COBOL's strength lies in processing data and in its simplicity. Because the language is readable and easy to understand, it is difficult to hide malicious or destructive computer code within a COBOL program, and programming errors are easy to spot.

In the early 2000s, COBOL was a frequently discussed topic in e-commerce circles. Many companies sought to allow customers to access data on mainframe computers running COBOL programs. Finding ways to enable COBOL to interface with hypertext markup language (HTML), which is used to create pages on the World Wide Web, became important.


Appleman, Daniel. How Computer Programming Works. Berkeley: Apress, 2000.

Computer Languages. Alexandria: Time-Life Books, 1986.

Hansen, Augie. C Programming. New York: Addison-Wesley Publishing Co., Inc., 1989.

Radcliff, Deborah. "Moving COBOL to the Web Safely." Computerworld, May 1, 2000.

"Programming Language." Ecommerce Webopedia, March 12, 2001. Available from

"Programming Language." Techencyclopedia, March 12, 2001. Available from


programming language


programming language A notation for the precise description of computer programs or algorithms. Programming languages are artificial languages, in which the syntax and semantics are strictly defined. Thus while they serve their purpose they do not permit the freedom of expression that is characteristic of a natural language.
