programming language

programming language, the syntax, grammar, and symbols or words used to give instructions to a computer.

Development of Low-Level Languages

All computers operate by following machine language programs: long sequences of instructions, called machine code, that are addressed to the hardware of the computer and are written in binary notation (see numeration), which uses only the digits 1 and 0. First-generation languages, called machine languages, required the writing of long strings of binary numbers to represent such operations as "add," "subtract," and "compare." Later improvements allowed octal, decimal, or hexadecimal representation of the binary strings.

Because writing programs in machine language is impractical (it is tedious and error prone), symbolic, or assembly, languages—second-generation languages—were introduced in the early 1950s. They use simple mnemonics such as A for "add" or M for "multiply," which a computer program called an assembler translates into machine language. An extension of such a language is the macro instruction, a mnemonic (such as "READ") for which the assembler substitutes a series of simpler mnemonics. The resulting machine language programs, however, are specific to one type of computer and will usually not run on a computer with a different type of central processing unit (CPU).
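
To make the contrast concrete, the short sketch below pairs a single high-level statement (written here in C) with the kind of mnemonics an assembler works from. The mnemonics, operand names, and encoding described in the comments are invented for a hypothetical accumulator-style machine, not taken from any real instruction set.

    /* Illustrative only: the assembly and machine code in the comments
     * are for a hypothetical accumulator-style CPU, not a real one.   */
    #include <stdio.h>

    int main(void) {
        int hours = 40, rate = 12, bonus = 50;
        int pay = hours * rate + bonus;   /* one high-level statement  */
        printf("%d\n", pay);              /* prints 530                */
        return 0;
    }

    /* A hand-written assembly version of the pay calculation might read
     *
     *     LOAD  hours      ; copy "hours" into the accumulator
     *     MUL   rate       ; multiply the accumulator by "rate"
     *     ADD   bonus      ; add "bonus" to the accumulator
     *     STORE pay        ; write the accumulator back to memory
     *
     * and an assembler would translate each mnemonic into one binary
     * machine instruction (an opcode plus an operand address).        */

Because each CPU family defines its own mnemonics and encodings, the assembler's output is tied to that family of machines.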

Evolution of High-Level Languages

The lack of portability between different computers led to the development of high-level languages—so called because they permitted a programmer to ignore many low-level details of the computer's hardware. Further, it was recognized that the closer the syntax, rules, and mnemonics of the programming language could be to "natural language," the less likely it became that the programmer would inadvertently introduce errors (called "bugs") into the program. Hence, in the mid-1950s a third generation of languages came into use. These algorithmic, or procedural, languages are designed for solving a particular type of problem. Unlike machine or symbolic languages, they vary little between computers, but they must be translated into machine code by a program called a compiler or interpreter.

Early computers were used almost exclusively by scientists, and the first high-level language, FORTRAN [FORmula TRANslation], was developed (1953–57) for scientific and engineering applications by John Backus at the IBM Corp. A language that handled recursive algorithms better, LISP [LISt Processing], was developed by John McCarthy at the Massachusetts Institute of Technology in the late 1950s; implemented in 1959, it has become the standard language of the artificial intelligence community. COBOL [COmmon Business Oriented Language], the first language intended for commercial applications, is still widely used; it was developed in 1959 by a committee of computer manufacturers and users under the leadership of Grace Hopper, a U.S. Navy programmer. ALGOL [ALGOrithmic Language], developed in Europe about 1958, is used primarily in mathematics and science, as is APL [A Programming Language], published in the United States in 1962 by Kenneth Iverson. PL/I [Programming Language One], developed in the late 1960s by the IBM Corp., and Ada [for Ada Augusta, countess of Lovelace, collaborator of Charles Babbage], developed in 1981 for the U.S. Dept. of Defense, are designed for both business and scientific use.

BASIC [Beginner's All-purpose Symbolic Instruction Code] was developed by two Dartmouth College professors, John Kemeny and Thomas Kurtz, as a teaching tool for undergraduates (1964); it subsequently became the primary language of the personal computer revolution. In 1971 the Swiss professor Niklaus Wirth developed a more structured language for teaching that he named Pascal (for the French mathematician Blaise Pascal, who built the first successful mechanical calculator). Modula-2, a Pascal-like language for commercial and mathematical applications, was introduced by Wirth in 1982. A decade earlier, to implement the UNIX operating system, Dennis Ritchie of Bell Laboratories produced a language that he called C; along with its object-oriented extension, C++, developed by Bjarne Stroustrup of Bell Laboratories, it has become perhaps the most widely used general-purpose language among professional programmers, in part because of its ability to handle the rigors of object-oriented programming. Java is an object-oriented language similar to C++ but simplified to eliminate features that are prone to programming errors. Java was developed specifically as a network-oriented language, for writing programs that can be safely downloaded through the Internet and immediately run without fear of computer viruses. Web pages can incorporate small Java programs, called applets, to provide a full range of multimedia functions.

Fourth-generation languages are nonprocedural—they specify what is to be accomplished without describing how. The first, FORTH, developed in 1970 by the American programmer Charles Moore, is used in scientific and industrial control applications. Most fourth-generation languages are written for specific purposes. Fifth-generation languages, which are still in their infancy, are an outgrowth of artificial intelligence research. PROLOG [PROgramming in LOGic], developed by the French computer scientist Alain Colmerauer and the logician Philippe Roussel in the early 1970s, is useful for programming logical processes and making deductions automatically.

Many other languages have been designed to meet specialized needs. GPSS [General Purpose System Simulator] is used for modeling physical and environmental events, and SNOBOL [String-Oriented Symbolic Language] is designed for pattern matching and list processing. LOGO, a dialect of LISP, was developed in the 1960s to help children learn about computers. PILOT [Programmed Instruction Learning, Or Testing] is used in writing instructional software, and Occam is a concurrent language that allows a program's instructions to execute in parallel on parallel-processing systems.

There are also procedural languages that operate solely within a larger program to customize it to a user's particular needs. These include the programming languages of several database and statistical programs, the scripting languages of communications programs, and the macro languages of word-processing programs.

Compilers and Interpreters

Once the program is written and has had any errors repaired (a process called debugging), it may be executed in one of two ways, depending on the language. With some languages, such as C or Pascal, the program is turned into a separate machine language program by a compiler, which functions much as an assembler does. With others, such as LISP, the program is typically run by an interpreter, which reads the source code a line at a time, converting and executing each instruction as it goes. A few languages, such as BASIC, have both compilers and interpreters. Source code, the form in which a program is written in a high-level language, can easily be transferred from one type of computer to another, and a compiler or interpreter specific to the machine configuration can convert the source code to object, or machine, code.
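
As a rough illustration of the interpreter approach, the sketch below (in C) reads the lines of a made-up two-command language and carries each one out immediately; the language, its commands, and the accumulator model are invented for this example, and no machine code is produced.

    /* A minimal sketch of a line-at-a-time interpreter for a made-up
     * two-command language ("ADD n" and "PRINT").  Real interpreters
     * handle far richer grammars; this only illustrates the idea.     */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char *program[] = { "ADD 5", "ADD 37", "PRINT" };  /* source code */
        int accumulator = 0;

        for (size_t i = 0; i < sizeof program / sizeof program[0]; i++) {
            int operand;
            if (sscanf(program[i], "ADD %d", &operand) == 1)
                accumulator += operand;            /* executed on the spot...     */
            else if (strcmp(program[i], "PRINT") == 0)
                printf("%d\n", accumulator);       /* ...no machine code emitted  */
        }
        return 0;
    }

A compiler, by contrast, would translate the whole program into object (machine) code first, and the translated program could then run on its own.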

Bibliography

See R. Cezzar, A Guide to Programming Languages: Overview and Comparison (1995); T. W. Pratt and M. V. Zelkowitz, Programming Languages: Design and Implementation (3d ed. 1996); C. Ghezzi and M. Jazayeri, Programming Language Concepts (3d ed. 1997); R. W. Sebesta, Concepts of Programming Languages (4th ed. 1998).

"programming language." The Columbia Encyclopedia, 6th ed. Encyclopedia.com.

Programming Language

In order for computers to accept commands from humans and perform tasks vital to productivity and e-commerce, a means of communication must exist. Programming languages provide this necessary link between man and machine. Because they are quite simple compared to human languages, rarely containing more than a few hundred distinct words, programming languages must express instructions very precisely. There are more than 2,000 programming languages in existence, although most programs are written in one of a handful of popular languages, such as BASIC, COBOL, C++, or Java. Programming languages have different strengths and weaknesses, and the suitability of one language over another varies with the kind of program being written, the computer it will run on, the experience of the programmer, and the way in which the program will be used.

At the most basic level, computer hardware is controlled through machine language, consisting of numbers (mainly zeros and ones). Immediately above machine languages are assembly languages, which use mnemonic names instead of numbers to represent instructions. This level of language is the lowest a programmer is likely to see. Special programs known as assemblers take assembly code and translate it into the machine language used by a computer's hardware. Although they aren't numeric, assembly languages have several disadvantages, including the fact that they are hard to understand and often are very specific to a certain machine's central processing unit (CPU). Each kind of CPU has its own form of assembly language. In the early 2000s, programs usually were not written directly in assembly language. Rather, it was used by experienced programmers to work on critical parts of computer programs.

When people refer to programming languages, they normally mean one of the many high-level or fourth-generation languages that sit above the level of assembly language. Unlike machine and assembly languages, high-level languages more closely resemble human grammar and syntax, and they are often portable to different operating systems and machines. Three programming languages, FORTRAN, COBOL, and ALGOL, were instrumental in opening the lines of communication between programmers and computers; all were created in the late 1950s, and many variations of them were still in use in the early 2000s.

Once a program is written in a high-level language, a program called a compiler or an interpreter is used to convert it to a computer's specific machine language, much like an assembler converts assembly code into machine language. As explained by Daniel Appleman in his book How Computer Programming Works, "A compiler translates an entire program into machine language. Once translated, a program can execute by itself. An interpreter reads your program source code and performs the operations specified without actually translating the code into machine language. The interpreter program executes the program you create, so your program always requires the interpreter."
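
As a counterpart to the interpreter approach, the toy sketch below (in C) illustrates the compiler route Appleman describes: a made-up two-command source program ("ADD n" and "PRINT") is first translated into numeric instructions for a hypothetical machine and only then run. The opcodes (1 for ADD, 2 for PRINT) and the instruction format are invented purely for illustration.

    /* A toy "compiler" for a made-up two-command language: translate
     * every source line into a numeric instruction for a hypothetical
     * machine, then run the translated instructions.                  */
    #include <stdio.h>
    #include <string.h>

    struct instruction { int opcode; int operand; };

    int main(void) {
        const char *source[] = { "ADD 5", "ADD 37", "PRINT" };
        struct instruction object_code[8];
        int count = 0;

        /* "Compilation": translate each source line into numeric form. */
        for (size_t i = 0; i < sizeof source / sizeof source[0]; i++) {
            int operand;
            if (sscanf(source[i], "ADD %d", &operand) == 1)
                object_code[count++] = (struct instruction){ 1, operand };
            else if (strcmp(source[i], "PRINT") == 0)
                object_code[count++] = (struct instruction){ 2, 0 };
        }

        /* "Execution": the translated program runs without the source. */
        int accumulator = 0;
        for (int i = 0; i < count; i++) {
            if (object_code[i].opcode == 1)
                accumulator += object_code[i].operand;
            else if (object_code[i].opcode == 2)
                printf("%d\n", accumulator);       /* prints 42 */
        }
        return 0;
    }

Once the translation is done, only the numeric object code is needed to execute the program, whereas, as the quotation above notes, an interpreted program always requires the interpreter at run time.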

HISTORY OF PROGRAMMING LANGUAGES

According to Computer Languages, Konrad Zuse created the very first high-level language in 1945. During World War II, Zuse, who had previously constructed several simple general-purpose computers in his parents' apartment, fled Berlin for the Bavarian Alps, where he lived as a refugee. The programming language Zuse created, the Plankalkül, translates from German as "plan calculus." In theory, this computer language could be applied to a wide variety of computing problems. Unlike the computers that existed at the time, Zuse's design relied not on decimal but on binary notation.

Zuse's programming language was never adopted for widespread use on actual computers. FORTRAN and COBOL were the first high-level programming languages to make an impact on the field of computer science. Along with assembly language, these two high-level languages have led to or influenced the development of many modern programming languages, including Java, C++, and BASIC.

FORTRAN (FORmula TRANslation), released in 1957 after a three-year development period, is well suited to math, science, and engineering programs because of its ability to perform numeric computations. The language was developed in New York by IBM's John Backus. At the time, IBM was trying to make computers more user-friendly in an effort to increase sales. FORTRAN achieved this goal because it could be learned in a short period of time and required no previous computer knowledge. It eliminated the need for engineers, scientists, and other users to rely on assembly programmers in order to communicate with computers. Although FORTRAN is often referred to as a language of the past, computer science students were still taught the language in the early 2000s, both for historical reasons and because FORTRAN code still exists in some applications.

COBOL (COmmon Business Oriented Language) was released in 1959 and has been updated several times since then. Shortly after the introduction of FORTRAN, users from different fields, including academia and manufacturing, convened at the University of Pennsylvania to discuss the need for a standardized business language that could be used on a wide variety of computers. The eventual result was COBOL, a programming language well suited to creating business applications. COBOL's strength lies in processing data and in its simplicity: because the language is readable and easy to understand, it is difficult to hide malicious or destructive code within a COBOL program, and programming errors are easy to spot.

In the early 2000s, COBOL was a frequently discussed topic in e-commerce circles. Many companies sought to allow customers to access data on mainframe computers running COBOL programs. Finding ways to enable COBOL to interface with hypertext markup language (HTML), which is used to create pages on the World Wide Web, became important.

FURTHER READING:

Appleman, Daniel. How Computer Programming Works. Berkeley: Apress, 2000.

Computer Languages. Alexandria: Time-Life Books, 1986.

Hansen, Augie. C Programming. New York: Addison-Wesley Publishing Co., Inc., 1989.

Radcliff, Deborah. "Moving COBOL to the Web Safely." Computerworld, May 1, 2000.

"Programming Language." Ecommerce Webopedia, March 12, 2001. Available from www.e-comm.webopedia.com.

"Programming Language." Techencyclopedia, March 12, 2001. Available from www.techweb.com.

SEE ALSO: BASIC; C; COBOL; FORTRAN; HTML; Java

"Programming Language." Gale Encyclopedia of E-Commerce. Encyclopedia.com.

programming language

programming language A notation for the precise description of computer programs or algorithms. Programming languages are artificial languages in which the syntax and semantics are strictly defined. Thus, while they serve their purpose, they do not permit the freedom of expression that is characteristic of a natural language.

"programming language." A Dictionary of Computing. Encyclopedia.com.