Procedural languages are computer languages used to define the actions that a computer has to follow to solve a problem. Although it would be convenient for people to give computers instructions in a natural language, such as English, French, or Chinese, they cannot because computers are just too inflexible to understand the subtleties of human communication. Human intelligence can work out the ambiguities of a natural language, but a computer requires a rigid, mathematically precise communication system: each symbol, or group of symbols, must mean exactly the same thing every time.
Computer scientists have created artificial languages that enable programmers to assemble a set of commands for the machine without dealing directly with strings of binary digits. The high-level form of a procedural language frees a programmer from the time-consuming chore of expressing algorithms in lower-level languages such as assembly and machine language. Additionally, procedural language instructions are expressed in a machine-independent form that facilitates portability, thus increasing the lifetime and usefulness of a program.
Higher-level languages work for people because they are closer to natural language, but a computer cannot carry out instructions until that communication has been translated into zeros and ones. This translation may be done by compilers or interpreters, which are special programs custom-made to fit both the language and the machine being used. A compiler reads the entire program, makes a translation, and produces a complete binary code version, which is then loaded into the computer and executed. Once the program is compiled, neither the original program nor the compiler is needed. On the other hand, an interpreter translates and executes the program one instruction at a time, so a program written in an interpreted language must be interpreted each time it is run. Compiled programs execute faster, but interpreted programs are easier to correct or modify.
A procedural language is either compiled or interpreted, depending on the use for which it was created. FORTRAN, for example, is usually implemented with a compiler because it was created to handle large programs for scientific and mathematical applications where speed of execution is very important. On the other hand, BASIC is typically implemented with an interpreter because it was intended for use by novice programmers.
Each programming language has a special vocabulary of keywords, which correspond to specific operations or sequences of operations to be performed by the computer. Some of these keywords act as verbs, or commands; others act as nouns, modifiers, or punctuation marks. By using them to form sentences, a programmer tells a computer exactly what to do with each item of information being processed. Typical commands include input and output, conditions, and repetition. Because they are indispensable to programmers, these commands are common to all computer languages, but they are written differently in each language because the sentences must follow the syntax of that language.
Many of the hundreds of programming languages also have dialects. A dialect is a variation of the main language. While people can understand one another even if they speak different dialects of their language, a computer cannot understand a program written in a different dialect from its own. Dialects present problems when the same program, run with the same data set, gives different answers on two different machines. This means that the program cannot be ported from one machine to the other. Asking several programmers to name the best computer language will most likely give you several different answers, because there is no best computer language, any more than one natural language is better than all the rest. Theoretically, most programming tasks could be accomplished with any language, but writing a program for a given job is considerably easier in some languages than in others. None can claim all-around utility.
This essay covers, in some historical order, procedural languages that were popular or significant during the period of their development: FORTRAN, ALGOL, COBOL, BASIC, Pascal, C, and Ada.
We can safely say that FORTRAN (FORmula TRANslator) was the first true high-level language. A factor that influenced the development of FORTRAN was the amount of money being spent on programming in 1954. Programming accounted for a large share of the cost of operating a computer, and as computers got cheaper, the situation got worse. American computer scientist John Backus was able to convince IBM's directors that a language could be developed with a compiler that would produce very efficient object code. He was put in charge of the group that developed FORTRAN, which included Irving Ziller, Roy Nutt, David Sayre, and Peter Sheridan. One of their goals was to design a language that would make it possible for engineers and scientists to write programs on their own for the IBM 704.
The first FORTRAN compiler took about twenty-five person-years to complete and proved to be as efficient as the then-current assemblers, making it a striking achievement in the history of programming languages. FORTRAN I was released in 1957 and was followed in the spring of 1958 by FORTRAN II, which included function statements and better diagnostic messages. A more advanced version, FORTRAN III, depended heavily on the architecture of the IBM 704 and was not made into a commercial product. However, many of its features were incorporated into FORTRAN IV, which was released in 1962 and had a life of almost fifteen years. It added COMMON storage, double-precision and logical data types, and relational expressions, as well as the DATA statement, which provided a simple facility for initializing variables. Programs written in versions subsequent to FORTRAN III were machine independent, which meant that they could be run on any scientific machine. For the first time a single language was used by many manufacturers for many different machines.
By the mid-1970s FORTRAN IV was no longer a modern language, and although the investment in FORTRAN programs was immense, it was time to bring it up to date. In 1967 work began on what was later called FORTRAN 77, which became the official standard in April of 1978. By 1981 the demand for FORTRAN 77 compilers was very high, making it clear that it was a success. However, it did not have all the features needed to implement modern control structures, so work on its successor began in 1978. It was to include if-then-else control structures, case selection, a do-end do structure, and recursion, among other things. Work on this project ended in 1990 and FORTRAN 90 was published in 1991. FORTRAN 95, an extension of FORTRAN 90, was published in December 1997, and work on FORTRAN 200x was underway in 2001. It is an upward-compatible extension of FORTRAN 95, adding support for exception handling, object-oriented programming, and improved interoperability with C.
Because many languages and dialects were developed between 1956 and 1959, creating portability problems, various computer groups petitioned the Association for Computing Machinery (ACM) to recommend action for the creation of a universal programming language. Representatives from industry and universities were appointed to a committee that met three times, starting in January 1958, and agreed that the new language would be an algebraic language similar to FORTRAN. However, FORTRAN could not be used as a universal language because, in those days, it was a creation of IBM and closely tied to IBM hardware. Some members of the group, including John Backus and Alan Perlis, were chosen to represent the American viewpoint at the meetings that would shape this international language.
ALGOL 58 was really a group effort. It was the first formal attempt to address issues of portability, and it integrated the best features of the programming languages available at the time. It introduced new terms such as type, formal versus actual parameter, the for statement, the begin-end delimiters, and three levels of language description. This effort was considered a draft and was not commercially implemented. However, many recommendations for its improvement were considered at a Paris meeting in June 1959.
In January 1960 seven representatives of European countries, including Peter Naur and Fritz Bauer, and six from the United States, including Backus and Perlis, met in Paris to develop ALGOL 60, which was expected to become a universal tool with the addition of the following features: blocks, call by value and call by name, dynamic arrays, own variables, global and local variables, until, while, if-then-else, and recursion. ALGOL was used more in Europe than in the United States by computer scientists conducting research. ALGOL 60 became the standard for the publication of algorithms and was a great influence on future language developments.
In 1962 a new international committee of computer scientists was formed to develop an enhanced version of ALGOL 60. The meetings began in 1965 and lasted until 1968 when ALGOL 68 was released. Although it allowed non-English-speaking programmers to write programs in their own language, it proved to be too cumbersome to be readily accepted. Proficient programmers had trouble understanding the document that defined it and very few institutions had an actual ALGOL 68 compiler in use.
However, out of this effort arose a new language, Pascal, developed by Niklaus Wirth who began work on it in 1968.
In April of 1959, two years after the introduction of FORTRAN, a group of academics, computer manufacturers, and computer users, including American programming pioneer Grace Hopper (1906–1992), met to discuss the feasibility of designing a programming language that would satisfy the needs of the business community and would become a standard. FORTRAN did not suit their needs because business programs deal with large quantities of data but do not perform complicated calculations. Existing programming languages were not portable; each could function on only one type of computer, scientific or business. Since large organizations sometimes had different types of computers, their programmers had to know several languages, thus increasing the cost of software. For example, the U.S. Department of Defense had more than 1,000 computers and it was costing the DoD close to $500 million a year to program them and keep them operating smoothly.
Forty representatives from the government, users, consultants, and manufacturers met at the Pentagon in May 1959 to discuss the need for a common business language. They formed three committees and proceeded to analyze existing business programming languages: FLOW-MATIC, AIMACO, and Commercial Translator. They sought to learn whether the best features of each could be merged into one language. By December of 1959 the group had completed the specifications for COBOL, which were made public in 1960.
COBOL programs are composed of four divisions, each one serving a specific purpose. The IDENTIFICATION division serves to identify the program and programmer. The ENVIRONMENT division is used to identify the actual computer, compiler, and peripheral devices that will be used by the program, and it is the most machine dependent. The DATA division describes the files, records, and fields used by the program, and the PROCEDURE division contains the instructions that will process the data. COBOL commands are written using English words and syntax, and its variable names can be up to 30 characters long, making them very descriptive. These features make programs easy for nonprogrammers to read and understand, and also make them easier to debug and maintain. COBOL programs are highly portable; as a result, COBOL was readily accepted by the American business community.
The 1961 revision of COBOL included the Report Writer and Sort features and was the first to be widely implemented. COBOL was revised again in 1965 and 1968; the 1968 revision became the first American National Standards Institute (ANSI) standard version of the language.
COBOL 74 improved indexed file handling, specifically ISAM (Indexed Sequential Access Method). During the growth of the microcomputer market, several versions of microcomputer COBOL became available and were used in the business community as well as in universities and colleges. COBOL 85 reflected efforts to make the language more compatible with structured programming by providing END-IF, END-PERFORM, a direct case structure, and an in-line PERFORM. Publication of the next revision was expected in 2002 and was to include object-oriented features.
In the early 1960s there were no personal computers. If you wanted to compute, you had to punch your program on cards, carry them to the nearest computer center, and then wait hours for the results. John G. Kemeny and Thomas E. Kurtz, professors at Dartmouth College, believed that computer programming was too important to be relegated exclusively to engineering students and professional programmers. So in 1963 they designed and built a time-sharing system and developed the Beginner's All-purpose Symbolic Instruction Code (BASIC). Their goals included ease of learning for the beginner, hardware and operating system independence, the ability to accommodate large programs, and sensible error messages in English. BASIC became available in 1964. Although Kemeny and Kurtz implemented it to run with a compiler, current versions run under interpreters.
BASIC can be classified as a general-purpose language because it can handle business applications as well as scientific and educational applications. Unfortunately the language has been widely modified and extended by computer manufacturers and software companies. Numerous dialects of BASIC, each with its own syntax and special features, make it difficult to port programs from one computer to another.
The original version was revised and expanded by Kemeny and Kurtz to include graphic statements in 1975. The following year, in order to comply with the requirements of structured programming, they dropped the GOTO statement; this version was called SBASIC and was taught to Dartmouth undergraduates. In 1983 they developed True BASIC, a more powerful and versatile form that follows the proposed ANSI standards. Some of its features were optional line numbers, long variable names, array-handling statements, DO loops, a SELECT case structure, and independent subprograms. BASIC was widely accepted in the educational community because it was an easy language to teach and learn.
Pascal was developed by Niklaus Wirth, a Swiss computer scientist who was part of the ALGOL 68 committee. He felt that ALGOL was too complex and wanted to design a computer language that could easily be taught to college students. The new language, which is a derivative of ALGOL, was published in 1971 and was later called Pascal.
Pascal incorporates the ideas of structured programming that started to appear in the 1960s, redefining ALGOL's concept of breaking down a program into modules, procedures, and functions, and also expanding on some of ALGOL's features by adding new data types and control structures. Its structure makes programs easier to read and maintain by people other than the original programmer. Although there are variations among Pascal compilers, the language has a fairly standard form, so programs are portable between different computers.
Wirth's idea found its most important audience at the University of California at San Diego, where in late 1974 Kenneth Bowles worked out a Pascal operating system and compiler to be used on mini- and microcomputers. He went on to develop an entire system containing a compiler, text editor, an assembler, a linker, a file-handling utility, and a set of utility programs. This package, ready for distribution by 1977, was known as UCSD Pascal. By 1978 it began to receive national attention. The growth of personal computers helped it achieve wide acceptance in the educational community, and for almost two decades it was the language of choice for most introductory computer science courses.
Because Pascal was meant to be used as a teaching tool, its input and output functions were limited, making it impractical for writing commercial applications. However, several languages, including Modula-2 and Ada, were based on it.
C is one of the descendants of ALGOL 60. It was developed in 1972 by Ken Thompson and Dennis Ritchie, both of Bell Laboratories. Their goal was to create a language that would combine high-level structured language features with those that control low-level programming. This makes C well suited for writing operating systems, compilers, and also business applications. C compilers exist for virtually all machines, and since a standard for C was adopted in 1989, most C programs are portable. On the other hand, C has been described as a programming language written by programmers for programmers, which means that novices find it difficult to learn.
C supports structured programming and provides for several data types. For example, pointer arithmetic is an integral part of C, as is the use of functions that may be called recursively. Although input and output statements are not part of the language, they are functions found in a "library" ready to be used when needed. Some of the functions found in a standard UNIX C library include string manipulation, character functions, and memory allocation. In addition to external, automatic, and static variables, C provides register variables, which can shorten execution time because they are kept in machine registers.
C makes it possible to work on bit data using the bit operators for AND, OR, exclusive OR, one's complement, shift left, and shift right, giving programmers great control over data manipulation.
Development of Ada started in 1975 under the auspices of the U.S. Department of Defense (DoD) for use in its military computer systems. This action was necessary because the expense of developing and maintaining DoD programs was becoming very high due to the variety of programming languages being used. In the early 1970s, the DoD used at least 450 different computer languages and dialects.
The DoD devotes most of its programming efforts to guiding military equipment, such as tanks, airplanes, and nuclear bombs. Those programs execute in real time, while the tank is moving or the airplane is flying; a fighter pilot cannot wait for the computer to send back results later in the day. Although real-time systems can operate outside of the device they are controlling, they can also be embedded within a larger system, for example a robot.
Usually real-time systems are large and multifaceted, so the task of coordinating the programming effort is key to the success of the system. These systems have to respond, within a specific amount of time, to outside events that happen in the real world. They must be able to communicate with typical computer peripherals, such as printers and modems, as well as non-typical input and output devices, such as patient monitoring equipment. Most importantly, real-time systems have to be reliable, because in certain cases an error in the program could result in a loss of human lives. These conditions dictate that programming languages for real-time systems must be robust. That means that the compiler must detect programming errors automatically before any damage is done, and the language must provide for recovery from undetected errors.
In 1975 the High Order Language Working Group (HOLWG) was formed to find a language exactly suited to the DoD's needs. After careful study, the committee decided that none of the existing languages would be appropriate and that a new one had to be developed. The foundations for the definition and design of this language were PL/I, ALGOL 68, and Pascal. It came to be called Ada. Its development was carefully monitored; it took five years before the first reference manual was published in 1980. A revision came out in 1982, and in 1995 ANSI adopted a new standard for Ada.
Ada was developed to reduce the cost of software development and maintenance, especially for large, constantly changing programs that will be used for a long period of time. A fundamental idea of this language is the "package," which is used to divide a program into modules that can be compiled, tested separately, and stored in a library until needed. This makes large programs easier to write, debug, and maintain by teams of programmers. Another feature of Ada is that it supports parallel processing, including concurrently executable code segments called "tasks," which can execute independently of each other or can be synchronized to relay information between themselves.
Although Ada is not very difficult to learn at the basic level, using it to its full capacity requires programming knowledge and experience. Therefore, Ada is considered a language for advanced programmers, especially suited for large projects, real-time systems, and systems programming.
Because it is a very good language for large critical systems, Ada has achieved a high level of acceptance and is used by many organizations worldwide. Not only is most DoD code written in Ada, but the language has been used to write important non-military applications such as international air traffic control, railways, and commercial satellites. For example, programs for the French TGV rail system, Channel Tunnel, and many Global Positioning System projects are mostly written in Ada.
See also Algol-60 Report; Algorithms; Compilers; Programming.
Ida M. Flynn
Baron, Naomi S. Computer Languages. Garden City, NY: Anchor Books, 1986.
Hsu, Jeffrey. Microcomputer Languages. Hasbrouck Heights, NJ: Hayden Book Co., 1986.
Wexelblat, Richard, ed. History of Programming Languages. New York: Academic Press, 1981.
A language designed for scientific applications does not readily lend itself to the writing of a program for managing a payroll database.
The ALGOL meetings were conducted from May to June 1958 in Zurich, Switzerland. The name ALGOL (ALGOrithmic Language) was suggested at that time.
COBOL is short for COmmon Business Oriented Language.
Pascal is named for Blaise Pascal (1623-1662), a seventeenth-century French mathematician.