INFORMATION TECHNOLOGY

Many areas of public health, including vital statistics, investigation and research, surveillance, epidemiology, surveys, laboratory technology, maternal and child health, and environmental health, use information technology (IT) to achieve their goals and objectives. IT encompasses the use of computers and communications, and the transformation of data into information and knowledge.

In the 1960s, "number crunching" was one of the first applications for which computers were used in the hospital environment. A decade later, in the early 1970s, IT applications were being used

Table 1

Text, Graphics, and Multimedia: Common Sound, Still-Video, and Motion-Video Formats on the Web

Extension | Text & Graphics Format | Explanation
TXT, DOC | MS Word | Word-processing application
WPD | WordPerfect | Word-processing application
RTF | Rich Text Format | Method of encoding text formatting and document structure using the ASCII character set
PPT | Microsoft PowerPoint | Presentation-graphics application
PRS | Harvard Graphics | Presentation-graphics application
XLS | MS Excel | Spreadsheet application
HTM | MS FrontPage | Editor: application for creating and editing Web pages; Explorer: application for maintaining, testing, and publishing Webs

Extension | Sound Format | Explanation
RA | RealAudio | Used with the RealAudio Web server and RealAudio Player add-on for browsers
SBI | Sound Blaster Instrument | Used for a single instrument with Sound Blaster cards
WAV | MS Waveform | Sound format used in Windows for event notification

Extension | Still-Video/Graphics Format (SVF): raster (bitmap) images | Explanation
GIF | Graphics Interchange Format | Compressed graphics format commonly used on CompuServe
BMP | Windows Bitmap | Bitmap image format
PCC, PCX | PC Paintbrush | Bitmap images
JPEG, JPG | Joint Photographic Experts Group | Highly compressed format for still images, widely used for multi-platform graphics
TIFF | Tagged Image File Format | High-resolution, tag-based graphics format used for the universal interchange of digital graphics
PCD | Photo CD | Graphics file format developed by Eastman Kodak Company
PDF | Portable Document Format | Adobe's format for multi-platform document access through its Acrobat software
PS | PostScript | Adobe's type description language, used to deliver complex documents over the Internet

in admissions, patient care, clinical laboratories, and intensive care units. In the 1990s, the fusion of computers and all forms of communication became commonplace in all aspects of life. The Internet and the World Wide Web (WWW) are now tools that both professionals and laypeople use for all types of business. An evolution has occurred in the ways people use computers, in the power, capacity, and speeds of computers, and in

Table 1, continued

Text, Graphics, and Multimedia: Common Sound, Still-Video, and Motion-Video Formats on the Web [CONTINUED]

Extension | Still-Video/Graphics Format (SVF): vector images | Explanation
AI | Adobe Illustrator |
CGM | Computer Graphics Metafile |
DRW | Micrografx Drawing |
PCT | Macintosh PICT |
WMF | Windows Metafile | Used mostly for word-processing clip art
WPG | WordPerfect Graphics | Word-processing clip art

Extension | Motion-Video Format (MVF) | Explanation
DVI | Digital Video Interactive | MVF found on CD-ROMs
FLI | Flic | Autodesk Animator MVF
MPEG, MPG | Motion Picture Experts Group | Full-motion video standard using a frame format similar to JPEG, with variable compression capabilities
MOV | QuickTime | Apple's motion video and audio format (originally for Macintosh, also available for Windows)

source: Courtesy of author.

the way systems are put together and integrated (see Table 1).

Most people tend to think about computers in terms of the systems that they use at home or at work. Most of the time these are "stand-alone" models, such as desktops, laptops, or notebooks, and sometimes they are wireless devices, such as Palm Pilots, personal organizers, and third-generation cellular phones that allow access to e-mail and the Internet. Although public health has not yet taken full advantage of these technologies, it is important to understand their basics in order to visualize their potential uses in the near- and long-term future.

COMPUTERS

Initially, the computer was conceived as a device to manipulate numbers and solve arithmetical problems. During its development, it was recognized that a machine capable of manipulating numbers could also be used to manipulate any "symbol" represented in numeric form. An electronic data processing system (EDPS) involves at least three basic elements: the input entering the system, or source data; the orderly processing that takes place within the system; and the output, or end result. The EDPS has four functional units: the input devices; the central processing unit (CPU); the storage, or memory; and the output devices.

The central processing unit (CPU) is the control center of the EDPS, and it has two parts: the "arithmetic/logic unit" (ALU) and the "control unit." The ALU performs operations such as addition, subtraction, multiplication, and division, as well as moving, shifting, and comparing data. The control section of the CPU directs and coordinates all the operations of the computer according to the conditions set forth by the stored program. It selects instructions from the stored program and interprets them. It then generates signals and commands that cause other system units to perform certain operations at appropriate times. It controls the input/output units, the arithmetic-logic operations of the CPU, and the transfer of data to and from storage. It acts as a central nervous system, but performs no actual processing operations on data.

Storage Devices. The main storage of a computer (the memory, or internal storage unit) is basically an electronic filing cabinet in which each location is capable of holding data and instructions. The storage unit contains four elements: (1) all data being held for processing, (2) the data being processed, (3) the final result of processing until it is released as output, and (4) all the program instructions while processing is being carried out. Each location in main storage is identified by a particular address. Using this address, the control section can readily locate data and instructions as needed. The size, or capacity, of main storage determines the amount of data and instructions that can be held within the system at any one time. In summary, the internal memory is temporary storage and is called "random access memory" (RAM). There is also a second type of memory, called "read-only memory" (ROM). This memory is fixed, meaning it can be read but cannot be written to, changed, or deleted. There are also secondary memory devices, or auxiliary storage, sometimes called "sequential access memory," such as diskettes, hard drives, and magnetic tape. These auxiliary devices are chosen depending on how often the data will be used. For example, mass storage devices or certain types of tape may be used for archiving medical records or bank accounts, where certain legal aspects of the data may be required.

Input/Output (I/O) Devices. These are devices that are linked to the computer and can introduce data into the system, as well as devices that can accept data after it has been processed. Some examples are: disk storage drives, printers, magnetic tape units, display stations, data transmission units, and the old punched card or paper tape. Input devices perform the function of converting the data from a form that is intelligible to the user to a form that is intelligible to the computer. Output, on the other hand, is data that has been processed (e.g., shown on a display device). In some cases, a printer can readily display the data in an understandable form. In other instances, such as with a magnetic tape drive, the data is carried as input for further processing by another device. In this case, the computer retains the data until further processing takes place. In summary, a digital computer is an electronic device capable of manipulating bits of information under the control of sequenced instructions stored within the memory of the device. Some common forms of storing data today include: floppy disks (used mainly for temporary storage); magnetic disks (fixed or removable); and optical disks that can store very large amounts of data. CD-ROM (compact disk read-only memory) devices store information by means of a finely focused laser beam that detects reflections from the disc. This technology is sometimes referred to by the term "write once, read many times" (WORM).

Computer System. The computer elements described thus far are known as "hardware." A computer system has three parts: the hardware, the software, and the people who make it work. Computer software can broadly be divided into two categories: systems software and application software, or programs. Systems software can be further divided into operating systems and programming languages. A computer program is a set of commands (in the form of numeric codes) that is put into the computer's memory to direct its operation. Testing, or debugging, is done to check whether a program works properly. The ongoing process of correcting errors and modifying working programs is called software maintenance. The science of software engineering has provided formal methods for writing and testing programs.

DATA PROCESSING, DATA REPRESENTATION

When people communicate by writing in any language, the symbols used (the letters of the alphabet, numerals, and punctuation marks) convey information. The symbols themselves are not information, but representations of information. Data in an EDPS must be expressed symbolically so that the machines can interpret the information presented by humans. In general, the symbols that are read and interpreted by a machine differ from those used by people. The designer of a computer system determines the nature and meaning of a particular set of symbols that can be read and interpreted by the system. The actual data that is used by these systems is (or was in the past) presented as holes on punched cards or paper tape, as spots on magnetic tape, as bits (binary digit) or bytes of information in a disk, diskette, CD-ROM, or optical disk; as magnetic-ink characters; as pixels in display-screen images; as points in plotted graphs; or as communication-network signals.

In many instances, communication occurs between machines. This communication can be a direct exchange of data in electronic form over cables, wires, radio waves, infrared, satellites or even wireless devices such as cellular phones, pagers, and hand-held personal organizers and/or notebooks. It can also be an exchange where the recorded or stored output of one device or system becomes the input of another machine or system.

In the computer, data is recorded electronically. The presence or absence of a signal in specific circuitry represents data in the computer the same way that the absence or presence of a punched hole represented data in a punched card. If we think of an ordinary lightbulb being either on or off, we could define its operation as a binary mode. That means that at any given time the lightbulb can be in only one of two possible conditions. This is known as a "binary state." In a computer, transistors are conducting or nonconducting; magnetic materials are magnetized in one direction or in the opposite direction; a switch or relay is either on or off; a specific voltage is either present or absent. These are all binary states. Representing data within the computer is accomplished by assigning a specific value to each binary indication or group of binary indications. Binary signals can be used to represent both instructions and data; consequently, the basic language of the computer is based primarily on the "binary number system."

A binary method of notation is usually used to illustrate binary indications. This method uses only two symbols, 0 and 1, where 0 and 1 represent the absence and presence of an assigned value, respectively. These symbols, or binary digits, are called "bits." A group of eight bits is known as a "byte," and a group of 32 bits (4 bytes) is known as a "word." The bit positions within a byte or a word have place values related to the binary number system. In the binary number system the values of these symbols are determined by their positions in a multidigit numeral. The position values are based on the right-to-left progression of powers of base 2 (2^0, 2^1, 2^2, 2^3, ...), commonly employed within digital computers. For example, if there are four lightbulbs next to each other numbered 4, 3, 2, and 1, and bulbs 1 and 3 are "on" while 2 and 4 are "off," the binary notation is 0101.
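The lightbulb example can be sketched in a few lines of Python; the bit string and place-value arithmetic below simply restate the description above.

```python
# Four "lightbulbs" numbered 4, 3, 2, 1 from left to right.
# Bulbs 1 and 3 are on (1); bulbs 2 and 4 are off (0) -> notation 0101.
bulbs = "0101"

# Place values grow right to left as powers of 2: 2^0, 2^1, 2^2, 2^3.
value = sum(int(bit) * 2**i for i, bit in enumerate(reversed(bulbs)))

print(value)          # 5, the decimal value of binary 0101
print(int(bulbs, 2))  # Python's built-in base-2 conversion agrees: 5
```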

The system of expressing decimal digits as an equivalent binary value is known as Binary Coded Decimal (BCD). In this code, all characters (64 characters can be coded), including alphabetic, numeric, and special signs, are represented using six positions of binary notation (plus a parity bit position). The Extended Binary Coded Decimal Interchange Code (EBCDIC) uses eight binary positions for each character format plus a position for parity checking (256 characters can be coded). The American Standard Code for Information Interchange (ASCII) is a seven-bit code that offers 128 possible characters. ASCII was developed by users of communications and data processing equipment as an attempt to standardize machine-to-machine and system-to-system communication.
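As a small illustration (a Python sketch using the standard `ord` and `format` functions), the seven-bit ASCII code for each character can be displayed directly:

```python
# Each ASCII character maps to a 7-bit code (128 possibilities).
for ch in "A1?":
    code = ord(ch)                        # numeric ASCII code
    print(ch, code, format(code, "07b"))  # e.g. A 65 1000001
```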

Computer Number Systems and Conversions. Representing a decimal number in binary may require very long strings of ones and zeros, so the hexadecimal system is used as a shorthand method to represent them. The base of this system is 16, and the symbols used are 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E, and F. In other words, F is 15 in decimal notation and 1111 in binary.
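Python accepts binary and hexadecimal literals, so the shorthand relationship can be checked directly (a minimal sketch):

```python
n = 0b1111             # binary 1111
print(n)               # 15 in decimal
print(format(n, "x"))  # f in hexadecimal

# Each hex digit stands for exactly four bits, so a 16-bit string
# collapses to four hex digits:
word = 0b1101_1110_1010_1101
print(format(word, "x"))  # dead
```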

Programming Language Techniques. Assembler languages are closer to machine instructions than to human language, and having to express logical procedures, arithmetical calculations, and textual manipulations in such cumbersome languages hurts a programmer's productivity. There are many higher-level programming languages, such as ALGOL, BASIC, COBOL, FORTRAN, and Pascal, that are much closer to human means of expression.

A programmer writes a source program in a human-readable programming language. A compiler translates these English-like statements into instructions that the computer can execute; such instructions are called an "object program." Through added library routines the computer does further processing of the object program, executes it, and an "output" is produced. There are some "optimizing compilers" that automatically correct obvious inefficiencies in source programming. Sometimes, with the use of "interpreters," a program can be debugged as it executes, piece by piece. MUMPS, LISP, and APL are interpreted languages used for this purpose in the health care, artificial intelligence, and mathematics fields, respectively. Because of the time and costs associated with development, it is generally not cost-effective in today's environment to develop an application package; it is usually cheaper to buy one (if available) from a vendor, since the costs are then spread among thousands of users. Typical application packages used for public health purposes are SAS and SPSS (for biostatistics) and ArcView/GIS (for Geographical Information Systems). In addition, there are some data manipulation languages (e.g., Oracle and dBASE) that were written for this purpose. A data manipulation language (DML) is a special sublanguage used for handling data storage and retrieval in a database system. Using a data definition language (DDL), programmers can organize and structure data on secondary storage devices.

Data Acquisition. Capturing and entering data into a computer is expensive. Direct acquisition of data avoids the need for people to read, measure, encode, and/or enter the data. Automated data acquisition can help eliminate errors and speed up the procedure. Sensors connected to a patient convert biological signals into electrical signals that are transmitted into a computer. Many of these signals (e.g., ECG, blood pressure, heart rate) are analog signals, and a conversion must occur before they can be stored in digital form. This process is called analog-to-digital conversion (ADC).
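A simplified sketch of ADC in Python follows; the sine-wave signal, sampling count, and 8-bit range are illustrative assumptions, not a real device interface.

```python
import math

def adc(signal, n_samples, bits=8, v_min=-1.0, v_max=1.0):
    """Sample an analog signal and quantize each sample to an integer."""
    levels = 2**bits - 1
    samples = []
    for i in range(n_samples):
        t = i / n_samples
        v = signal(t)                  # "measure" the analog value at time t
        v = min(max(v, v_min), v_max)  # clip to the converter's input range
        samples.append(round((v - v_min) / (v_max - v_min) * levels))
    return samples

# One cycle of a sine wave standing in for a biological signal.
digital = adc(lambda t: math.sin(2 * math.pi * t), n_samples=8)
print(digital)  # eight integers between 0 and 255
```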

DATABASES AND DATABASE MANAGEMENT SYSTEMS

A database (DB) system is a computer-based record keeping system used to record and maintain certain types of information that have a significant value to some organization. A DB is a repository of stored data, which in general is both integrated and shared. Between the physical database and the users of the system is a layer of software, usually called the database management system (DBMS). All requests from the users to access the DB are handled by the DBMS.

When trying to organize the data and information within an organization, the DB helps the user in entering, storing, and retrieving it, and when trying to integrate all or part of the information of the enterprise the DB becomes a key player. Normally, within the DB, information is organized into data elements, fields, records, and files. In a system such as a hospital information system (HIS), a patient name is a data element, or field; a record could be related to that patient's visit on a particular date (e.g., date, diagnoses, treatments, charges, medications, tests) at a particular time; and a file would contain all the information from all the visits for that patient. An HIS DB will include not only patient files, but it could also have accounting information related to charges, inventory, payroll, and personnel records. With DB systems, different people can have access to different parts of the system, so, for example, not all employees will have access to laboratory results.

The organization and definition of the contents of the individual data elements, fields, records, and files are provided via a machine-readable definition called a "schema." This makes the logical location of the contents of a DB independent of their physical location. The DBMS not only "manages the DB" but also allows for entering, editing, and retrieving results, and it helps with the integration of data coming from multiple sources. The user can also access and retrieve specific types of information via queries.
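A minimal sketch of these ideas using Python's built-in sqlite3 module; the table and column names are hypothetical, not drawn from any real hospital information system.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE visits (
    patient_name TEXT,   -- a field (data element)
    visit_date   TEXT,
    diagnosis    TEXT,
    charge       REAL)""")

# Each row is a record of one visit; all rows for one patient
# together form that patient's "file."
db.executemany(
    "INSERT INTO visits VALUES (?, ?, ?, ?)",
    [("Doe, J.", "2001-03-05", "URI", 80.0),
     ("Doe, J.", "2001-06-12", "follow-up", 40.0),
     ("Roe, M.", "2001-03-07", "fracture", 250.0)])

# A query goes through the DBMS, which locates the records
# regardless of where they are physically stored.
rows = db.execute(
    "SELECT visit_date, diagnosis FROM visits "
    "WHERE patient_name = ? ORDER BY visit_date",
    ("Doe, J.",)).fetchall()
print(rows)  # [('2001-03-05', 'URI'), ('2001-06-12', 'follow-up')]
```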

A DB provides an organization with centralized control of its operational data. Some of the advantages of having centralized (versus distributed) control of the data are:

  • Redundancies can be reduced.
  • Inconsistencies can be avoided.
  • Data can be shared.
  • Standards can be enforced.
  • Privacy, confidentiality, authenticity, and security restrictions can be applied.
  • Integrity can be maintained.
  • Conflicting requirements (among users) can be balanced (for the enterprise).
  • Data is easier to support (the single repository, the application, and the end users).

Due to technological advancements, databases today are much more complex than a few decades ago. They contain "multimedia" information, such as text, graphics, scanned images from documents, clinical images from all modalities (X-rays, ultra-sound, MRI, CT scan), still and dynamic studies, and sound. When doing population studies, the creation of data "warehouses" is necessary, and data "mining" techniques are used to extrapolate results. In public health, the data needed for a study can reside in a small computer, in a local area network (LAN), or in a wide area network (WAN). In order to use information that is geographically distributed (and/or with distributed users) it is important to learn techniques for data integration and data communications. Because of the continuing fusion of computers and communications, this is the fastest changing area within information technology.

INTERNET AND THE WORLD WIDE WEB

There is little historical precedent for the swift and dramatic growth of the Internet, which was originally a limited scientific communication network developed by the U.S. government to facilitate cooperation among federal researchers and the university research community. With its rapid adoption by the private sector, the Internet has remained an important research tool, and it is also becoming a vital ingredient in maintaining and increasing the scientific and commercial leadership of the United States. In the twenty-first century, the Internet will provide a powerful and versatile environment for business, education, culture, entertainment, health care and public health. Sight, sound, and even touch will be integrated through powerful computers, displays, and networks. People will use this environment to work, study, bank, shop, entertain, visit with each other, and communicate with their health care providers. Whether at the office, at home, or traveling, the environment and its interface will be largely the same, and security, reliability, and privacy will be built in. Benefits of this dramatically different environment will include a more agile economy, improved health care (particularly in rural areas), less stress on ecosystems, easy access to lifelong and distance learning, a greater choice of places to live and work, and more opportunities to participate in the community, the nation, and the world.

Internet and WWW Acronyms. People who communicate with each other electronically may not have the same "platform." "Cross-platform" means that people do not have to use the same kind of operating system to access files on a remote system. There are two basic mechanisms for accessing the Web: (1) using the telephone system to link to another computer or network that is connected to the Internet, and (2) connecting to a network and, from there, to the Internet. An Internet service provider (ISP) may be required to access the Internet. An important factor regarding Internet access is bandwidth, which determines how much data a connection can accommodate and the speed at which data can be accessed.

Information on the Web is generally written in Hypertext Markup Language (HTML), which is a text-based markup language that describes the structure of a Web document's content and some of its properties. It can also be viewed as a way of representing text and linking it to other resources, such as multimedia files, graphic files, still or dynamic image files, and sound files. HTML contains the information or text to be displayed and the control needed for its display or playback.
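A short Python sketch makes this concrete: the markup below is a hypothetical page containing display text plus a link to another resource, and the standard html.parser module pulls the link out.

```python
from html.parser import HTMLParser

# A tiny hypothetical HTML document: text to display, structural
# markup, and a hyperlink to another resource.
page = """<html><body>
<h1>Weekly Report</h1>
<p>See the <a href="stats.html">surveillance statistics</a>.</p>
</body></html>"""

class LinkCollector(HTMLParser):
    """Collect the href targets of all <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

parser = LinkCollector()
parser.feed(page)
print(parser.links)  # ['stats.html']
```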

Navigation Tools. Prior to the use of Web browsers, there were several Internet navigation tools that required more user expertise than the modern browser, including:

  • File Transfer Protocol (FTP), a cross-platform protocol for transferring files to and from computers anywhere on the Internet.
  • Gopher, a tool for browsing files on the Internet.
  • Usenet, a worldwide messaging system through which anyone can read and post articles to a group of individuals who share the same interests.
  • Wide Area Information Server (WAIS), one of a handful of Internet search tools that can be spread across the network to scour multiple archives and handle multiple data formats.
  • Hyperlink (also called link), a pointer from text, from a picture or graphic, or from an image map, to a page or file on the World Wide Web; hyperlinks are the primary way to navigate between Web pages and among Web sites.

Today, a Web browser is the main piece of software required by the end user to find information through the Internet. Some of the most popular browsers are Lynx, Mosaic, Netscape Navigator/Communicator, and Internet Explorer. Lynx is a text-only Web browser; it cannot display graphical or multimedia elements. Mosaic was the first "full-featured" graphical browser for the Web. It was developed by a team of programmers at the National Center for Supercomputing Applications (NCSA). One of these programmers, Marc Andreessen, later formed Netscape, whose Navigator/Communicator became one of the most popular Web browsers. Internet Explorer is Microsoft's Web browser.

Web Resources. A Uniform Resource Locator (URL) describes the protocol needed to access a particular resource or site on the Web and points to the resource's Internet location. URLs are, in short, used to locate information on the Web.

Normally the URL is composed of six parts:

  1. The protocol or data source (e.g., ftp://, gopher://, news://, telnet://, wais://, http://)
  2. The domain name (for the Web server where the desired information resides)
  3. The port address
  4. The directory path (location of the Web page in the Web server's file system)
  5. The object name
  6. The spot (precise location within the file)
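Python's standard urllib.parse module splits a URL into most of these parts; the URL below is hypothetical.

```python
from urllib.parse import urlparse

url = "http://www.example.org:8080/reports/2001/summary.html#methods"
parts = urlparse(url)

print(parts.scheme)    # protocol: http
print(parts.hostname)  # domain name: www.example.org
print(parts.port)      # port address: 8080
print(parts.path)      # directory path plus object name
print(parts.fragment)  # the "spot" within the file: methods
```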

Protocols are the rules and formats that govern the methods by which computers communicate over a network. Protocols link clients and servers together and handle requests and responses, including making a connection, making a request, and closing the connection. Transmission Control Protocol/Internet Protocol (TCP/IP) is the full set of standard protocols used on the Internet. Hypertext Transfer Protocol (HTTP) is an Internet protocol specifically for the World Wide Web. It provides a way for Web clients and servers to communicate, primarily through the exchange of messages.
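The request half of that exchange is plain text. The sketch below only assembles an HTTP/1.0-style GET message as a string (no connection is opened); the host and path are illustrative.

```python
def build_get_request(host, path):
    """Assemble the text of a minimal HTTP GET request."""
    lines = [
        f"GET {path} HTTP/1.0",  # request line: method, resource, version
        f"Host: {host}",         # header naming the server
        "",                      # blank line marks the end of the headers
        "",
    ]
    return "\r\n".join(lines)

request = build_get_request("www.example.org", "/index.html")
print(request)
```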

Multipurpose Internet Mail Extension (MIME) is a technique designed to insert attachments within individual e-mail files. MIME allows a Web server to deliver multiple forms of data to the user in a single transfer. A Web page, likewise, can include text files as well as nontext files, such as sound, graphics, still images, and video.
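Python's standard email.message module can sketch this: a text body plus a nontext attachment (a stub byte string standing in for real image data) become a single multipart MIME message.

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["Subject"] = "Lab results"
msg.set_content("Attached: this week's chart.")  # text part

# A stub byte string stands in for real image data.
msg.add_attachment(b"\x89PNG...", maintype="image", subtype="png",
                   filename="chart.png")         # nontext part

# The two parts travel as one multipart transfer.
print(msg.get_content_type())  # multipart/mixed
```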

Intersection of Information Technology and Public Health. The applications of IT in public health are numerous and varied. One particularly important example is the use of Geographical Information Systems (GIS). Using GIS, public health officials can build very effective procedures for their tasks. Working in a feedback loop, they can measure, plan, act, and measure again. In this manner, officials can identify a problem (e.g., cancer) by measuring data from a registry. They can then select a target population (e.g., for breast cancer) and, with the health care provider community, develop an implementation strategy for an intervention plan. Finally, by measuring again, GIS allows public health officials to evaluate the impact of the intervention plan on that data registry.

GIS is thus an information technology that can help improve health care and public health in many areas, such as disease tracking, outbreak investigation, geostatistical analysis, and the routing of health workers. As a means of tracking, the residential zip codes of patients who appear at different clinics can be plotted along with the signs and symptoms of a selected diagnosis (e.g., upper respiratory infections [URI]), which is a marker for some toxic biological agents. Furthermore, community outbreaks of infectious diseases such as measles can then be quickly analyzed using GIS tools. Color shading can indicate areas with certain levels of morbidity probability, or likelihood of getting sick, while areas that require immediate interventions, such as immunizations, can be depicted by a different shade. Geostatistical analysis is one of the most powerful tools available to a public health department. With a relatively small number of sampling points, predictive maps can be quickly produced to show the likely extent of threats to public health. This mode of forecasting allows for the effective and efficient allocation of health care resources in a community.
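The tabulation step behind such a shaded map can be sketched in Python; the zip codes, diagnoses, and threshold below are invented for illustration.

```python
from collections import Counter

# Hypothetical clinic visits: (residential zip code, diagnosis).
visits = [
    ("15213", "URI"), ("15213", "URI"), ("15213", "fracture"),
    ("15232", "URI"), ("15217", "URI"), ("15213", "URI"),
]

# Count URI cases per zip code -- the table a GIS would shade by value.
uri_by_zip = Counter(z for z, dx in visits if dx == "URI")
print(uri_by_zip.most_common())  # [('15213', 3), ('15232', 1), ('15217', 1)]

# Zip codes above a threshold get the darker "intervene now" shade.
hotspots = [z for z, n in uri_by_zip.items() if n >= 2]
print(hotspots)  # ['15213']
```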

GIS can also help create disease-focused databases representing patients from a specific user-defined geographic area. In this fashion, the impact of a toxic release or exposure on a target population can be measured. GIS is a powerful tool for supplying immediate visualization of likely geographic exposures; it allows an analyst to examine the variables that might affect the "fallout" of sprayings and to estimate its extent. Through the use of computer-aided design tools and GIS, medical centers as well as clinics are increasingly monitoring their patient care environments to assist managers in evaluating risk for highly contagious diseases and implementing control and isolation programs.

GIS helps health organizations visualize diagnostic and geographic information simultaneously and dynamically. Over 14,000 ICD-9 and ICD-10 codes describe medical diagnoses, treatments, and medical events worldwide. Public health clinics, hospitals, managed care organizations, and health insurers use this application to conduct data mining on very large clinical and administrative data warehouses.

In public health education, GIS can be an analytical tool of choice for health promotion staff when deciding where to target public health messages and warnings. GIS is also used to create interactive maps for health organizations that are required to publish information to the public, depicting geographic areas and regions where infectious diseases and threats to the public's health are imminent.

Luis G. Kun

(see also: Communication for Health; Communication Theory; Data Sources and Collection Methods; Information Systems; Internet)


Kun, Luis G. "Information Technology." Encyclopedia of Public Health. 2002. Encyclopedia.com. 25 Sep. 2016 <http://www.encyclopedia.com>.

Information Technology

Information technology (IT) is a general term used to cover most aspects of computer-related technology. Intimately connected with information and information theory, IT deals with the representation, storage, manipulation, and transmission of information in digital form. The religious significance of information technology must be considered in the light of a general theological view of the nature and purpose of human life. Wherever any medium comes to permeate and shape almost all aspects of social and individual existence, questions can be asked about the direction in which such changes lead, and whether they are favorable or inimical to the purpose of human life as conceived theologically.

At the center of the debate lies the broader question of the way in which human beings represent, model, and shape the world. As computer scientist Joseph Weizenbaum (1923–2008) once put it, ". . . the computer is a powerful new metaphor for helping us to understand many aspects of the world, but . . . it enslaves the mind that has no other metaphors and few other resources to call on" (p. 277).

IT began as a tool that human beings could use as they saw fit. In less than half a century it came to occupy an indispensable place in the world. No single human being altogether controlled that rise, and no single human being understands all its implications. The question is rapidly becoming whether human beings will control information technology or information technology will control human beings. Is the demand that people render the processes of human life in forms that are susceptible to digitization forcing them to alter the way they live their lives without giving them a chance to decide whether those changed lives are the ones they wish to lead? That question forces people to examine, perhaps as they have never examined before, the things they think valuable about human life.

Digitization adds a new dimension to the philosophical issues associated with representation. The human sensory system limits what people can experience, and their intellectual systems attempt to compensate for those limitations with imagination. By universalizing concepts and generalizing theories from experience of particulars, humans have achieved an understanding of the universe of extraordinary power, but that power is not without its costs and drawbacks. Universal concepts overwrite the particularities of specific instances, just as Plato believed they should, but they lose sight of detail when number and quantity, statistics and probability, replace specificity. Once the world is digitized, this process takes another step toward unreality: a computer stores data in a medium that cannot retain every detail, and it presents people with clear-cut images, data, and constructs that bear a more remote connection to the "real world" than their usual appropriation in human intellectual systems.

Conceptual clarity, of course, has its power and its uses. By concentrating first on idealized, simplified situations, processes that are unimaginably complex in reality can begin to be grasped. Computer-generated models of the workings of a living cell (its DNA replication and division, its immune-system response to attack by viruses, and so forth) illuminate and clarify. But the reality is far less clear-cut, just as real digital signals are far messier than the beautifully symmetric square waves used to represent them.

The more pervasive IT becomes, the more it will tend to influence, shape, and direct human lives. In itself it is no more a force for good or evil than other tools, but the range of its influence makes it unlike most other technological changes. The way colored glass affects everything seen through it affords an analogy: As people come to conceptualize the world more and more in terms borrowed from IT, does a time arise when IT comes to shape their view of the world rather than transmit and interpret the world for them to view? At a very basic level, IT does not answer questions about what people should do with it. It is open and indifferent to that use. But it is easy to overlook the way it constrains what people can see, what they aspire to see, and how they see it.


Uses and impact

Security and surveillance. Information transmitted down wires or by radio waves is inherently more vulnerable to interception than information retained in a vault, and so security measures have been developed to match the increased threat. Chief among these are advanced methods of data encryption using encoding systems that are virtually impossible to break, even by the most advanced computer systems.
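To make the idea concrete, the one-time pad is the textbook example of an encoding system that is genuinely impossible to break when the key is random, kept secret, and never reused. The Python sketch below is illustrative only (the message and function names are invented for the example, and real encryption products use other algorithms):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR each plaintext byte with a fresh random key byte."""
    key = secrets.token_bytes(len(plaintext))  # truly random, used only once
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR is its own inverse, so decrypting repeats the same operation."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"meet at noon")
assert otp_decrypt(key, ct) == b"meet at noon"
```

Practical systems trade the one-time pad's unwieldy key management (a key as long as the message, never reused) for short, reusable keys that are still computationally infeasible to break.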

The ability to render data safe by encryption also has the potential to prevent those responsible for surveillance from decoding messages between subversives, terrorist groups, criminals, pedophiles, and others deemed socially undesirable. There have therefore been attempts to restrict access to high-performance encryption systems, to forbid the transmission of heavily encrypted signals over the Internet, and to prohibit the export of encryption software likely to enable data to be made impregnable to snooping.

Weaponry and conflict. Many of the pressures that have produced advances in the understanding and command of guidance and control systems have arisen from military applications. Warfare has been transformed by advanced technology. "Smart" bombs that seek specific targets, "jamming" devices that interfere with the ability of an enemy to communicate on the battlefield, software viruses that disrupt control systems, eavesdropping on email and digital telephone calls, and electronically disseminated misinformation are all part of the stuff of modern warfare and state security. But smart bombs are not as smart as people are led to believe, and the technology has proved less reliable than military and political leaders insinuate.


Work and society. Information technology significantly alters the parameters governing the way human beings cooperate to achieve their goals. The manufacture of physical objects requires people to be physically present at the site of their construction, in however widespread a way the components are manufactured. Before electronic mail and data communication through the Internet, office work similarly required people to be collected together in their workplaces. Where added value arises largely from the manipulation of data strings (through programming, database design and construction, composing and editing text, and so forth), this physical juxtaposition is unnecessary. People can relate across digital channels through video conferencing in ways that significantly reduce the need and opportunity for physical meetings.

It is not yet clear what the consequences of this shift in work patterns will be, and they are not unique in human history. Just as the industrial revolution drew populations to the cities and the invention of the telephone and radio communication had a major impact on the relationship between society and work, so decentralized but cooperative data-working will effect further changes in that relationship. The threat of loneliness will increase alongside the opportunities for freer work patterns and wider circles of friends, and many have found their experiences of "virtual communities" deeply unsatisfying and unfulfilling.

Viruses, hacking, and censorship

The destruction of the modern Eden of computer-generated communication by deliberately made viruses is a story of almost biblical proportions. The fact that computers must be accessible to a public domain to receive email or access Internet websites makes them vulnerable to attack from malicious software that attaches itself to email and downloadable packages. Executable files and attachments, once opened, infect the host machine and commonly export themselves to other machines by spawning copies of themselves in bogus messages sent to all or some of the entries in the local address book. The damage viruses cause to commerce worldwide is already measured in billions of dollars, and antivirus software, which struggles to immunize systems against viruses that become more sophisticated every day, adds to that cost.

Hacking, as the process of gaining unauthorized access to another computer is known, is also a major problem. Just as authors of software viruses regard every new defensive shield as a new challenge, so all the sophisticated mechanisms that are employed to prevent unauthorized access to a computer represent a similar challenge. Hackers' conventions set up competitions where the winners are those who can most successfully penetrate the defenses devised by other competitors, and there have been many instances where commercial, national defense, and other secure systems have been penetrated. Some hackers are motivated by no more than the intellectual challenge; some are malicious; some are politically motivated; some are disgruntled employees; some are just socially disenfranchised and angry.

The location of the physical machine hosting a website is not easy to discover. As a result, it is difficult to police the Internet in order to impose any kind of censorship. But it is not clear whose responsibility or entitlement it is to act either as censor or police force. National governments and international organizations are frequently thwarted in their attempts to track subversive, criminal, or other groups by the lack of boundaries on the Internet.

The most obvious cases where some believe censorship should be imposed are sites posting, advertising, and selling sexual material. Others include terrorist organizations, industrial saboteurs, and all sorts of political activists. But here as everywhere the boundaries between public security and private freedom are hard to define.

On the other hand, the difficulty of policing the Internet affords a means to support and help oppressed minorities in countries where they are persecuted. It enhances freedom of speech and expression. It joins together those who find themselves in minorities. It affords the means for all kinds of propaganda wars to be waged. It allows books and art and music to be made available to the poor and to those who live where some material is prohibited or circumscribed. All of these opportunities can, of course, be used for good and ill, and whether the good outweighs the ill remains to be seen.


Reality and virtual reality

Sciences and religions strive to increase knowledge and awareness of what they take variously to be "reality." They have argued extensively and bitterly about the boundaries of "reality," even though their conceptions of reality have grown and changed through the centuries.

The term virtual reality is generally taken to denote that new realm of experience fabricated with the aid of IT from the connections between people throughout the world and the capabilities of software to generate new kinds of communication and even new fictional environments in which they can interact. There is nothing in principle to prevent people living on opposite sides of the globe from donning some sort of virtual-reality headset and sharing the exploration of an entirely unreal virtual habitat.

Virtual habitats are not, of course, new. Every fictional book ever written has created virtual habitats for the human imagination, and so, more recently, have films. It is the interactive capacity of virtual realities that is new and poses sharp questions about what people take to be the nature and purpose of human existence.


Individuals and societies

A person's sense of self has typically been associated with a certain geographical locality, a workplace, and a group of friends largely drawn from his or her own nation. People and their cultures are intimately intertwined, even if every culture consists of a myriad of subcultures with their own mores and customs. Selves are distributed through these cultures, and people know themselves as reflected and invested in them.

Because information technology offers people the opportunity to associate with anyone in the world with access to the Internet in a way that far surpasses in immediacy and intimacy anything possible through the telephone or "snail mail" (through email, video conferencing, websites, chat rooms, and so forth), it is now possible to withdraw from the community defined by a locality, a geographically defined subculture, or a nation-state, and to find (or lose) oneself in the greater culture that exists through the interactions of persons on the Internet.

It is often suggested that computer technology has made human beings less sociable or neighborly. Now that people can choose like-minded conversational partners from anywhere in the world, they are supposedly less minded to socialize with their neighbors. It is not obvious that this is true. Computer technology is as ambiguous as was the television, the telephone, or the motorcar.

Computer communities do, however, break national boundaries without the need for expensive travel, and it is certainly arguable that greater international fraternization will reduce rather than increase the long-term threat of war. What is not clear is the extent to which having the world as one's neighbor will make one less able to negotiate tolerantly with those physical neighbors who surround one every day, or whether exposure only to those who agree will make one less tolerant of those who do not.

Although it is not true that the Internet has spawned "virtual" communities as an entirely new phenomenon (they have always existed through newsletters, journals, conferences, and the like), it has certainly made their activities more widespread and the frequency of their interactions much greater.

Whatever interest people have, there is almost certainly an Internet community that shares that interest. Through online discussions, websites, mass-circulation email, and so forth, such groups establish both their mutual interest and, usually, considerable interpersonal rapport that spills over into wider aspects of life. Participants will commonly share their joys and sorrows, support one another, and exercise general pastoral care for the group. This phenomenon has led some to suggest that the World Wide Web may facilitate the generation of a new kind of religious community in which mutual care and even worship arise within a virtual world rather than geographically close localities or through meeting eclectically in physical buildings.

Embodiment and realism challenged

Science and religion agree that human beings are embodied: theirs is a finite, physical existence in a physical world, and life has a beginning and an end. These things occasion no disagreement, even if the nature of the beginning and the end does. Human evolutionary history has been dictated by this physicality, and the need to reproduce, feed, and survive as individuals and species has been deeply influential in making all creatures what they are. Virtual selves challenge this history by providing an intelligible alternative in which people might one day come to exist not as physically embodied selves but as remote functional intellectual agents, standing evolution on its head by adapting the world to fit human imaginations rather than adapting human bodies to fit the world.

Most people recoil from this suggestion because they do not want to lose their physical embodiment. The pleasures of physical contact, whatever they may be, seem so central to what it is to be human that people want to stop in its tracks any process that would render them less than fully physical and embodied.

This instinctive reaction raises clear questions about what people really and genuinely and deeply value as human beings. Science, in its popularly conceived objectivity, cannot answer those questions because it is indifferent to them. For science, human beings and all living and nonliving things simply are what they are; there is no justifiable scientific view of what anything "ought to be." As soon as one asks how things "ought to be," one is in philosophical or religious or ethical territory; science strikes rock, and its spade is turned.

Philosophy and psychology enable people to see that there is no such thing as a raw perception neither filtered nor colored nor shaped by certain sorts of conceptual apparatuses. The world and what is designated reality are complex mixtures of sensory stimulation and intellectual construction. Software and hardware change the way human beings see the world, first as a matter of programming necessity, and later because the image of the world they have has been distorted by the information-theoretic format. One is also tempted to believe that the sheer quantity of information available on the Internet somehow replaces the filtered, processed knowledge imparted through more traditional means of dissemination.


IT models and reality

A theology of creation identifies the physical embodiment of persons as playing a major part in the achievement of the creator's purpose. Physical embodiment entails certain limitations imposed by sensory parameters and necessitates certain kinds of community and cooperation. The nature of the world comes to be construed in accordance with certain kinds of gregarious cooperative endeavor.

IT has the power to change the relationship between humans' perceptual and conceptual systems and the world. Digital clarity, arising from the cleansing of data of its inconvenient messiness, encourages one to reconfigure the world; virtual communities encourage one to reconfigure the parameters of friendship and love; software models first imitate and then control financial, political, and military worlds. The beginning of the twenty-first century is an age when the residual images of a predigital worldview remain strong; one can still see that there is a difference. A theology of creation suggests that this analogical unclarity is deliberate and purposive; a digital worldview may prove more incompatible with that creative story than currently supposed. The digital reconfiguration of epistemology may yet prove to be the most profound shift in human cognition in the history of the world, and the changed impression of reality that it affords will present any theology of creation with a deep new challenge.


See also Embodiment; Information; Information Theory


Bibliography

Hofstadter, Douglas R. Gödel, Escher, Bach: An Eternal Golden Braid (1979). London: Penguin, 1996.

Ullman, Ellen. Close to the Machine: Technophilia and Its Discontents. San Francisco: City Lights Books, 1997.

Weizenbaum, Joseph. Computer Power and Human Reason: From Judgment to Calculation (1976). London: Pelican, 1984.


John C. Puddefoot

Puddefoot, John C. "Information Technology." Encyclopedia of Science and Religion. 2003. Encyclopedia.com. 25 Sep. 2016 <http://www.encyclopedia.com>.

INFORMATION TECHNOLOGY

Information technology (IT) turns arduous chores into efficient tasks and corporate activities into achievable accomplishments. Online banking, electronic mail (e-mail) communications, ATM transactions, and Internet-based research are possible because of IT. IT has evolved into an essential component of everyday life.

The Information Technology Association of America (ITAA) provides a concise definition of IT as "the collection of products and services that turn data into useful, meaningful, accessible information." Tony Gunton provides a more comprehensive definition of IT as "electronic technologies for collecting, storing, processing, and communicating information, separated into two main categories: (1) those which process information, such as computer systems, and (2) those which disseminate information, such as telecommunication systems" (1993, p. 150). Specific equipment (computers) and software are needed to process data so that information can be acquired. IT relies on devices to electronically input, output, process, store, and retrieve data. Data may include, but are not limited to, text, graphics, sound, and video. Although IT is a complex entity, it makes daily tasks easier and more efficient.

Computers, networks, satellites, robotics, videotext, television, e-mail, electronic games, and automated office equipment are some of the many tools used in IT. The IT industry uses hardware and equipment such as computers, telephones, World Wide Web sites, transaction machines, and office equipment to transfer information. Specific software and services are used to ensure rapid processing of information that is reliable and secure.

HISTORY OF INFORMATION TECHNOLOGY

Although the term information technology first appeared in the 1970s, the basic concept can be traced to much earlier times, when the abacus (in use well before 1400), the movable-type press (1450s), and the slide rule (1600s) were considered the first "computers." Although these tools may seem primitive, these "analog" computers provided valuable information for their users.

IT then took a huge leap as military and business industries combined their efforts in the early 1900s. Together they were a major force in IT research and development. Punched cards and electrical pulses quickly gave way to vacuum tubes and electronic digital computers.

The first electronic digital computer was designed at the University of Pennsylvania by John Presper Eckert, Jr. (1919–1995) and John W. Mauchly (1907–1980) in 1945. The Electronic Numerical Integrator and Computer, or ENIAC, was designed to discover, monitor, and predict flight paths of weapons. ENIAC was built with 18,000 vacuum tubes and could provide a week's worth of information in one hour, but it was laden with maintenance problems.

The first commercial computer was the Universal Automatic Computer (UNIVAC), developed by Eckert and Mauchly in 1951. The UNIVAC I was used by the Census Bureau to predict the outcome of the 1952 presidential election. The development of ENIAC and UNIVAC I prompted an increase in IT research and development that continues into the twenty-first century. Computers are designed for a variety of purposes and are divided into four categories: supercomputer, mainframe computer, microcomputer, and minicomputer. The categories are defined by size, cost, and processing ability.

Supercomputers are developed for use in science and engineering, for designing aircraft and nuclear reactors, and for predicting worldwide weather patterns. These computers are of significant size and cost millions of dollars.


Information is processed quickly using multiple processors. Few supercomputers exist because of their cost.

Mainframe computers are large general-purpose computers requiring special attention and controlled atmospheres. They are used in large corporations to calculate and manipulate large amounts of information stored in databases. Mainframe computers are high-speed, multi-purpose machines that cost millions.

Microcomputers were introduced in 1975 by Micro Instrumentation and Telemetry Systems (MITS). These desktop computers were designed using a single-chip microprocessor as their processing element. Tandy Corporation quickly followed MITS by offering Radio Shack's first microcomputer in 1976. The Apple microcomputer was introduced in 1977. IBM introduced its first personal computer (PC) in the fall of 1981, causing a dramatic increase in the microcomputer market. The microcomputer is generally known as a PC. Prices for PCs range from $500 to $2,000. Because of dramatic improvements in computer components and manufacturing, personal computers do more than the largest computers of the mid-1960s did, at a fraction of the cost.

Minicomputers came onto the scene in the early 1980s in small businesses, manufacturing plants, and factories. Minicomputers are multitasking machines that connect many terminals to one another and to a mainframe computer. As such, they are able to process large amounts of data. Minicomputer systems (desktop, network, laptop, and handheld devices) range in price from $15,000 to $150,000.

Since the 1950s, four generations of computers have evolved. Each generation reflected a decrease in hardware size but an increase in computer operation capabilities. The first generation used vacuum tubes, the second used transistors, the third used integrated circuits, and the fourth used integrated circuits on a single computer chip. Advances in artificial intelligence that will minimize the need for complex programming characterize the fifth generation of computers, still in the experimental stage.

INFORMATION TECHNOLOGY PHASES

Information processing involves five phases: input, process, output, storage, and retrieval. Each of these phases and the devices associated with each are discussed below.

Input

Input refers to information or stimulus that enters a system. Input can range from commands entered at the keyboard to data received from another computer. Input devices include the keyboard, pointing devices (such as mice), scanners and reading devices, digital cameras, audio and video input devices, and input devices for physically challenged users. Input devices are used to capture data at the earliest possible point in the workflow, so that the data are accurate and readily available for processing.

Processing

Processing occurs after data have been entered into the computer. When data are processed, they are transformed from raw facts into meaningful information. A variety of processes may be performed on the data, such as adding, subtracting, dividing, multiplying, sorting, organizing, formatting, comparing, graphing, and summarizing. Data processing includes the input, verification, organization, storage, retrieval, transformation, and extraction of information from data. Processing can also include the execution of a program.
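As a small illustration of this phase, the following Python sketch (using hypothetical sales figures) organizes, sums, and sorts raw facts into meaningful information:

```python
# Raw facts as they were input: (product, units sold) pairs
raw = [("pens", 120), ("pads", 45), ("pens", 80), ("clips", 300)]

# Process: organize the records by product and total the units
totals = {}
for product, units in raw:
    totals[product] = totals.get(product, 0) + units

# Meaningful information: products ranked from best to worst seller
ranked = sorted(totals.items(), key=lambda item: item[1], reverse=True)
print(ranked)  # [('clips', 300), ('pens', 200), ('pads', 45)]
```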

Output

Output is information that comes out of a computer. Four common types of output are text, graphics, audio, and video. After the information has been processed, it can be listened to through speakers or a headset, printed onto paper, or displayed on a monitor. An output device is any computer component capable of conveying information to a user. Commonly used output devices include display devices, printers, speakers, headsets, data projectors, fax machines, and multifunction devices. A multifunction device is a single piece of equipment that looks like a copy machine but provides the functionality of a printer, scanner, copy machine, and perhaps a fax machine.

Storage

Storage refers to a variety of techniques and devices that retain data. Storage devices preserve items such as data, instructions, and information for retrieval and future use. Storage is measured in a hierarchy of bytes.

  • Bit: single unit of data coded in binary form (0 or 1)
  • Byte: most commonly composed of 8 bits (combinations of 0s and 1s)
  • Kilobyte: 1,024 bytes
  • Megabyte: 1,024 kilobytes, or about 1 million bytes
  • Gigabyte: 1,024 megabytes, or about 1 billion bytes
  • Terabyte: 1,024 gigabytes, or about 1 trillion bytes

Devices used to store data include floppy disks, hard disks, compact disks (both read-only and disk-recordable), tapes, PC cards, smart cards, microfilm, and microfiche. Portable drives (flash drive/jump drive) can also serve as storage devices.
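The hierarchy above is easy to apply in code. This Python sketch (the function name is invented for the example) walks up the table, dividing by 1,024 at each step, to express a raw byte count in the most readable unit:

```python
UNITS = ["bytes", "KB", "MB", "GB", "TB"]

def human_size(n: float) -> str:
    """Express a byte count in the largest unit that keeps the value below 1,024."""
    for unit in UNITS:
        if n < 1024 or unit == UNITS[-1]:
            return f"{n:g} {unit}"
        n /= 1024  # step up one level in the hierarchy
    return f"{n:g} {UNITS[-1]}"  # not reached; keeps the return type total

print(human_size(512))      # 512 bytes
print(human_size(1024))     # 1 KB
print(human_size(1024**3))  # 1 GB
```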

Retrieval

Retrieval is the ability to search for and locate information that has been stored. The information can be text, sound, images, or data. Information retrieved may include documents, information within documents, and information within a stand-alone database or hyperlinked database such as the Internet or intranets.
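A bare-bones retrieval routine can be sketched in a few lines of Python. The stored "documents" here are hypothetical examples; real retrieval systems add indexing and ranking on top of the same idea:

```python
# A stand-alone "database" of stored documents, keyed by ID
documents = {
    1: "ENIAC was designed to predict flight paths of weapons",
    2: "UNIVAC I predicted the outcome of the 1952 presidential election",
    3: "Supercomputers predict worldwide weather patterns",
}

def retrieve(query: str) -> list[int]:
    """Return the IDs of all stored documents containing the search term."""
    term = query.lower()
    return [doc_id for doc_id, text in documents.items() if term in text.lower()]

print(retrieve("predict"))  # [1, 2, 3]
print(retrieve("UNIVAC"))   # [2]
```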

IT drives the educational, business, medical, and military worlds. As such, it is imperative that the phases of IT work together seamlessly to input, process, display (output), store, and retrieve data. Continuous research and development is needed to meet the future needs of the world.

THE FUTURE OF INFORMATION TECHNOLOGY

The future of IT is promising. People use computers in new ways every day. Computers are increasingly affordable, more powerful, and easier to use. Communication needs will continue to grow; the functions of e-mail, instant messaging, Weblogs, and wireless communications will improve as the demands of informational society increase. Daily tasks will continue to be enhanced as more people use Web-based technologies.

Potential problems concerning IT center on its delicate infrastructure. Educational, business, and military systems are mindful of the underlying foundation necessary to support their respective communities. In fact, researchers are already hard at work exploring possible solutions to these infrastructure concerns. One proposed solution is the creation of a "mobile Internet"; another is the automation of data integration.

What will the future hold for IT? While questions exist, one thing is certain: IT will continue to grow and adapt, making life more enjoyable and efficient.

see also Hardware; Information Processing; Information Technology; Office Technology

bibliography

Fryman, Harriet (2004, March 1). The future of IT is automation. Retrieved September 21, 2005, from http://www.cioupdate.com/reports/article.php/3319601

Gunton, Tony (1993). A dictionary of information technology and computer science (2nd ed.). Manchester, England: NCC Blackwell.

Information Technology Association of America. (n.d.). The U.S. information technology industry: A brief overview. Retrieved September 20, 2005, from http://www.itaa.org/eweb/DynamicPage.aspx?webcode=LTII&wps_key=86291cb4-0e13-41c7-89f0-e0767fcf4eb6

National Coordination Office for Information Technology Research and Development. (2001, February). Using information technology to transform the way we learn. Report to the president. Arlington, VA. (ERIC Document No. 462 969)

Reiser, R. A., and Dempsey, J. V. (2002). Trends and issues in instructional design and technology. Upper Saddle River, NJ: Prentice-Hall.

Reynolds, P. (2005). A vision of the Internet in 2010. In Les Lloyd (Ed.), Best technology practices in higher education (pp. 193–200). Medford, NJ: Information Today.

Smaldino, Sharon E., et al. (2005). Instructional technology and media for learning (8th ed.). Upper Saddle River, NJ: Pearson/Merrill/Prentice Hall.

Charlotte J. Boling

Boling, Charlotte. "Information Technology." Encyclopedia of Business and Finance, 2nd ed. 2007. Encyclopedia.com. http://www.encyclopedia.com/doc/1G2-1552100167.html

Information Technology (IT)

INFORMATION TECHNOLOGY (IT)

Information technology (IT) broadly describes the processing and management of data in computer systems. Within IT's wide scope are the hardware (including hard drives, modems, monitors, servers, mainframe systems, and routers) and software (word processing and spreadsheet programs, Web browsers, and databases) that make the movement, manipulation, and storage of information possible. Thus, IT also gives life to the Internet, the World Wide Web, and e-commerce. A 1999 Computer Weekly poll identified the World Wide Web (1989); the first IBM PC (1981); and COBOL (1959), a high-level programming language used for writing business software, as the top three IT developments of all time. E-mail (1971); VisiCalc (1979), the first spreadsheet program; MS-DOS (1980); and the Apple Macintosh computer (1984) were among the top ten.

From the early 1970s onward, computers and electronic information were increasingly critical elements of the corporate landscape. Large companies devoted entire departments to information technology. These IT departments went by a variety of names, including information systems (IS) and management information systems (MIS). E-commerce created additional demand for IT workers. Although there were many layoffs in the technical industry during the early 2000s due to poor market conditions and failing Internet companies, overall demand remained strong in mid-2001, according to InternetWeek. This was especially true for workers with e-commerce and Web development experience. InfoWorld identified plenty of opportunity for IT workers, especially those with the ability to use programming languages like Java and C++.

When the Internet and e-commerce exploded in popularity, many companies spent hefty sums on IT in an effort to keep up with or exceed the competition. According to Fortune, spending on software and equipment grew at an annual rate of 4 percent in the last quarter of 1999 and 21 percent in the first quarter of 2000. However, due partly to worsening economic conditions, this had changed by March 2001. At that time, a survey by Merrill Lynch found that chief information officers in the United States and Europe were planning to scale back IT spending on items such as mainframe computers, printers, consulting, and outsourcing. Conversely, spending on Internet-related technologies, including servers, wireless products, and storage, was expected to remain strong. International Data Corp. (IDC) forecast stronger growth in IT spending throughout the rest of the world, with the strongest potential in Australia, Western Europe, and developing markets such as Latin America, the Middle East, Africa, Eastern Europe, and Asia.

The term IT covers a mind-boggling number of brands, variations, and kinds of computer systems, platforms, devices, applications, and products. As consumers and businesses purchase these products over time, issues of integration and compatibility frequently arise. Because of such issues, companies rely on relationships with the vendors from whom they purchase products for technical advice and support. In addition to hiring IT professionals of their own, organizations also rely heavily on consultants to improve the functionality of systems and processes.

Although consultants often provide strategic value to companies, such is not always the case. Like other business practices, there are advantages and drawbacks to using consultants. As explained in Computerworld, "IT has always depended on strategic relationships with vendors and its heavy use of consultants to a degree that's unmatched in any other field of business. That's because no company can go it alone. The best consultants either provide special skills, handle the ever-growing IT workload and provide development and integration capabilities or take on the management of large-scale projects. The worst consultants believe their companies are smarter than their IT clients, instead of recognizing that they're extensions of their clients' resources."

Advances in IT and the widespread adoption of the Internet allowed e-commerce to develop and evolve. Some professionals held that by eliminating human involvement from business transactions, companies would achieve greater profits. While this may be true to a degree, the human element still was very important to a company's success in the early 2000s. Some leading organizations found customer service delivered by human employees to be an important differentiator in a competitive market where all players had access to similar technology. Thus, the human touch was still important for developing and maintaining customer relationships, problem solving, helping customers to understand and accept technology used for e-commerce, and more. As e-commerce evolves, so will IT. Each is a critical piece of a larger whole.

FURTHER READING:

Bernasek, Anna. "Buried in Tech." Fortune, April 16, 2001.

"Corporations to Cut IT Budgets." Nua Internet Surveys. March 7, 2001. Available from www.nua.ie/surveys.

"Information Technology." Techencyclopedia, May 7, 2001. Available from www.techweb.com/encyclopedia.

"IT." Ecommerce Webopedia, May 7, 2001. Available from ecomm.webopedia.com

"IT 'Classics' Beaten by the Web in Reader Poll." Computer Weekly, November 25, 1999.

Joachim, David. "Report: IT Workers Still in Short Supply." InternetWeek, April 30, 2001.

Keen, Peter G. "Consultant, anyone?" Computerworld, March 12, 2001.

Prencipe, Loretta W. "Management Briefing: The Job Market: Are IT Professionals Working in a Time of Feast or Famine?" InfoWorld, April 9, 2001. Available from e-comm.webopedia.com.

"U.S. IT Spending to Slow, but Global Outlook Positive." Nua Internet Surveys. March 23, 2001. Available from www.nua.ie/surveys.

Whiteley, Philip, and Max McKeown. "The Human Face of IT." Computer Weekly, April 12, 2001.

SEE ALSO: Database Management; Knowledge Management

"Information Technology (IT)." Gale Encyclopedia of E-Commerce. 2002. Encyclopedia.com. http://www.encyclopedia.com/doc/1G2-3405300237.html

information technology

information technology (IT) Any form of technology, i.e. any equipment or technique, used by people to handle information. Mankind has handled information for thousands of years; early technologies included the abacus and printing. The last four decades or so have seen an amazingly rapid development of information technology, spearheaded by the computer; more recently, cheap microelectronics have permitted the diffusion of this technology into almost all aspects of daily life and an almost inextricable cross-fertilizing and intermingling of its various branches. The term information technology was coined, probably in the late 1970s, to refer to this nexus of modern, electronic-based technology for handling information. It incorporates the whole of computing and telecommunication technology, together with major parts of consumer electronics and broadcasting. Its applications are industrial, commercial, administrative, educational, medical, scientific, professional, and domestic.

The advanced nations have all realized that developing competence in information technology is important, expensive, and difficult; large-scale information technology systems are now economically feasible and there are national programs of research and education to stimulate development. The fundamental capabilities that are usually recognized to be essential comprise VLSI circuit design and production facilities, and a common infrastructure for the storage and transmission of digital information (including digitized voice and image as well as conventional data and text). Major research problems include improved systems and software technology, advanced programming techniques (especially in knowledge-based systems), and improved human-computer interfaces.

Daintith, John. "information technology." A Dictionary of Computing. 2004. Encyclopedia.com. http://www.encyclopedia.com/doc/1O11-informationtechnology.html

Information Technology

Information Technology

ADVANCED MICRO DEVICES, INC.

APPLE COMPUTER, INC.

COMPAQ COMPUTER CORPORATION

COMPUTER ASSOCIATES INTERNATIONAL, INC.

COMPUTER SCIENCES CORPORATION

CONNER PERIPHERALS, INC.

DIGITAL EQUIPMENT CORPORATION

HEWLETT-PACKARD COMPANY

ICL PLC

INTELLIGENT ELECTRONICS, INC.

INTERGRAPH CORPORATION

INTERNATIONAL BUSINESS MACHINES CORPORATION

LOTUS DEVELOPMENT CORPORATION

MICROSOFT CORPORATION

NATIONAL SEMICONDUCTOR CORPORATION

NCR CORPORATION

NOVELL, INC.

ORACLE SYSTEMS CORPORATION

STORAGE TECHNOLOGY CORPORATION

TANDEM COMPUTERS, INC.

UNISYS CORPORATION

WANG LABORATORIES, INC.

XEROX CORPORATION

"Information Technology." International Directory of Company Histories. 1992. Encyclopedia.com. http://www.encyclopedia.com/doc/1G2-2841000081.html

information technology

information technology (IT) Computer and telecommunications technologies used in processing information of any kind. Word processing, the use of a database, and the sending of electronic mail (e-mail) over a computer network all involve the use of information technology. Television stations employ information technology to provide viewers with teletext services. IT has revolutionized retailing and banking through the development of bar codes. In manufacturing, IT has enabled the development of computer-aided manufacture (CAM). See also artificial intelligence.

"information technology." World Encyclopedia. 2005. Encyclopedia.com. http://www.encyclopedia.com/doc/1O142-informationtechnology.html

information technology

information technology (IT) See CYBERSOCIETY.

Marshall, Gordon. "information technology." A Dictionary of Sociology. 1998. Encyclopedia.com. http://www.encyclopedia.com/doc/1O88-informationtechnology.html
