Practical Unix Programming:
A Guide to Concurrency, Communication, and Multithreading

by Kay A. Robbins and Steven Robbins

Chapter 1: What is Concurrency?

Concurrency refers to the sharing of resources in the same time frame. This usually means that several processes share the same CPU (that is, they execute concurrently) or share memory or an I/O device. Incorrect handling of concurrency can lead to programs that fail for no apparent reason, even on input for which they previously seemed to work perfectly. Operating systems manage shared resources, and in the past programmers could rely on the operating system to handle all aspects of concurrency. This is no longer the case for today's complex programs, which must run efficiently and robustly on modern computers. Multiprocessor machines on the desktop and distributed systems are examples of architectures in which concurrency control takes on new importance for systems designers. This chapter introduces the subject of concurrency and provides guidelines for programming on Unix systems in a concurrent environment.
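
To make those intermittent failures concrete, here is a minimal sketch, not taken from the text, of a race condition using POSIX threads (which later chapters treat in detail). Two threads increment a shared counter without synchronization; the file name race.c, the constant NITER, and the function increment are illustrative choices rather than identifiers from the book, and error checking is omitted for brevity.

    /* race.c -- two threads increment a shared counter with no synchronization.
     * Compile:  cc -o race race.c -lpthread
     */
    #include <pthread.h>
    #include <stdio.h>

    #define NITER 1000000L

    /* volatile only keeps the compiler from caching count in a register;
     * it does NOT make the increment atomic. */
    static volatile long count = 0;

    static void *increment(void *arg) {
        (void)arg;
        for (long i = 0; i < NITER; i++)
            count++;              /* unsynchronized read-modify-write */
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;

        pthread_create(&t1, NULL, increment, NULL);
        pthread_create(&t2, NULL, increment, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);

        /* Expected 2 * NITER, but the result typically falls short
         * and varies from run to run. */
        printf("count = %ld (expected %ld)\n", count, 2 * NITER);
        return 0;
    }

Because each increment is a separate read, modify, and write of memory, the two threads can interleave between the read and the write and silently lose updates. The program usually prints a total smaller than expected, and a different total on each run, which is exactly the kind of failure "for no apparent reason" described above.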
