Computers had become significant by the early 60’s, and some had already begun to focus on the increasing cost of developing software. Computers had been, and largely still were, so much more expensive than people that it seemed acceptable to ignore the convenience of the programmer in order to use the computer most efficiently. As the tasks for computers became more complex, so did the programs, and the task of debugging began to limit what computers could do. Adding more programmers did not speed the debugging process.

Stand Alone Computer Scheduling (Livermore 1954-1960)

A debug shot would consist of a 10 or 15 minute period when the machine was allocated to a programmer who was developing and debugging some program. Part of this time would be spent reading cards from the card reader at 150 cards per minute. Often a computer printout at the same speed was also required. This was clearly inefficient but apparently necessary within the batch paradigm.

At Livermore an additional problem arose: it took a computer operator several minutes to switch from one production job to another. This led to long machine runs in order to limit this task-switching overhead. Often the simulation that such a long run performed would go wrong part way thru, and the subsequent time spent on that run would be wasted. Assuring that a run was useful required occasional attention, and possible intervention, by the physicist throughout the computation. In short, the physics had to be debugged as well as the program.

Queued Processing (1960-1970)

IBM and other computer companies provided ‘batch processing’ software to automate this process by queuing tasks for the machine on a magnetic input tape. An executive program would ideally remain in core between jobs, directing the compiler, such as Fortran, to do its part and then causing the compiled code to be run. Printer output would be queued on another output tape to be printed offline later. This made somewhat more efficient use of machine time, but the intervention of operators was still constantly required, especially to find and mount magnetic tapes. The executive program was itself vulnerable to the largely undebugged programs it ran. The equipment to copy cards to tape and to print from tape was very expensive and operator intensive.
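The batch executive's job can be pictured as a simple loop. The sketch below is purely illustrative, assuming invented names (`run_compiler`, `run_program`, `batch_executive`) that stand in for the compile and execute steps of a real 1960s executive; it is not a description of any particular system.

```python
# Illustrative sketch of a batch executive's main loop: jobs are queued on
# an input tape, and printer output is spooled to an output tape so that
# printing can happen offline later. All names here are hypothetical.

from collections import deque

def run_compiler(source):
    """Stand-in for invoking the compiler (e.g. Fortran); returns object code."""
    return f"object({source})"

def run_program(obj):
    """Stand-in for executing the compiled job; returns its printed listing."""
    return f"listing for {obj}"

def batch_executive(input_tape):
    """Process queued jobs one after another, spooling listings for later."""
    output_tape = deque()
    for job in input_tape:                    # read the next job from the input tape
        obj = run_compiler(job)               # compile step
        output_tape.append(run_program(obj))  # run it; queue listing on output tape
    return output_tape                        # printed offline, freeing the machine
```

The point of the structure is that the expensive central machine never waits on the card reader or the printer; those slow devices are handled offline, at the cost of operators shuttling tapes.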

Timesharing

Online random access storage, to wit disks, was looming on the horizon. IBM had built the RAMAC some years before, and it became clear to some that a computer could divert its attention among a number of users very much more quickly than the stand-alone or queued-access paradigms allowed. These ideas developed along with emerging hardware protection schemes that would protect the executive programs mentioned above. There remained widespread skepticism that hardware protection would suffice for an executive to retain control in the midst of multiple undebugged programs.
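The way a machine "diverts its attention" among users can be sketched as round-robin time slicing: each user gets a short quantum in turn, so interactive response stays fast even with many users waiting. The sketch below is a minimal illustration under assumed names and units, not a reconstruction of any actual executive's scheduler.

```python
# Minimal round-robin time-slicing sketch. Each ready user runs for one
# quantum, then goes to the back of the queue. Names and time units are
# invented for illustration.

from collections import deque

def timeshare(users, quantum, total_time):
    """Rotate a fixed quantum among users until total_time is used up.

    Returns the order in which users received quanta."""
    ready = deque(users)
    schedule = []
    elapsed = 0
    while ready and elapsed < total_time:
        user = ready.popleft()
        schedule.append(user)   # user runs for one quantum
        elapsed += quantum
        ready.append(user)      # back to the end of the ready queue
    return schedule
```

With, say, a 100 ms quantum and three users, each user gets a turn every 300 ms, which feels nearly instantaneous at a teletype even though the machine is shared.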

The promise was that the programmer could afford to step thru his program to see just how it was behaving. Perhaps 10 or 20 such queries, none of which would require recompilation, could much more quickly advance the development of a new program. In some cases this would require more computer resources, and in some cases less. The critical question was often “How soon can we get the program right?”, not “How best can we minimize the computer resources required to debug the program?”. The latter question was moot when the computer was sitting idle for lack of developed software.

The vision was that a timesharing executive could deploy the development tools, such as compilers and assemblers, on behalf of the users, and also give a portion of the machine over to programs in development for short shots to let their developers test them. John McCarthy recounts the earliest thoughts and steps in this direction. CTSS was an early and well publicized success story. (Fano demonstrating CTSS and describing its motivation, about 1963) One cost reduction that had not been widely foreseen was the elimination of core dumps, in which one would print the entire contents of core upon program failure. These were expensive, but the alternative had been additional debug shots to get the clues that abbreviated dumps had omitted. Timesharing could keep the core image on disk and provide interactive access, as well as search functions that paper dumps could not. Soon the remote ancestors of gdb arose.

Unix arose in this milieu. Although it was originally devoted to the single-user machine, it gradually adopted the system features that supported timesharing.