Table of contents
1. The origin of the C programming language
2. The birth of the C programming language
3. Evolution of the language and its influence
The C programming language emerged from a series of developments at Bell Telephone Laboratories (Bell Labs) during a period of significant innovation in computer systems research. Its story begins in the late 1960s, when Bell Labs engineers were deeply involved in the ambitious but eventually unwieldy Multics operating system project. By 1969, the team, which included Ken Thompson, Dennis M. Ritchie, and others, had grown dissatisfied with the complexity and slow progress of Multics and decided to pursue their own, more streamlined computing environment. This decision led to the creation of the Unix operating system on a modest DEC PDP-7 computer with a mere 8K words of memory.
In this austere setting, Thompson first wrote Unix primarily in PDP-7 assembly language due to the lack of suitable higher-level tools. Initially, there were no loaders, no link editors, and no modern compilers—programs were hand-assembled, and output was literally transferred between machines on paper tape. Yet the team longed for a language that combined low-level efficiency with higher-level abstractions, easing the programming burden and making code more portable.
The earliest stepping stone was the language B, created by Thompson around 1969-1970. B drew inspiration from Martin Richards’s BCPL, a typeless language known for its portability and simplicity. Where BCPL introduced the concept of a “word” as the universal data type, B inherited this notion and fit it into the cramped memory of the PDP-7. However, B programs ran as interpreted “threaded code,” making them noticeably slower than hand-tuned assembly. Despite these limitations, B proved that a higher-level language could be used for systems programming—an idea that guided subsequent developments.
By 1971-1972, the Unix team acquired a more capable machine: the DEC PDP-11. This hardware introduced new challenges and possibilities. The PDP-11’s byte-addressable memory highlighted the need for richer data types, especially to handle characters and, in the future, floating-point numbers. Portability and performance were also pressing concerns: if the team wanted to recast the entire Unix kernel and utilities in a higher-level language, that language had to be efficient enough to rival assembly.
These demands led Dennis Ritchie to evolve B into NB (“new B”) and, by 1972-1973, into the language we now call C. Ritchie’s key innovations included adding a robust type system with distinct data types like int and char, as well as arrays and pointers that mapped naturally onto memory addresses. He introduced syntax where complex declarations mirrored the expressions that used them, a hallmark of C’s style. Moreover, C abandoned the purely typeless model of B and BCPL, allowing pointers and arithmetic types to be distinguished and handled more intelligently. C’s compiler generated direct machine instructions, shedding the interpretive overhead and providing the performance needed for systems-level work.
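To make these ideas concrete, here is a minimal sketch in present-day C (not the 1973 dialect) showing distinct int and char types, an array, and a pointer whose declaration mirrors the expression *p that uses it; the variable names are purely illustrative.

```c
#include <stdio.h>

int main(void)
{
    char msg[] = "hello";   /* array of char, laid out byte by byte in memory   */
    char *p = msg;          /* the declaration "char *p" mirrors the expression
                               "*p", which yields a char                        */
    int n = 0;              /* int and char are distinct arithmetic types       */

    while (*p != '\0') {    /* pointer arithmetic maps directly onto addresses  */
        n = n + 1;
        p = p + 1;
    }

    printf("%s has %d characters\n", msg, n);
    return 0;
}
```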
Armed with C, the Unix team took a bold step in 1973: they rewrote the Unix kernel for the PDP-11 in C, demonstrating that an entire operating system could be expressed in a high-level language without sacrificing efficiency. This achievement revolutionized software development, proving that portable and maintainable operating systems were no longer a pipe dream. As Unix spread to new hardware platforms in the mid-1970s, C’s portability and adaptability shone through. Steve Johnson’s portable C compiler (pcc) and supporting libraries helped transplant Unix and C to various architectures, and by the late 1970s and early 1980s, C had become a lingua franca of systems programming.
The language continued to evolve. Structures became fully integrated, enum types and unsigned integers were added, and, crucially, argument type-checking improved. Although early C compilers enforced few type constraints, tools like lint soon appeared to encourage better coding practices. The 1978 publication of “The C Programming Language” by Brian Kernighan and Dennis Ritchie (often called “K&R”) served as the de facto reference for several years, further popularizing and codifying the language.
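A brief sketch, again in modern C rather than the 1970s dialect, of the additions described above; the type and variable names (struct point, enum color, and so on) are invented for illustration.

```c
#include <stdio.h>

/* enum types name a small set of integer constants */
enum color { RED, GREEN, BLUE };

/* once fully integrated, structures could be assigned as whole values */
struct point { int x; int y; };

int main(void)
{
    struct point a = { 3, 4 };
    struct point b = a;              /* whole-structure assignment         */
    unsigned int mask = 0xFFu;       /* unsigned integers wrap modulo 2^N  */
    enum color c = GREEN;

    printf("b = (%d, %d), mask = %u, color = %d\n", b.x, b.y, mask, c);
    return 0;
}
```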
By the 1980s and 1990s, C had become firmly entrenched beyond Unix. Its syntax and semantics influenced newer languages like C++, Objective-C, and countless others. In 1989 (with ANSI) and 1990 (with ISO), the language underwent formal standardization, ensuring consistent implementations worldwide. The ANSI standard introduced function prototypes for better type safety and refined the language’s rules without fundamentally altering its character.
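As a small, hedged illustration of what prototypes changed, the fragment below contrasts an old-style declaration with an ANSI prototype; the function area and its parameters are hypothetical.

```c
#include <stdio.h>

/* Pre-ANSI ("K&R") style: the declaration says nothing about parameters,
   so a mismatched call such as area("wide", 2) would compile silently.   */
/* double area(); */

/* ANSI prototype: parameter types are part of the declaration, so the
   compiler checks (and converts) arguments at every call site.           */
double area(double width, double height);

int main(void)
{
    printf("%f\n", area(3, 4));   /* the ints are converted to double here */
    return 0;
}

double area(double width, double height)
{
    return width * height;
}
```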
C has evolved from a tiny, low-level tool on a PDP-7 into a widely standardized systems language used in everything from microcontrollers to supercomputers. This evolution was driven by practical needs and guided by the principle of creating a language that was powerful, efficient, and close to the machine while retaining key high-level features. This pragmatic balance explains why C remains a dominant and highly influential language in computing to this day.