Since World War II, Western governments, including the U.S. and its NATO allies, have regulated the export of cryptography on national security grounds, and, for a time, classified cryptography as a munition.
In light of the enormous impact of cryptanalysis in WWII, it was clear to these governments that denying current and potential enemies access to cryptographic systems would be militarily valuable. They also wished to monitor the diplomatic communications of other nations, including the many new nations that were emerging in the post-colonial period and whose position on Cold War issues was regarded as vital.
Since the U.S. and U.K. believed they had developed more advanced cryptographic capabilities than others, there arose the notion that controlling the dissemination of the more effective crypto techniques might be beneficial. The First Amendment made controlling all use of cryptography inside the U.S. difficult, but controlling access to U.S. developments by others was thought to be more practical, and there were at least no constitutional impediments.
Accordingly, regulations were introduced as part of munitions controls that required licenses to export cryptographic methods (and even their descriptions); the regulations established that cryptography beyond a certain strength (defined by algorithm and key length) would not be licensed for export except on a case-by-case basis. The expectation seems to have been that this would further national interests in reading 'their' communications and prevent others from reading 'ours'. This policy was also adopted elsewhere for various reasons.
The development, and public release, of DES and asymmetric key techniques in the 1970s, the rise of the Internet, and the willingness of some to risk and resist prosecution, eventually made this policy impossible to enforce, and by the late 1990s it was being relaxed in the U.S., and to some extent elsewhere (e.g. France). Nevertheless, some officials in the U.S. believe that widespread availability of strong cryptography worldwide has hampered the ability of the NSA to read intercepted communications that might reveal important information about intentions hostile to the United States. Others feel that the export controls in place in the last half of the 20th century discouraged incorporation of widely known cryptographic tools into commercial products, particularly personal computer operating systems, and are a root cause of the present crisis in information security, aside from interfering with U.S. trade in such products. They observe that many of the advances, including asymmetric key cryptography and many of its algorithms, were already public in any case.
Two types of technology were protected: technology associated only with weapons of war, and dual-use technology, which also had commercial applications. In the U.S., dual-use technology export was controlled by the Department of Commerce, while munitions were controlled by the State Department. Encryption technology (techniques as well as equipment and, after computers became important, crypto software) was classified as a munition. However, this hardly mattered in practice, since secure encryption was not, certainly in the immediate post-war period, available to the general public. By the 1960s, however, financial organisations were beginning to require strong commercial encryption in the rapidly growing field of wired money transfer.
The U.S. Government's introduction of the Data Encryption Standard in 1975 meant that commercial uses of high quality encryption would become common, and serious problems of export control began to arise. Generally these were dealt with through case-by-case export license request proceedings brought by computer manufacturers, such as IBM, and by their large corporate customers.
Netscape's SSL protocol encrypted messages using the RC4 cipher with 128-bit keys. U.S. government export regulations would not permit crypto systems using 128-bit keys to be exported. At this stage Western governments had, in practice, a split personality when it came to encryption; policy was made by the military cryptanalysts, who were solely concerned with preventing their 'enemies' acquiring secrets, but that policy was then communicated to commerce by officials whose job was to support industry.
The longest key length allowed for export without individual license proceedings was 40 bits, so Netscape developed two versions of its web browser. The "U.S. edition" had the full 128-bit strength. The "International Edition" had its effective key length reduced to 40 bits by revealing 88 bits of the key in the SSL protocol. Acquiring the "U.S. edition" turned out to be sufficient hassle that most computer users, even in the U.S., ended up with the "International Edition", whose weak 40-bit encryption could be broken in a matter of days using a single personal computer. Much the same thing happened with Lotus Notes, and for the same reasons.
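The mechanism described above can be sketched concretely. The following is a minimal Python illustration, not the actual Netscape code: RC4 is implemented from its well-known published description, and an eavesdropper who has seen the 88 revealed bits brute-forces the remaining secret bits against a known plaintext (e.g. a predictable HTTP request line). So that the demo finishes quickly, the secret portion is shrunk to 16 bits; the real attack is the identical loop run over all 2^40 candidates.

```python
import os

def rc4(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt data with the RC4 stream cipher (demo use only)."""
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) & 0xFF
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA), XORed with the data
    out = bytearray()
    i = j = 0
    for b in data:
        i = (i + 1) & 0xFF
        j = (j + S[i]) & 0xFF
        S[i], S[j] = S[j], S[i]
        out.append(b ^ S[(S[i] + S[j]) & 0xFF])
    return bytes(out)

# Export-grade setup: a 128-bit (16-byte) RC4 key, of which 88 bits
# (11 bytes here rounded to 14 for the demo) travel in the clear, so an
# eavesdropper only has to guess the secret remainder. We use a 16-bit
# secret so the exhaustive search runs in seconds; the historical attack
# searched 2^40 candidates instead of 2^16.
secret = os.urandom(2)            # the hidden portion of the key
revealed = os.urandom(14)         # the portion exposed in the protocol
plaintext = b"GET / HTTP/1.0"     # known/guessable plaintext
ciphertext = rc4(secret + revealed, plaintext)

recovered = None
for guess in range(2 ** 16):      # exhaustive search over the secret bits
    candidate = guess.to_bytes(2, "big")
    if rc4(candidate + revealed, ciphertext) == plaintext:
        recovered = candidate
        break
```

The search cost makes the "matter of days" claim plausible: 2^40 is about 1.1 trillion keys, so at roughly a million trial decryptions per second a single mid-1990s PC finishes in under two weeks, and faster with optimized code.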
Legal challenges by Peter Junger and other civil libertarians and privacy advocates, the widespread availability of encryption software outside the U.S., and the perception by many companies that adverse publicity about weak encryption was limiting their sales and the growth of e-commerce, led to a series of relaxations in U.S. export controls, culminating in 2000 in the effective elimination of export controls on commercial and open-source software containing cryptography (which, in any case, a "rogue state" could have downloaded, and subsequently verified, from file sharing networks or servers outside the U.S.).