
GPTs, Software Engineering, and a New Age of Hacking

James Waldo and Angela Wu | Aug. 16, 2023

Generative Pre-trained Transformers (GPTs) are the technology of the moment. From GPT-based chatbots like ChatGPT and Bard to programming assistants like Copilot, this newest form of machine-learning-based AI has generated excitement, consternation, calls to outlaw or halt development, and societal predictions ranging from utopia to robot apocalypse.

While many still worry that this technology will cause society to collapse, better-informed commentators have begun to prevail. As we come to understand how GPTs work and how they are best used, the debate has grown more productive and less panic-ridden.

Further discussions must focus on how GPTs can be used in the policy arena. GPTs are another example of a dual-use technology: beneficial in some applications but concerning in others. As governments exert influence in the global security landscape, many ponder how GPTs will change the balance of power between offense and defense in cybersecurity. In particular, some worry that GPTs will allow vulnerabilities to be discovered and exploited at an increased rate, swinging the delicate balance in cybersecurity even more in favor of the attackers.

To begin to understand the issues around the use of GPTs, we should first understand how these models work. GPTs are large statistical models trained on vast amounts of text. Such a model uses the text it has seen to predict which words are likely to come next. If you ask ChatGPT, for example, to tell you about Paul Revere, the program will begin generating sentences similar to what you would likely find in the parts of its training set that contain the words “Paul Revere.” The results appear as though they were written by a human because the system was trained on what humans write.
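To make the prediction idea concrete, the toy Python sketch below (our illustration, not how production GPTs are actually built) counts which word tends to follow which in a tiny sample corpus and then samples continuations from those counts. A real GPT learns far richer statistics with billions of neural-network parameters rather than a lookup table, but the core task, predicting the next word, is the same.

```python
import random
from collections import Counter, defaultdict

# Tiny sample corpus standing in for the web-scale text a real GPT is trained on.
corpus = (
    "Paul Revere was a silversmith in Boston . "
    "Paul Revere rode to warn the colonial militia . "
    "Paul Revere was an early American patriot ."
).split()

# Count which word follows which -- a crude stand-in for learned statistics.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def generate(prompt_word: str, length: int = 8) -> str:
    """Extend a prompt by repeatedly sampling a statistically likely next word."""
    words = [prompt_word]
    for _ in range(length):
        candidates = next_word_counts.get(words[-1])
        if not candidates:
            break
        choices, weights = zip(*candidates.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("Revere"))
# One possible output: "Revere was a silversmith in Boston ."
```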

The ability to generate statistically likely phrases also makes GPTs useful as a coding tool. Much of writing code is fairly formulaic, though writing code is only a small part of programming, a distinction we will discuss below. For many tasks, a fair amount of boilerplate code must be written, and many examples of this kind of code already exist on the web, either in tutorials or in web-accessible repositories like GitHub (which was used to train Copilot). So a GPT can write the boilerplate code.
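As an illustration of what "boilerplate" means here, the snippet below is a hypothetical example of ours, not actual Copilot output: the kind of routine scaffolding, command-line parsing plus reading a CSV file, that assistants generate readily because thousands of near-identical versions appear in public repositories and tutorials.

```python
# Hypothetical example of routine boilerplate a coding assistant might produce:
# parse command-line arguments and read a CSV file into memory.
import argparse
import csv

def main() -> None:
    parser = argparse.ArgumentParser(description="Summarize a CSV file.")
    parser.add_argument("path", help="Path to the input CSV file")
    parser.add_argument("--delimiter", default=",", help="Field delimiter")
    args = parser.parse_args()

    with open(args.path, newline="") as handle:
        rows = list(csv.DictReader(handle, delimiter=args.delimiter))

    columns = ", ".join(rows[0]) if rows else "none"
    print(f"Read {len(rows)} rows with columns: {columns}")

if __name__ == "__main__":
    main()
```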

For Academic Citation: Waldo, James and Angela Wu. “GPTs, Software Engineering, and a New Age of Hacking.” Georgetown Journal of International Affairs, August 16, 2023.
