Back in 2019, the AI research and development firm OpenAI completed work on GPT-2, its second natural language generator. At the time, the company declined to release the full model to the public, arguing that there were too many ways it might be put to ill use.
But before the year was through, it had changed its mind. What followed was a torrent of zany, often hilarious, uses of the technology, mostly as a tool for amusement. It seemed that OpenAI's worst fears about releasing it hadn't been realized.
Things went so well that OpenAI proceeded to its next version, GPT-3, which is far larger and more capable than its predecessor (175 billion parameters against GPT-2's 1.5 billion). And although it isn't freely available to the general public yet (and it may never be), there are already some remarkable examples of the kinds of text it can create.
It turns out, though, that GPT-3 is good at more than generating prose. One of its most promising skills is writing computer code from scratch. Here's a look at some of the ways it's already being used, and will soon be used, to improve the lives of programmers everywhere.
Back in September of 2020, OpenAI announced that it had granted an exclusive license to the technology behind GPT-3 to the global software giant Microsoft (that's one of the reasons it may never see a free public release). As a result, Microsoft got a head start on everyone else in putting the new technology to use, and it didn't take long.
By May of 2021, Microsoft had announced the first result: GPT-3 support in Microsoft Power Fx, its low-code programming language built on the formula language that has shipped with Excel for decades. The idea is to let users describe what they want in plain English and have GPT-3 translate that description into Power Fx formulas, so people already familiar with Excel-style formulas can create customized interfaces without learning anything new.
The concept is at the core of canvas apps in Microsoft Power Apps, a product ecosystem designed to let businesses build custom apps that interact with and manipulate the data they have stored in the cloud. Microsoft envisions the platform as a one-stop shop for companies to build low-code apps without hiring outside developers or sending employees to programming classes.
Another simple way GPT-3 is already helping programmers is by giving them code suggestions as they work. GitHub Copilot, which runs on OpenAI Codex (a descendant of GPT-3 fine-tuned on code), is an excellent example of this idea in action. Users receive helpful code suggestions as they type, generated by a model trained on the vast library of public code hosted on GitHub.
Copilot can also let a programmer preview multiple approaches to the same function. That opens up the possibility of using it as a learning tool, something akin to asking a more experienced colleague for coding help, but without the wait. And because it understands and works in multiple programming languages, Copilot can help programmers port their work into unfamiliar languages in no time.
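To picture the workflow, imagine typing nothing but a function signature and a descriptive docstring; the assistant then proposes a body to go with it. The snippet below is a hand-written, hypothetical illustration of that kind of suggestion (Python is used purely as an example language), not actual Copilot output.

```python
# Hypothetical illustration of Copilot-style completion: the programmer writes
# the signature and docstring, and the assistant suggests the implementation.

def is_palindrome(text: str) -> bool:
    """Return True if text reads the same forwards and backwards,
    ignoring case, spaces, and punctuation."""
    # Suggested body (the kind of completion an AI assistant might offer):
    cleaned = "".join(ch.lower() for ch in text if ch.isalnum())
    return cleaned == cleaned[::-1]


if __name__ == "__main__":
    print(is_palindrome("Never odd or even"))  # True
```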
Although not many people or companies have been granted access to GPT-3 yet, some have, and they're already building products on top of it. One of them, called Debuild, is testing a system that it claims can create fully functional web apps from nothing more than a plain-English description of what you want.
The implications of such a tool are enormous. If it's even partly successful, it would be nothing short of revolutionary. Businesses would be able to use it to generate custom software on demand, and web developers could use it to build fast, functional prototypes, freeing them up to focus on higher-level work.
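Debuild hasn't published how its system works under the hood, but the basic pattern, sending GPT-3 a plain-English description and asking for code back, is easy to sketch for anyone with API access. The snippet below uses the openai Python client as it existed when GPT-3 launched; the prompt, engine choice, and parameters are illustrative assumptions, not Debuild's actual setup.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # requires GPT-3 API access from OpenAI

# Plain-English description of the app we want (illustrative, not Debuild's prompt).
description = "A web page with a button that shows today's date when clicked."

prompt = (
    "Write the HTML and JavaScript for the following app.\n"
    f"Description: {description}\n"
    "Code:\n"
)

# Completion.create is the pre-1.0 openai-python interface GPT-3 shipped with.
response = openai.Completion.create(
    engine="davinci",   # base GPT-3 engine; code-tuned engines came later
    prompt=prompt,
    max_tokens=300,
    temperature=0,      # keep the output deterministic
)

print(response.choices[0].text)
```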
Even though GPT-3 was originally built to understand and generate natural human language, it may have found its true calling in computer programming. As you can see, it's already powering some useful tools, and they're likely only the beginning. The biggest obstacle to further progress is that OpenAI has so far declined to release the underlying model for public use.
But that may not matter for much longer. A group of AI researchers known as EleutherAI has banded together to replicate the technology that makes GPT-3 tick. The result of their work (so far) is the open-source GPT-J, which offers similar capabilities despite being far smaller. And according to many of the people already using it for programming tasks, it outperforms the original in several ways.
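Because GPT-J's weights are freely downloadable, trying it on a programming prompt takes only a few lines with the Hugging Face transformers library. Here's a minimal sketch, assuming the EleutherAI/gpt-j-6B checkpoint; bear in mind that the full model needs a lot of memory (roughly 24 GB in full precision), so in practice people often load it in half precision or on a GPU.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Minimal sketch: load the open-source GPT-J checkpoint and ask it to continue a code prompt.
model_name = "EleutherAI/gpt-j-6B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "# Python function that returns the n-th Fibonacci number\ndef fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation; a low temperature keeps the completion focused on code.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.2,
    pad_token_id=tokenizer.eos_token_id,
)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```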
All of which means it might not be long before the examples detailed here are joined by countless other helpful programming tools. And because they'll be built on open-source technology, they'll likely be available for the public to use and modify as they see fit. So keep a sharp lookout, because there are plenty of new developments to come.