Unless you live in a cave, you’re aware of ChatGPT. It’s in the process of roto-tilling pretty much everything compute-related. While not generally seen as a threat to much just yet, we all see the writing on the wall: LLMs are changing the computing world in profound ways, and the pace of advancement these systems have demonstrated in recent weeks and months is impossible to miss.
Yesterday, OpenAI opened the [next] kimono to reveal yet another massive game changer: GPT Plugins.
Let that sink in.
While all aspects of this new GPT capability are significant, one flies in the stratosphere above all other possibilities - the ability to run computations. While everyone is drawn toward “access up-to-date information” in a mesmerizing mental trance like a Star Wars tractor beam, the most stunning part of the announcement will be vastly overlooked.
We crave the day when GPT can see the Interwebs in real-time. With yesterday’s announcement, GPT plugins make this happen in an instant. But as cool as that is, it will ultimately be seen as a distraction from the tectonic plates that are about to collide.
Run Computations
What will the future be like when you can ask a system to build a process (as code), provide it with some data, and then run that process against it?
As demonstrated here, OpenAI has revealed:
- ChatGPT is a full compute stack.
- ChatGPT has CPUs and GPUs available for everyone and any reasonable objective.
- ChatGPT has a file system.
- ChatGPT can transform your words into processes and execute them.
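The round trip those four bullets describe might look something like this: a user uploads a small file to ChatGPT’s file system, asks in plain language for “total sales per region,” and the model writes and then executes a small process against it. This is a hypothetical sketch of the kind of code that gets generated and run; the file contents and column names are invented for illustration.

```python
import csv
import io
from collections import defaultdict

# Hypothetical data standing in for a file a user uploaded to ChatGPT's
# file system; the columns are invented for this example.
UPLOADED_CSV = """region,sales
East,100
West,250
East,50
"""

def totals_by_region(raw: str) -> dict:
    """Sum the `sales` column grouped by `region` -- the sort of small
    process a prompt like "total sales per region" could be turned into."""
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(raw)):
        totals[row["region"]] += int(row["sales"])
    return dict(totals)

print(totals_by_region(UPLOADED_CSV))  # {'East': 150, 'West': 250}
```

The point is not the code itself - it is trivial - but that the user never sees or writes it; the words become the process, and the process runs.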
Let that sink in a little deeper.
Imagine Zapier comes along and says to OpenAI -
Hey, with that new plugin architecture, what if we made it possible to use our five thousand connectors in natural conversations to get data, documents, and other information that the LLM can use to answer questions, write reports, and otherwise serve as the dynamic gateway to everything the LLM wasn’t trained on?
This hypothetical is not fiction; it’s a done deal.
No-Code Threat?
And then Zapier says -
BTW, what if we could also provide the database backend for all manner of natural-language requests to store or transform data?
This part is hypothetical but a likely reality in the near future. I warned over here not long ago that Zapier is coming for more than your adhesive dollars; it wants your data. Data Blaze may be positioned to embrace the data angle as well.
Suppose you can interface with any datastore, retrieve data, perform computations on that data, and build the computational aspects of your apps using natural-language prompts - in any language. Who (or what) stands to be disrupted?
For starters - everything related to integration. GPT can understand how to POST or GET data to and from any endpoint based purely on API documentation. It has this capacity today. GPT plugins take this innate integration ability to another level - they can execute data interchanges, build log files, track performance, and report exceptions.
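To make the “purely from API documentation” point concrete, here is a minimal sketch of that pattern: a fragment of an OpenAPI description - the same kind of document a plugin exposes to the model - and a function that resolves an operation into a concrete GET request. The weather-style API, the `/forecast` path, and the `city` parameter are all invented for illustration; nothing here is OpenAI’s actual implementation.

```python
from urllib.parse import urlencode

# A tiny, hand-written fragment of an OpenAPI spec. A GPT plugin serves a
# document like this so the model can learn the endpoints on its own.
# The server URL, path, and parameter names are hypothetical.
OPENAPI_FRAGMENT = {
    "servers": [{"url": "https://api.example.com"}],
    "paths": {
        "/forecast": {
            "get": {
                "operationId": "getForecast",
                "parameters": [{"name": "city", "in": "query", "required": True}],
            }
        }
    },
}

def build_request(spec: dict, operation_id: str, **params) -> str:
    """Resolve an operationId from the spec into a full GET URL,
    roughly the way a model maps an intent onto a documented endpoint."""
    base = spec["servers"][0]["url"]
    for path, methods in spec["paths"].items():
        for method, operation in methods.items():
            if operation.get("operationId") == operation_id:
                return f"{base}{path}?{urlencode(params)}"
    raise KeyError(f"no operation named {operation_id!r}")

print(build_request(OPENAPI_FRAGMENT, "getForecast", city="Oslo"))
# https://api.example.com/forecast?city=Oslo
```

The mechanical part - turning a documented endpoint into a request - is easy; what the plugin architecture adds is letting the model decide, mid-conversation, which documented operation answers the user’s question.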
How will this new AI "app store" model fit with Text Blaze?