GPT Plugins: Where Will Text Blaze Take This?

Unless you live in a cave, you’re aware of ChatGPT. It’s roto-tilling pretty much everything compute-related. Even if it isn’t yet seen as a direct threat to much, we can all read the writing on the wall: the computing world is changing in profound ways because of LLMs, and the rate of advancement AGI has demonstrated in recent weeks and months is obvious.

Yesterday, OpenAI opened the [next] kimono to reveal yet another massive game changer: GPT Plugins.

[Screenshot: OpenAI's GPT Plugins announcement, March 24, 2023]

Let that sink in.

While all aspects of this new GPT capability are significant, one flies in a stratosphere above all the other possibilities: the ability to run computations. As everyone is drawn toward “access up-to-date information” in a mesmerized trance, like something caught in a Star Wars tractor beam, the most stunning part of the announcement will be vastly overlooked.

We crave the day when GPT can see the Interwebs in real time, and with yesterday’s announcement, GPT plugins make that happen in an instant. But as cool as that is, it will ultimately prove to be a distraction from the tectonic plates that are about to collide.

Run Computations

What will the future be like when you can ask a system to build a process (as code), provide it with some data, and then run that process against that data?

As demonstrated here, OpenAI has revealed:

  • ChatGPT is a full compute stack.
  • ChatGPT has CPUs and GPUs available for everyone and any reasonable objective.
  • ChatGPT has a file system.
  • ChatGPT can transform your words into processes and execute them.

Let that sink in a little deeper.
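
To make those bullets concrete, here is a minimal sketch of the loop they imply as I picture it: describe a process in plain English, let the model turn it into code, then run that code against your data. Everything below (the model name, the prompt, the local `exec()` call) is my own illustration, not how the plugin works internally; the announced capability runs its computations inside OpenAI's own environment, not on your machine.

```python
# Illustration only (not OpenAI's plugin machinery): ask the model to write a
# small process, then run that process against local data. Model name, prompt
# wording, and the local exec() step are assumptions for this sketch.
import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ["OPENAI_API_KEY"]

data = [12.5, 7.1, 30.0, 4.4]  # toy dataset supplied by the user

prompt = (
    "Write a Python function named summarize(values) that returns a dict "
    "with the min, max, and mean of a list of numbers. Return only the code."
)

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"model": "gpt-4", "messages": [{"role": "user", "content": prompt}]},
    timeout=60,
)
resp.raise_for_status()
code = resp.json()["choices"][0]["message"]["content"].strip()

# Strip Markdown fences if the model wrapped its answer in them.
if code.startswith("```"):
    code = code.strip("`")
    if code.startswith("python\n"):
        code = code[len("python\n"):]

# Turn the words into a process and execute it against the data (no sandboxing
# here, so treat this strictly as a demonstration of the idea).
namespace = {}
exec(code, namespace)
print(namespace["summarize"](data))
```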

Imagine Zapier comes along and says to OpenAI -

Hey, with that new plugin architecture, what if we made it possible to use our five thousand connectors in natural conversations to get data, documents, and other information the LLM can use to answer questions and write reports, with our connectors serving as the dynamic gateway to everything the LLM wasn’t trained on?

This hypothetical is not fiction; it’s a done deal.

No-Code Threat?

And then Zapier says -

BTW, what if we could also provide the database backend for all manner of natural-language requests to store or transform data?

This part is hypothetical but a likely reality in the near future. I warned over here not long ago that Zapier is coming for more than your adhesive dollars; it wants your data. Data Blaze may be positioned to embrace the data angle as well.

Suppose you can interface with any datastore, retrieve data, perform computations on that data, and build the computational aspects of your apps using natural-language prompts in any language: who (or what) stands to be disrupted?

For starters - everything related to integration. GPT can understand how to POST or GET data to and from any endpoint based purely on API documentation. It has this capacity today. GPT plugins take this innate integration ability to another level - they can execute data interchanges, build log files, track performance, and report exceptions.
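
To picture what “execute data interchanges, build log files, track performance, and report exceptions” looks like at the code level, here is a deliberately simplified sketch of a single interchange. The endpoint URL and payload are hypothetical placeholders, not a real API; the point is that GPT can derive this kind of call from nothing more than the API documentation a plugin exposes.

```python
# A rough sketch of one "data interchange": call a documented endpoint, log the
# exchange, time it, and report exceptions. Endpoint and payload are hypothetical.
import logging
import time

import requests

logging.basicConfig(filename="interchange.log", level=logging.INFO)
log = logging.getLogger("interchange")

ENDPOINT = "https://example.com/api/orders"  # hypothetical endpoint from the API docs
payload = {"sku": "ABC-123", "qty": 2}       # hypothetical request body

start = time.monotonic()
try:
    resp = requests.post(ENDPOINT, json=payload, timeout=30)
    resp.raise_for_status()
    elapsed = time.monotonic() - start
    log.info("POST %s -> %s in %.2fs", ENDPOINT, resp.status_code, elapsed)
    print(resp.json())
except requests.RequestException as exc:
    log.error("POST %s failed: %s", ENDPOINT, exc)
    raise
```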

How will this new AI "app store" model fit with Text Blaze?


Good read, Bill, thanks for posting. A lot of what you mentioned is stuff I've been thinking about also.

Currently I've been using TB to build up complex prompts that are then fed into ChatGPT.

Having access to near-live data in the future will make this all the more powerful and relevant. Personally, I can't wait to see what can be done with this technology going forward.

As someone who started my computing career programming with punch cards, then got my first ZX80 computer, and then used an acoustic coupler to talk to other computers, I find this new tech pretty mind-blowing.

LOL. I'm hip. CSU, 1975; IBM mainframe. Back of the line to submit jobs. :wink:

When the TRS-80 and Benton Harbor BASIC arrived, I said goodbye, CPA! Hello, self-taught programmer.


Yes, these generative models are very exciting. The plugin model OpenAI came up with, where the model itself figures out how to use the plugin, is also a really neat concept.

In regards to Text Blaze, we're exploring how to maximize the value of models like ChatGPT for Text Blaze users. We've launched a couple of things (Auto Write, the OpenAI command pack) and have more in the works.

If there are specific use cases where you want deeper AI capabilities or features in Text Blaze, please let us know!


I have many ideas swirling through my brain wrinkles, but this one observation is probably useful to any team trying to determine AI's place in their product.

AI, specifically AGI (artificial general intelligence), has its place, and there are vast opportunities to employ it for great user benefit. But innovators often mistake newfound AI capabilities for features.

AGI straightens the line between what is known and what we need to know while adding our own context.

It's not an app or a feature; it's a UI and the user's own data is the API.

Can users expect new features or improvements in GPT plugin integration with Text Blaze in the near future?

ChatGPT seems to be moving away from the plugins model at this point; OpenAI now appears to be focusing on the "GPTs" approach.

Could you tell us more about what you would like to do with plugins?

ChatGPT Plugins are dead; they were a bad idea from the start. The data also shows ChatGPT was never intended to be a destination AI tool, and its growth has flatlined over the past six months. I predicted this more than a year ago.

Custom GPTs may also struggle because of security, prompt leakage, and various other architectural challenges. One of their shortcomings is the inability to call them from Text Blaze. You have to build your own version of a custom GPT inside TB - indeed, any recipe is essentially the foundation of a custom GPT.