Recently, I posted an article covering the immense value behind ChatGPT's powerful "Custom Instructions" feature.
Yesterday, OpenAI announced a beta preview of "GPTs," a way to build, reuse, and share custom-configured ChatGPT instances with similar strengths.
The idea is born from a desire to reduce or eliminate the need for prompt engineering, and all the learning and struggle it can entail, for average users of OpenAI platforms like ChatGPT and DALL-E.
Think of a persona you need with any regularity, perhaps a personal financial advisor. You will now be able to build your very own GPT for that singular purpose by starting a conversation as normal and giving it knowledge and instructions upfront. You will then define what your GPT can do, such as web searches, image generation, or data analysis.
Shareable GPTs give knowledgeable users a great way to help others without spending as much time teaching them prompt engineering tactics.
Even more interesting is the inclusion of a GPT Store in the announcement. The GPT Store, expected online in late November, will highlight the best examples of community-created GPTs. There is no indication as to what kind of marketplace this may be or whether they expect to charge or allow creators to charge for the use of GPTs in the GPT Store.
OpenAI included in their announcement an indication that developers will be able to connect ChatGPT to outside systems via APIs. I'm unclear whether this is an extension or a pivot of the existing plugin system or if it's an entirely new thing.
In any event, a developer's ability to connect ChatGPT directly to unfamiliar data sets and have it intuitively learn from and act upon that data is a significant step forward, one I think will see ChatGPT grow by leaps and bounds.
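To make the idea concrete, here is a minimal sketch of the pattern involved: a GPT emits a structured action request, and developer-owned code dispatches it against a data set the model has never seen. Everything here (the action name, the inventory data, the JSON shape) is my own assumption for illustration, not OpenAI's actual API.

```python
import json

# Hypothetical: a tiny in-memory data set the model has no prior knowledge of.
INVENTORY = {
    "SKU-1001": {"name": "Widget", "stock": 42},
    "SKU-1002": {"name": "Gadget", "stock": 0},
}

def check_stock(sku: str) -> dict:
    """Handler a GPT action might invoke to look up live inventory."""
    item = INVENTORY.get(sku)
    if item is None:
        return {"error": f"unknown SKU {sku!r}"}
    return {"sku": sku, "name": item["name"], "in_stock": item["stock"] > 0}

# Map action names (as the model would emit them) to local handlers.
ACTIONS = {"check_stock": check_stock}

def handle_action(request_json: str) -> str:
    """Dispatch a model-emitted action request and return a JSON reply."""
    request = json.loads(request_json)
    handler = ACTIONS[request["action"]]
    return json.dumps(handler(**request["arguments"]))
```

A call like `handle_action('{"action": "check_stock", "arguments": {"sku": "SKU-1001"}}')` returns a JSON string the model can read back and summarize for the user, which is the loop that lets it "act upon" data it was never trained on.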
Private Enterprise GPTs
Of particular interest for OpenAI's enterprise customers is the ability to keep GPTs private and internal only. This holds the promise of being a turnkey internal subject matter expert for much of an enterprise if crafted properly.
Internal GPTs could be incredible tools for company departments, standing in for custom chatbot projects far more complex to build than one of these new GPTs.
Imagine an IT department giving ChatGPT access to their ServiceNow or Jira instances and letting the customer ask questions, provide updates, order equipment, and more right from a simple chat interface without the need to understand a complicated form with dozens of fields and duplicative content.
Imagine a GPT with access to a company's specific HR policies and procedures documentation and real-time access to changes/updates to that information. Suddenly, HR business partners can spend less time on things they think should be self-service but are hidden in complex interfaces their colleagues might touch only a handful of times per year.
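The HR scenario above is, at its core, a retrieval problem: match an employee's question to the right policy snippet. A real deployment would likely use embeddings or a document search service; the toy keyword-overlap lookup below (with invented policy text) is only a sketch of the shape of that step.

```python
# Hypothetical policy snippets; real text would come from HR documentation
# and be refreshed as policies change.
POLICIES = {
    "pto": "Employees accrue 1.5 days of PTO per month, capped at 30 days.",
    "remote-work": "Remote work requires manager approval and a signed agreement.",
    "expenses": "Expense reports are due within 30 days of the purchase date.",
}

def find_policy(question: str) -> str:
    """Return the policy snippet sharing the most words with the question.

    A stand-in for proper semantic retrieval (e.g. embedding search).
    """
    q_words = set(question.lower().split())

    def overlap(item):
        _key, text = item
        return len(q_words & set(text.lower().split()))

    _key, text = max(POLICIES.items(), key=overlap)
    return text
```

The GPT would fold the retrieved snippet into its answer, so employees get the current policy without hunting through an HR portal.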
There's significant value hidden in this part of the announcement for enterprises to consider.
Public Enterprise GPTs
The feature also holds great promise for enterprise customers who build and embed a GPT in their websites and/or mobile apps. Think of a Kroger GPT that could connect to the various Kroger systems associated with your customer profile, including your Kroger Plus Card details, inventory at your local store, and whether your prescriptions are ready at the pharmacy.
A Kroger customer could engage in a simple chat while logged in, tell this GPT what they want to add to their cart, and then check out online without browsing around and clicking through various sections of the app/site.
As a customer, I find this very attractive. This GPT could suggest available deals, similar products, and so on based on the conversation's semantic details, and gauge whether the customer is receptive to such offers (backing off if they indicate otherwise).
What to Expect
I expect this new feature to replace the existing "Custom Instructions" implementation rather quickly.
I also expect community-generated GPTs to be very popular, especially as power users create especially well-honed examples.
I expect high demand from community creators for ways to monetize their GPTs, and subscriptions for their use could become a revenue source for OpenAI as well. It will be interesting to see how creators value their work and whether the broader community accepts the market model.
This announcement from OpenAI has significant implications for using LLMs in conversational integration with enterprise systems, not just as models trained on public data.
The ChatGPT community will need to feel out the GPT Store, what sorts of monetization opportunities they might demand, and how much they're willing to pay for GPTs created by others.
Developer integrations for GPTs leveraging APIs will be interesting to watch. Will this be laborious, or will developers simply point ChatGPT at their API documentation?
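If the mechanism resembles the existing plugin system, "pointing ChatGPT at API documentation" could mean supplying a machine-readable spec such as OpenAPI. The fragment below, expressed as a Python dict purely for illustration, shows the kind of self-describing endpoint definition a model could read to learn how to call an API. The server URL, path, and fields are all invented.

```python
# Hypothetical: a minimal OpenAPI-style description of one endpoint a GPT
# might be allowed to call. Every concrete value here is made up.
ORDER_STATUS_SPEC = {
    "openapi": "3.0.0",
    "info": {"title": "Order Status API", "version": "1.0"},
    "servers": [{"url": "https://api.example.com"}],
    "paths": {
        "/orders/{order_id}": {
            "get": {
                "operationId": "getOrderStatus",
                "summary": "Look up the status of a single order.",
                "parameters": [
                    {
                        "name": "order_id",
                        "in": "path",
                        "required": True,
                        "schema": {"type": "string"},
                    }
                ],
            }
        }
    },
}
```

The `summary` and parameter schemas are what would let a model work out, on its own, when and how to call the endpoint, which is why the documentation-driven approach seems plausible.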
There's a lot to unpack with the announcement, and many exciting developments await us with GPTs.
Which part is most exciting to you? What might you create a GPT for?