OpenAI ChatGPT API Client For Delphi and Lazarus

A topic for discussion about the blog post OpenAI ChatGPT API Client For Delphi and Lazarus.

What are your thoughts?

Thank you for this OpenAI client software. In order to get it to work with FPC, I needed to add {$mode delphi} to the OpenAIJson.pas unit. I may have added it to a couple of others, but I only remember needing it for this one unit for sure. I had written my own console version, but I see you have added many more functions.

One tip to improve the experience with GPT is to add a TStringList variable (I call it 'conversation'), appending your question and GPT's response to it and sending the whole conversation up to GPT using the .Text property of the TStringList. This allows GPT to contextualize its response throughout the conversation. I also have a preamble text with a USER and a GPT variable to prime the conversation at the beginning. I am happy to share this as soon as I add it to your client sample, if this is desired.
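Roughly, the pattern looks like this (AskGPT here is just a hypothetical placeholder for whatever call you use to send the prompt text and get the completion text back):

uses
  Classes, SysUtils;

var
  Conversation: TStringList;
  Question, Answer: string;
begin
  Conversation := TStringList.Create;
  try
    // Optional preamble to prime the model before the first question.
    Conversation.Add('The following is a conversation between USER and GPT.');

    Question := 'What is Object Pascal?';
    Conversation.Add('USER: ' + Question);

    // AskGPT is a hypothetical helper: it sends the whole conversation so far
    // as the prompt and returns the completion text.
    Answer := AskGPT(Conversation.Text);
    Conversation.Add('GPT: ' + Answer);

    // Repeat these steps for every turn; the model sees the full history each time.
  finally
    Conversation.Free;
  end;
end.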

But my question is about how we can create a fine-tuned model using the methods you have so painstakingly put together. I have been slowly converting Python examples in preparation for doing this, with the thought that some code would have to stay in Python, since the PyTorch and TensorFlow libraries are often used.

Are there any thoughts regarding this? Has anyone already done this?

Again, thanks for your work on this.

Hi @Terry, thank you for the comments.

We will add the Delphi directives to the units.

The OpenAI documentation recommends using their command-line tool to train your models:

There's a new 'model' for OpenAI called gpt-3.5-turbo. It has a new endpoint, /v1/chat/completions, and a parameter called messages[]. I am able to use this new model in my small, cobbled-together console version, but the nice, full OOP version made here is too involved for me to figure out where all the changes are supposed to be made.
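For reference, the new endpoint expects a JSON body with a messages array instead of a single prompt string. A minimal sketch of building such a body with FPC's fpjson unit (the question text is just an example) could look like this:

uses
  fpjson;

var
  Body, Msg: TJSONObject;
  Msgs: TJSONArray;
begin
  Msgs := TJSONArray.Create;

  // Each turn of the conversation is one object with a role and a content field.
  Msg := TJSONObject.Create;
  Msg.Add('role', 'user');
  Msg.Add('content', 'What is Object Pascal?');
  Msgs.Add(Msg);

  Body := TJSONObject.Create;
  Body.Add('model', 'gpt-3.5-turbo');
  Body.Add('messages', Msgs); // Body now owns Msgs and will free it.

  // Body.AsJSON is what gets POSTed to /v1/chat/completions.
  WriteLn(Body.AsJSON);
  Body.Free;
end.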

@Terry, indeed. We have an open issue to support it in our OpenAI library:

Thank you! I look forward to using it.

Scripting is also possible in maXbox4 or ps with TRestClient:
https://softwareschule.code.blog/2023/04/01/how-to-chat-with-gpt/

Well, it took me no time at all to integrate the sample OpenAI code into an app.
It's very nice, and I want to thank you for making it so easy by providing source code.
Of course, after trying it I discovered that there is a bit more to it than connecting to OpenAI and asking a few things. There is a combined token and requests-per-minute (RPM) throttle which shouldn't be overrun.
It is up to the developer to assess the outgoing text and determine the number of 'tokens' it contains and the RPM, which is not difficult to do but needs to be done by the developer. Failing that, you're going to hit your rate limit very, and I mean very, quickly unless you're on a paid plan.
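As a purely illustrative sketch, here is a rough way to budget tokens before each call, assuming the common approximation of about four characters per token for English text (the exact count comes from OpenAI's tokenizer, so treat this only as a safety margin):

function EstimateTokens(const Text: string): Integer;
begin
  // Very rough heuristic: roughly 4 characters per token for English text.
  // Use OpenAI's tokenizer (or a port of it) if you need exact counts.
  Result := (Length(Text) div 4) + 1;
end;

Adding that estimate to the MaxTokens you plan to request, and checking the sum (plus your request count) against your plan's per-minute limits before each call, is usually enough to stay clear of the throttle.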

So whilst it's easy to implement, please visit:

How to use OpenAI GPT-3 tokens • Gptforwork.com

for more on this.

Thank you for the kind words, @Winston_Wyndham-Quin. Regarding the tokens, I understand you just wanted to warn other users about it?

Hi Wagner. Yes, I was just letting people know so they don't get a surprise. I'm extremely impressed and am doing my homework. I've noticed that there is a lot to it regarding the ada, babbage, and davinci models and so on, and also the different endpoints.
All the same, thanks guys. Great work.

Thank you for the library! I'm a German former Turbo Pascal and Delphi programmer and have tried it with Lazarus. It works fine with the 'text-davinci-003' model, but I could not use it with 'gpt-3.5-turbo' or 'gpt-4'. My code is:

Request.Prompt := 'Was ist ein ' + SQLQuery1.FieldByName ('Title').AsString + '?';
Request.Model := 'gpt-3.5-turbo';
Request.MaxTokens := 2048; // Be careful as this can quickly consume your API quota.
Response := Client.OpenAI.CreateCompletion(Request);

Where can I find information about how to do this right?

And yes, I have a ChatGPT Plus subscription and a paid API key. Auto-GPT runs fine.

Thomas

Hi, I saw this and thought it might help you. It has some good info: How to chat with GPT – Code Blog

OK, so this runs with Python, right? Must Python be installed on the PC? Or is it integrated in maXbox? And your solution does not use the unit from this site? And must I install maXbox? I'm not sure whether this is the ideal solution.

It is actually Delphi and Lazarus Pascal code. The latest versions of Delphi can run Python, but you would need Python installed on your computer. I personally prefer just using Delphi and Pascal. The Delphi compiler will let you run your final product on macOS, iOS, Windows, Linux and Android.
You can install and run Python without having Delphi. Lazarus is also an excellent free option.
Personally I don't use Python, but it is quite easy to use. Pascal was developed as a learning language, and Delphi is a really good development environment with excellent components which will give your app users a nice 'end user experience' and will save you hours of frustration. The sample code provided here is an excellent starting point.
Please note that I'm a visitor to this blog and am just giving you my opinion; others might disagree.

I completely understand. I use Python for little experiments, but for my new project I want to use Lazarus, because I can compile the program and I want a windowed GUI. Years ago I developed a database-backed tree editor with Delphi and Lazarus, and I want to use it for my new project. The unit from 'landgraf' seems perfect for me, except that I now understand I cannot call up the complete GPT-4 power with it.


It should be possible to do what is described on 'softwareschule.code.blog' in Lazarus, just with an HTTP request, right?

You can use TRESTClient, TRESTRequest and TRESTResponse with JSON.
It's all TCP/IP, so anything TCP/IP-based would bring you to the same result.
THTTPClient is sufficient.
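For Lazarus/FPC, a minimal sketch along those lines using TFPHTTPClient (the hard-coded JSON body and the OPENAI_API_KEY environment variable are only placeholders; real code would build the body with fpjson and parse the JSON response):

uses
  Classes, SysUtils, fphttpclient, opensslsockets;

var
  Http: TFPHTTPClient;
  Answer: string;
begin
  Http := TFPHTTPClient.Create(nil);
  try
    Http.AddHeader('Authorization',
      'Bearer ' + GetEnvironmentVariable('OPENAI_API_KEY'));
    Http.AddHeader('Content-Type', 'application/json');

    // Hard-coded body just for illustration; build it with fpjson in real code.
    Http.RequestBody := TStringStream.Create(
      '{"model":"gpt-3.5-turbo","messages":[{"role":"user","content":"Hello!"}]}');
    try
      Answer := Http.Post('https://api.openai.com/v1/chat/completions');
      // Answer is the raw JSON response; the reply text is in
      // choices[0].message.content and can be extracted with fpjson.
      WriteLn(Answer);
    finally
      Http.RequestBody.Free;
    end;
  finally
    Http.Free;
  end;
end.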

I will update the OpenAI ChatGPT API Client for Delphi and Lazarus as soon as I can to support GPT-4 and the new API endpoints. Sorry for the delay, but I have been very busy these last few weeks. It's coming, though. Thanks for all your support and contributions.


@Landgraf: That would be great, of course!

Well, you don't have to apologise! For me, it's great that something like this exists! I was really happy when I discovered it. But now I'm sitting here on hot coals and can't wait! I'm working on a tool for automatic book creation and everything works so far, but the text quality with 'text-davinci-003' is not so good and I can't wait to try out GPT-4. 🙂