
go-gpt3


OpenAI GPT-3 API wrapper for Go

Installation:

go get github.com/sashabaranov/go-gpt3

Example usage:

package main

import (
	"context"
	"fmt"

	gogpt "github.com/sashabaranov/go-gpt3"
)

func main() {
	c := gogpt.NewClient("your token")
	ctx := context.Background()

	req := gogpt.CompletionRequest{
		Model:     gogpt.GPT3Ada,
		MaxTokens: 5,
		Prompt:    "Lorem ipsum",
	}
	resp, err := c.CreateCompletion(ctx, req)
	if err != nil {
		fmt.Printf("Completion error: %v\n", err)
		return
	}
	fmt.Println(resp.Choices[0].Text)
}
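
Hard-coding the token is fine for a quick test, but in practice you will usually read it from the environment. A minimal sketch, assuming the key is exported as OPENAI_API_KEY (the variable name is just a convention used here, not something the library requires):

package main

import (
	"context"
	"fmt"
	"os"

	gogpt "github.com/sashabaranov/go-gpt3"
)

func main() {
	// Read the API key from the environment instead of embedding it in source.
	token := os.Getenv("OPENAI_API_KEY")
	if token == "" {
		fmt.Println("OPENAI_API_KEY is not set")
		return
	}
	c := gogpt.NewClient(token)

	req := gogpt.CompletionRequest{
		Model:     gogpt.GPT3Ada,
		MaxTokens: 5,
		Prompt:    "Lorem ipsum",
	}
	resp, err := c.CreateCompletion(context.Background(), req)
	if err != nil {
		fmt.Printf("Completion error: %v\n", err)
		return
	}
	fmt.Println(resp.Choices[0].Text)
}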

Streaming response example:

package main

import (
	"context"
	"errors"
	"fmt"
	"io"

	gogpt "github.com/sashabaranov/go-gpt3"
)

func main() {
	c := gogpt.NewClient("your token")
	ctx := context.Background()

	req := gogpt.CompletionRequest{
		Model:     gogpt.GPT3Ada,
		MaxTokens: 5,
		Prompt:    "Lorem ipsum",
		Stream:    true,
	}
	stream, err := c.CreateCompletionStream(ctx, req)
	if err != nil {
		fmt.Printf("CompletionStream error: %v\n", err)
		return
	}
	defer stream.Close()

	for {
		response, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			fmt.Println("Stream finished")
			return
		}

		if err != nil {
			fmt.Printf("Stream error: %v\n", err)
			return
		}

		fmt.Printf("Stream response: %v\n", response)
	}
}
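
Each Recv call returns one incremental chunk of the completion, with the generated text for that chunk in response.Choices[0].Text. A minimal sketch of collecting the chunks into the full completion text (the strings.Builder accumulation is an illustration, not part of the library):

package main

import (
	"context"
	"errors"
	"fmt"
	"io"
	"strings"

	gogpt "github.com/sashabaranov/go-gpt3"
)

func main() {
	c := gogpt.NewClient("your token")

	req := gogpt.CompletionRequest{
		Model:     gogpt.GPT3Ada,
		MaxTokens: 5,
		Prompt:    "Lorem ipsum",
		Stream:    true,
	}
	stream, err := c.CreateCompletionStream(context.Background(), req)
	if err != nil {
		fmt.Printf("CompletionStream error: %v\n", err)
		return
	}
	defer stream.Close()

	// Accumulate the incremental chunks into the full completion text.
	var sb strings.Builder
	for {
		response, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}
		if err != nil {
			fmt.Printf("Stream error: %v\n", err)
			return
		}
		if len(response.Choices) > 0 {
			sb.WriteString(response.Choices[0].Text)
		}
	}
	fmt.Println(sb.String())
}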