OPENAI API Completion not returning text

I am using Node.js and want to use the OpenAI API.

I copied the code from the OpenAI Playground, and it looks like this:

export const askOpenAi = async () => {
    const response = await openai.createCompletion("text-davinci-001", {
        prompt: "\ninput: What is human life expectancy in the United States?\n",
        temperature: 0,
        max_tokens: 100,
        top_p: 1,
        frequency_penalty: 0,
        presence_penalty: 0,
        stop: ["\n", "\ninput:"],
    });
    return response.data;
}

The data OpenAI returns looks like this:

{
  id: '~~~',
  object: 'text_completion',
  created: ~~~,
  model: 'text-davinci:001',
  choices: [ { text: '', index: 0, logprobs: null, finish_reason: 'stop' } ]
}

In the playground, this code works very well.

How can I get the right response?

Consecutive answered 28/1, 2022 at 11:52 Comment(1)
Oh.. I just fixed it. Change the prompt into -> prompt: `\n\nQ: ${question}\nA:` – Consecutive

It should be something like this:

export const askOpenAi = async () => {
    const prompt = `input: What is human life expectancy in the United States?
output:`
    const response = await openai.createCompletion("text-davinci-001", {
        prompt: prompt,
        temperature: 0,
        max_tokens: 100,
        top_p: 1,
        frequency_penalty: 0,
        presence_penalty: 0,
        stop: ["input:"],
    });
    return response.data;
}

Here, first of all, remove the \n from the stop array, because otherwise it stops the completion at every newline (an answer could span multiple lines). Secondly, there is no need to add an extra \n before input:; it doesn't actually matter.
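
To see why stop: ["\n"] produces an empty completion, here is a minimal local sketch (the applyStop helper is made up for illustration; it is not part of the openai SDK) that mimics how the API truncates a completion at the first stop sequence it finds:

```javascript
// Hypothetical helper (not part of the openai SDK): cut the completion
// at the first occurrence of any stop sequence.
function applyStop(completion, stops) {
  let cut = completion.length;
  for (const stop of stops) {
    const i = completion.indexOf(stop);
    if (i !== -1 && i < cut) cut = i;
  }
  return completion.slice(0, cut);
}

// The model often begins its answer with a newline, so stopping on "\n"
// truncates the text to an empty string:
console.log(applyStop("\nAbout 77 years.", ["\n"]));     // ""
console.log(applyStop("\nAbout 77 years.", ["input:"])); // "\nAbout 77 years."
```

That matches the question's output, where finish_reason is 'stop' but text is ''.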

Finally, remember to give some clue about the completion you're expecting by adding output: at the end of your prompt.
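
As a sketch, the prompt can be built with a small helper (the buildPrompt name is hypothetical) so that it always ends with the output: cue:

```javascript
// Hypothetical helper; the "input:"/"output:" labels mirror the prompt
// format used in the answer above.
const buildPrompt = (question) => `input: ${question}\noutput:`;

console.log(buildPrompt("What is human life expectancy in the United States?"));
// input: What is human life expectancy in the United States?
// output:
```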

By the way, this type of question-and-answer completion can be achieved through OpenAI's newer instruct models as well.

const prompt = `Answer the following question:
What is human life expectancy in the United States?
{}`
const response = await openai.createCompletion("text-davinci-001", {
    prompt: prompt,
    temperature: .7,
    max_tokens: 100,
    top_p: 1,
    frequency_penalty: 0,
    presence_penalty: 0,
    stop: ["{}"],
});
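
Either way, the completion text lives at response.data.choices[0].text in this SDK version. A minimal sketch with a mocked response object (no API call; the mock copies the shape shown in the question) shows how to pull the answer out:

```javascript
// Mocked response in the shape shown in the question (no real API call).
const response = {
  data: {
    object: "text_completion",
    choices: [{ text: "\nAbout 77 years.", index: 0, finish_reason: "stop" }],
  },
};

// Trim the leading newline the model often emits before the answer.
const answer = response.data.choices[0].text.trim();
console.log(answer); // "About 77 years."
```
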
Saga answered 14/6, 2022 at 4:59 Comment(0)

Just change stop: ["\n"] to something different. Worked in my case!

Desalinate answered 11/1, 2023 at 13:21 Comment(0)

I had this same issue. What solved it for me was to add a whitespace (not a \n or {}) at the end of the prompt, and to omit the stop key entirely from the request params.
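
A sketch of that workaround, with a hypothetical buildParams helper: the trailing space is appended to the prompt, and no stop key appears in the request params at all:

```javascript
// Hypothetical helper: append a trailing space to the prompt and leave
// the optional `stop` key out of the request params entirely.
function buildParams(prompt) {
  return {
    prompt: prompt + " ", // trailing whitespace, not "\n" or "{}"
    temperature: 0,
    max_tokens: 100,
  };
}

const params = buildParams(
  "input: What is human life expectancy in the United States?\noutput:"
);
console.log("stop" in params);            // false
console.log(params.prompt.endsWith(" ")); // true
```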

Lalita answered 1/5, 2023 at 13:47 Comment(0)
