app.MapPost("plugins/{pluginName}/invoke/{functionName}", async (HttpContext context, Query query, string pluginName, string functionName) =>
WRITE EXACTLY ONE JOKE or HUMOROUS STORY ABOUT THE TOPIC BELOW
JOKE MUST BE:
- G RATED
- WORKPLACE/FAMILY SAFE
NO SEXISM, RACISM OR OTHER BIAS/BIGOTRY
BE CREATIVE AND FUNNY. I WANT TO LAUGH.
+++++
{{$input}}
+++++
{
  "schema": 1,
  "description": "Generate a funny joke",
  "type": "completion",
  "completion": {
    "max_tokens": 1000,
    "temperature": 0.9,
    "top_p": 0.0,
    "presence_penalty": 0.0,
    "frequency_penalty": 0.0
  },
  "input": {
    "parameters": [
      {
        "name": "input",
        "description": "Joke subject",
        "defaultValue": ""
      }
    ]
  }
}
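Semantic Kernel loads each semantic function from its own folder under the plugin directory, with the prompt and its configuration side by side. With the route above, a plugin named FunPlugin exposing a Joke function (both names are illustrative; use whatever your plugin and function are called) would be laid out like this:

```
Plugins/
└── FunPlugin/
    └── Joke/
        ├── skprompt.txt   (the prompt shown above)
        └── config.json    (the completion settings shown above)
```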
Getting back to the code, the next step in the endpoint is to read the header values passed with the request. These headers carry the Azure OpenAI or OpenAI settings needed to execute the query. Below is an example with Azure OpenAI.
// Read the Azure OpenAI settings from the request headers
var headers = context.Request.Headers;
var model = headers["x-sk-web-app-model"];
var endpoint = headers["x-sk-web-app-endpoint"];
var key = headers["x-sk-web-app-key"];

// Build a kernel configured with the Azure OpenAI text completion service
var kernel = new KernelBuilder()
    .WithAzureTextCompletionService(model!, endpoint!, key!)
    .Build();

// Load the requested plugin's semantic functions from the Plugins folder
var pluginDirectory = "Plugins";
var plugInFunctions = kernel.ImportSemanticSkillFromDirectory(pluginDirectory, pluginName);
And finally, invoke the Semantic Kernel function and return the result.
// Invoke the function, passing the query value as the $input parameter
var result = await plugInFunctions[functionName].InvokeAsync(query.Value);

var response = new SKResponse { Value = result.Result.Trim() };
return Results.Json(response);
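The endpoint relies on two small DTOs, Query (the request body) and SKResponse (the reply), whose definitions are not shown above. A minimal sketch of what they might look like (your actual definitions may differ):

```
// Request body: { "value": "..." } binds to Query.Value
public class Query
{
    public string Value { get; set; } = string.Empty;
}

// Reply body serialized by Results.Json
public class SKResponse
{
    public string Value { get; set; } = string.Empty;
}
```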
If you are using Postman, ask the service for a joke. The response body looks like this:
{
  "value": "A bear was looking for a job and decided to become an Uber driver. He was a great driver and always got five-star reviews from his passengers. One day, a passenger asked him why he was so good at his job. The bear replied, \"It's simple, I just follow the bear necessities of life!\""
}
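The same request can be sent from the command line with curl. The port, plugin and function names, and header values below are placeholders; substitute your own deployment name, endpoint, and key:

```shell
# POST the joke subject to the plugin endpoint
# (names, port, and credentials are placeholders)
curl -s -X POST "http://localhost:5000/plugins/FunPlugin/invoke/Joke" \
  -H "Content-Type: application/json" \
  -H "x-sk-web-app-model: <your-deployment-name>" \
  -H "x-sk-web-app-endpoint: https://<your-resource>.openai.azure.com/" \
  -H "x-sk-web-app-key: <your-api-key>" \
  -d '{ "value": "bears" }'
```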