Module: OmniAI::Llama::Chat::ResponseSerializer
- Defined in:
- lib/omniai/llama/chat/response_serializer.rb
Overview
Overrides response serialize / deserialize for the following payload:
{
  completion_message: {
    content: {
      type: "text",
      text: "Hello!",
    },
    role: "assistant",
    stop_reason: "stop",
    tool_calls: [],
  },
  metrics: [
    {
      metric: "num_completion_tokens",
      value: 2,
      unit: "tokens",
    },
    {
      metric: "num_prompt_tokens",
      value: 3,
      unit: "tokens",
    },
    {
      metric: "num_total_tokens",
      value: 4,
      unit: "tokens",
    },
  ],
}
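Note that the metrics array is a list of { metric, value, unit } entries rather than a flat usage hash, so token counts have to be picked out by metric name. The sketch below shows that mapping in plain Ruby for reference only; it is not the serializer's internal code, and the string keys reflect the raw JSON payload as received from the API.

metrics = [
  { "metric" => "num_completion_tokens", "value" => 2, "unit" => "tokens" },
  { "metric" => "num_prompt_tokens", "value" => 3, "unit" => "tokens" },
  { "metric" => "num_total_tokens", "value" => 4, "unit" => "tokens" },
]

# Index the entries by metric name to read the token counts directly.
counts = metrics.to_h { |entry| [entry["metric"], entry["value"]] }

counts.fetch("num_prompt_tokens")     # => 3
counts.fetch("num_completion_tokens") # => 2
counts.fetch("num_total_tokens")      # => 4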
Class Method Summary
Class Method Details
.deserialize(data, context:) ⇒ OmniAI::Chat::Response
# File 'lib/omniai/llama/chat/response_serializer.rb', line 41

def self.deserialize(data, context:)
  usage = OmniAI::Chat::Usage.deserialize(data["metrics"], context:) if data["metrics"]
  choice = OmniAI::Chat::Choice.deserialize(data["completion_message"], context:)

  OmniAI::Chat::Response.new(data:, choices: [choice], usage:)
end
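A minimal usage sketch follows. It assumes the raw API response has already been parsed into a Hash with string keys, that a serializer context (the local `context` below) is supplied by the Llama chat client, and that `OmniAI::Chat::Response` exposes `choices` and `usage` readers; none of these assumptions are shown in the documented source above.

payload = {
  "completion_message" => {
    "content" => { "type" => "text", "text" => "Hello!" },
    "role" => "assistant",
    "stop_reason" => "stop",
    "tool_calls" => [],
  },
  "metrics" => [
    { "metric" => "num_completion_tokens", "value" => 2, "unit" => "tokens" },
    { "metric" => "num_prompt_tokens", "value" => 3, "unit" => "tokens" },
    { "metric" => "num_total_tokens", "value" => 4, "unit" => "tokens" },
  ],
}

# `context` is the serializer context normally provided by the chat client (assumed here).
response = OmniAI::Llama::Chat::ResponseSerializer.deserialize(payload, context: context)

response.choices.first # => OmniAI::Chat::Choice wrapping the "Hello!" message
response.usage         # => OmniAI::Chat::Usage built from the metrics array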