Module: OmniAI::Llama::Chat::UsageSerializer
- Defined in:
- lib/omniai/llama/chat/usage_serializer.rb
Overview
Overrides the response serialize / deserialize behavior for usage payloads of the following shape:
[
  {
    metric: "num_completion_tokens",
    value: 2,
    unit: "tokens",
  },
  {
    metric: "num_prompt_tokens",
    value: 3,
    unit: "tokens",
  },
  {
    metric: "num_total_tokens",
    value: 4,
    unit: "tokens",
  },
]
Defined Under Namespace
Modules: Metric
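The Metric module is not expanded on this page. Based on the metric names in the payload above and the constants referenced in .deserialize below, it presumably holds string constants along these lines (a sketch inferred from this page, not the verbatim source):

module Metric
  # Metric names appearing in the usage payload (values assumed from the
  # example payload and the lookups performed in .deserialize).
  NUM_PROMPT_TOKENS = "num_prompt_tokens"
  NUM_COMPLETION_TOKENS = "num_completion_tokens"
  NUM_TOTAL_TOKENS = "num_total_tokens"
end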
Class Method Summary
Class Method Details
.deserialize(data) ⇒ OmniAI::Chat::Usage
# File 'lib/omniai/llama/chat/usage_serializer.rb', line 35

def self.deserialize(data, *)
  prompt = data.find { |metric| metric["metric"] == Metric::NUM_PROMPT_TOKENS }
  completion = data.find { |metric| metric["metric"] == Metric::NUM_COMPLETION_TOKENS }
  total = data.find { |metric| metric["metric"] == Metric::NUM_TOTAL_TOKENS }

  input_tokens = prompt ? prompt["value"] : 0
  output_tokens = completion ? completion["value"] : 0
  total_tokens = total ? total["value"] : 0

  OmniAI::Chat::Usage.new(input_tokens:, output_tokens:, total_tokens:)
end
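A hedged usage sketch: deserializing the example payload from the overview with string keys (as .deserialize indexes entries with metric["metric"] and metric["value"]), assuming the gem is loaded via omniai-llama and that OmniAI::Chat::Usage exposes readers for the keyword arguments it is constructed with:

require "omniai/llama"  # assumed gem entry point

# Payload as it would arrive after JSON parsing (string keys).
payload = [
  { "metric" => "num_completion_tokens", "value" => 2, "unit" => "tokens" },
  { "metric" => "num_prompt_tokens", "value" => 3, "unit" => "tokens" },
  { "metric" => "num_total_tokens", "value" => 4, "unit" => "tokens" },
]

usage = OmniAI::Llama::Chat::UsageSerializer.deserialize(payload)

usage.input_tokens   # => 3 (from "num_prompt_tokens"); assumed reader
usage.output_tokens  # => 2 (from "num_completion_tokens"); assumed reader
usage.total_tokens   # => 4 (from "num_total_tokens"); assumed reader

Missing metrics fall back to 0, since each ternary in .deserialize defaults the token count when the corresponding entry is absent from the payload.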