Class: OmniAI::Llama::Chat

Inherits:
Chat
  • Object
Defined in:
lib/omniai/llama/chat.rb,
lib/omniai/llama/chat/stream.rb,
lib/omniai/llama/chat/usage_serializer.rb,
lib/omniai/llama/chat/choice_serializer.rb,
lib/omniai/llama/chat/content_serializer.rb,
lib/omniai/llama/chat/response_serializer.rb

Overview

A Llama chat implementation.

Usage:

completion = OmniAI::Llama::Chat.process!(client: client) do |prompt|
  prompt.system('You are an expert in the field of AI.')
  prompt.user('What are the biggest risks of AI?')
end
completion.choice.message.content # '...'
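The block passed to process! builds the prompt message by message. As a minimal, self-contained sketch of that prompt-builder pattern — PromptSketch is a hypothetical stand-in, not part of OmniAI; the real OmniAI::Chat::Prompt exposes the same system/user methods:

```ruby
# Hypothetical stand-in for the prompt builder yielded by Chat.process!.
class PromptSketch
  attr_reader :messages

  def initialize
    @messages = []
  end

  # Appends a system message, mirroring prompt.system(...) above.
  def system(text)
    @messages << { role: "system", content: text }
  end

  # Appends a user message, mirroring prompt.user(...) above.
  def user(text)
    @messages << { role: "user", content: text }
  end

  # Yields a fresh prompt to the block, as process! does.
  def self.build
    prompt = new
    yield prompt
    prompt
  end
end

prompt = PromptSketch.build do |p|
  p.system("You are an expert in the field of AI.")
  p.user("What are the biggest risks of AI?")
end
```

After the block runs, prompt.messages holds the ordered system and user messages that would be sent to the model.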

Defined Under Namespace

Modules: ChoiceSerializer, ContentSerializer, Model, ResponseSerializer, UsageSerializer Classes: Stream

Constant Summary

JSON_RESPONSE_FORMAT =
{ type: "json_object" }.freeze
DEFAULT_MODEL =
Model::LLAMA_4_SCOUT
CONTEXT =
Context.build do |context|
  context.deserializers[:response] = ResponseSerializer.method(:deserialize)
  context.deserializers[:choice] = ChoiceSerializer.method(:deserialize)
  context.deserializers[:content] = ContentSerializer.method(:deserialize)
  context.deserializers[:usage] = UsageSerializer.method(:deserialize)
end

Returns:

  • (Context)
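CONTEXT above registers a deserializer per payload kind by storing each serializer module's deserialize as a Method object. A minimal, self-contained sketch of that registry pattern — ContextSketch and UsageSketch are hypothetical stand-ins, and the field mapping inside deserialize is illustrative only:

```ruby
# Hypothetical stand-in for the Context built above: a hash of
# callables keyed by payload kind (:response, :choice, :usage, ...).
class ContextSketch
  attr_reader :deserializers

  def initialize
    @deserializers = {}
  end

  # Yields the context to the block, as Context.build does above.
  def self.build
    context = new
    yield context
    context
  end
end

# Stand-in for a serializer module such as UsageSerializer.
module UsageSketch
  # Illustrative mapping from a raw API hash to a normalized one.
  def self.deserialize(data)
    { input_tokens: data["prompt_tokens"], output_tokens: data["completion_tokens"] }
  end
end

context = ContextSketch.build do |ctx|
  # Module#method returns a Method object the registry can call later.
  ctx.deserializers[:usage] = UsageSketch.method(:deserialize)
end

usage = context.deserializers[:usage].call(
  { "prompt_tokens" => 10, "completion_tokens" => 5 }
)
```

Storing Method objects rather than lambdas keeps each deserializer defined next to the payload shape it handles, while the context decides which one to invoke for each part of the response.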