
add a chatbox example #205


Open · wants to merge 2 commits into main
27 changes: 27 additions & 0 deletions Examples/Package.resolved

Some generated files are not rendered by default.

10 changes: 10 additions & 0 deletions Examples/Package.swift
@@ -22,6 +22,10 @@ let package = Package(
url: "https://github.com/stackotter/swift-bundler",
revision: "d42d7ffda684cfed9edcfd3581b8127f1dc55c2e"
),
.package(
url: "https://github.com/MacPaw/OpenAI",
from: "0.4.4"
),
],
targets: [
.executableTarget(
@@ -72,6 +76,12 @@ let package = Package(
.executableTarget(
name: "WebViewExample",
dependencies: exampleDependencies
),
.executableTarget(
name: "ChatbotExample",
dependencies: exampleDependencies + [
.product(name: "OpenAI", package: "OpenAI")
]
)
]
)
47 changes: 47 additions & 0 deletions Examples/Sources/ChatbotExample/ChatbotApp.swift
@@ -0,0 +1,47 @@
import DefaultBackend
import SwiftCrossUI

#if canImport(SwiftBundlerRuntime)
import SwiftBundlerRuntime
#endif

// MARK: - Main App

@main
@HotReloadable
struct ChatbotApp: App {
@SwiftCrossUI.State private var viewModel = ChatbotViewModel()
Owner: Use @State instead of @SwiftCrossUI.State
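Per that comment, the change is a one-line swap; this sketch assumes no other imported module in the example declares a conflicting `State` wrapper (which is presumably why the qualified form was written in the first place):

```swift
// Before: module-qualified property wrapper
@SwiftCrossUI.State private var viewModel = ChatbotViewModel()

// After: the plain wrapper, as suggested (SwiftCrossUI exports State directly)
@State private var viewModel = ChatbotViewModel()
```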


var body: some Scene {
WindowGroup("ChatBot") {
#hotReloadable {
NavigationSplitView {
// Sidebar content
Owner: Remove these section comments from this app body. The parts are all relatively self-explanatory already, imo.

ThreadSidebarView(
threads: viewModel.threadsBinding,
selectedThread: viewModel.selectedThreadBinding,
onNewThread: viewModel.createNewThread,
onSelectThread: viewModel.selectThread,
onDeleteThread: viewModel.deleteThread
)
} detail: {
// Main chat area
MainChatView(viewModel: viewModel)
}
.overlay {
// Settings Overlay
if viewModel.showSettings {
ChatSettingsDialog(
isPresented: viewModel.showSettingsBinding,
selectedModel: viewModel.selectedLLMBinding,
openAIService: viewModel.openAIService,
apiKeyStorage: viewModel.apiKeyStorage,
onSave: viewModel.reloadAPIKey
)
}
}
}
}
.defaultSize(width: 1200, height: 800)
}
}
38 changes: 38 additions & 0 deletions Examples/Sources/ChatbotExample/Models/ChatError.swift
@@ -0,0 +1,38 @@
import Foundation

// MARK: - Error Types

Owner (on lines +3 to +4): Remove the MARKs (just a codebase style preference)

enum ChatError: Error, LocalizedError {
case missingAPIKey
case invalidURL
case encodingError
case decodingError
case invalidResponse
case apiError(Int)

var errorDescription: String? {
switch self {
case .missingAPIKey:
return "Please enter your OpenAI API key"
case .invalidURL:
return "Invalid URL"
case .encodingError:
return "Failed to encode request"
case .decodingError:
return "Failed to decode response"
case .invalidResponse:
return "Invalid response from server"
case .apiError(let code):
switch code {
case 401:
return "API Error 401: Invalid API key. Please check your OpenAI API key and make sure it's valid and has sufficient credits."
case 429:
return "API Error 429: Rate limit exceeded. Please wait a moment and try again."
Owner: Remove the error code from the errors with nice human-readable messages. E.g. the 401 error should read "Invalid API key...".

Keep the default case and the server error case as they are, because those error codes might carry more information.

case 500, 502, 503:
return "API Error \(code): OpenAI server error. Please try again later."
default:
return "API error: \(code)"
}
}
}
}
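Applying the owner's suggestion above, the `apiError` branch might read as follows (a sketch, not part of the diff — only the 401 and 429 messages drop their codes, while the server and default cases keep them):

```swift
import Foundation

enum ChatError: Error, LocalizedError {
    case apiError(Int)

    var errorDescription: String? {
        switch self {
        case .apiError(let code):
            switch code {
            case 401:
                return "Invalid API key. Please check your OpenAI API key and make sure it's valid and has sufficient credits."
            case 429:
                return "Rate limit exceeded. Please wait a moment and try again."
            case 500, 502, 503:
                // Server errors keep the code, since it may carry extra information.
                return "API Error \(code): OpenAI server error. Please try again later."
            default:
                return "API error: \(code)"
            }
        }
    }
}
```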
10 changes: 10 additions & 0 deletions Examples/Sources/ChatbotExample/Models/ChatMessage.swift
@@ -0,0 +1,10 @@
import Foundation

// MARK: - Data Models

struct ChatMessage: Identifiable {
let id = UUID()
let content: String
let isUser: Bool
let timestamp: Date
}
47 changes: 47 additions & 0 deletions Examples/Sources/ChatbotExample/Models/ChatThread.swift
@@ -0,0 +1,47 @@
import Foundation

// MARK: - Thread Models

struct ChatThread: Identifiable, Codable {
Owner: You may want to consider making the fields of your model types var instead of let so that you don't have to manually create memberwise initialisers. To achieve the same init signatures as you have now, you can then just provide initial default values for the fields that currently get their defaults from the initialiser (which you won't be able to do while all of the fields are let, of course).

let id: String
Owner: Make the id a UUID like you did for ChatMessage.

let title: String
let createdAt: Date
let lastMessageAt: Date
let openAIThreadId: String? // OpenAI Thread ID for API integration

init(id: String = UUID().uuidString, title: String, openAIThreadId: String? = nil) {
self.id = id
self.title = title
self.createdAt = Date()
self.lastMessageAt = Date()
self.openAIThreadId = openAIThreadId
}

func updated(with lastMessageTime: Date = Date()) -> ChatThread {
return ChatThread(
id: self.id,
title: self.title,
openAIThreadId: self.openAIThreadId
)
}
}
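Combining the review suggestions (var fields with default values, a UUID id), the model could be sketched as below. As a side effect, the synthesized memberwise initialiser replaces the hand-written one, and `updated(with:)` can actually apply the new timestamp — the version above silently drops its `lastMessageTime` argument:

```swift
import Foundation

struct ChatThread: Identifiable, Codable {
    var id = UUID()
    var title: String
    var createdAt = Date()
    var lastMessageAt = Date()
    var openAIThreadId: String? = nil  // OpenAI thread ID for API integration

    // Default values give the synthesized initialiser the same shape as
    // before, e.g. ChatThread(title: "New chat").
    func updated(with lastMessageTime: Date = Date()) -> ChatThread {
        var copy = self
        copy.lastMessageAt = lastMessageTime
        return copy
    }
}
```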

// MARK: - Thread Message

struct ThreadMessage: Identifiable, Codable {
Owner: Move this to a separate file

let id: String
Owner: Make the id a UUID like you did for ChatMessage.

let threadId: String
let content: String
let isUser: Bool
let timestamp: Date
let openAIMessageId: String? // OpenAI Message ID for API integration

init(id: String = UUID().uuidString, threadId: String, content: String, isUser: Bool, openAIMessageId: String? = nil) {
self.id = id
self.threadId = threadId
self.content = content
self.isUser = isUser
self.timestamp = Date()
self.openAIMessageId = openAIMessageId
}
}
4 changes: 4 additions & 0 deletions Examples/Sources/ChatbotExample/Models/LLMType.swift
@@ -0,0 +1,4 @@
import OpenAI
Owner: Rename this file to LLM.swift


// Type alias to avoid confusion between OpenAI's Model and our ViewModels
typealias LLM = Model
29 changes: 29 additions & 0 deletions Examples/Sources/ChatbotExample/Services/APIKeyStorage.swift
@@ -0,0 +1,29 @@
import Foundation

// MARK: - API Key Storage

class APIKeyStorage {
private let userDefaults = UserDefaults.standard
private let apiKeyKey = "OpenAI_API_Key"

func saveAPIKey(_ key: String) {
userDefaults.set(key, forKey: apiKeyKey)
userDefaults.synchronize() // Force immediate synchronization
print("🔑 API key saved to disk")
Owner: This comment applies to all print statements in this example. None of the other examples print anything except in cases of unrecoverable errors not presentable to the user. Remove all of these print statements.

}

func loadAPIKey() -> String? {
let key = userDefaults.string(forKey: apiKeyKey)
if let key = key, !key.isEmpty {
print("🔑 API key loaded from disk successfully")
return key
}
return nil
}

func deleteAPIKey() {
userDefaults.removeObject(forKey: apiKeyKey)
userDefaults.synchronize() // Force immediate synchronization
print("🗑️ API key deleted from disk")
}
Owner (on lines +9 to +28): SwiftCrossUI example apps must run on all platforms supported by SwiftCrossUI, so we can't use UserDefaults. See NotesExample for how we currently do persistence.

@bbrk24 is working on a persistence solution, but it's WIP, so for now the easiest option is storing the key in a JSON file.

}
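A minimal JSON-file-backed replacement for the UserDefaults-based storage, along the lines of the owner's suggestion. The file name and directory below are illustrative assumptions, not part of the PR:

```swift
import Foundation

class APIKeyStorage {
    private struct Stored: Codable {
        var apiKey: String
    }

    // Hypothetical location; the real example would pick a path that
    // suits all platforms SwiftCrossUI supports.
    private var fileURL: URL {
        FileManager.default
            .urls(for: .applicationSupportDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("ChatbotExample", isDirectory: true)
            .appendingPathComponent("api-key.json")
    }

    func saveAPIKey(_ key: String) {
        do {
            try FileManager.default.createDirectory(
                at: fileURL.deletingLastPathComponent(),
                withIntermediateDirectories: true
            )
            let data = try JSONEncoder().encode(Stored(apiKey: key))
            try data.write(to: fileURL, options: .atomic)
        } catch {
            // Persistence failed; surfacing the error is left to the caller.
        }
    }

    func loadAPIKey() -> String? {
        guard let data = try? Data(contentsOf: fileURL),
            let stored = try? JSONDecoder().decode(Stored.self, from: data),
            !stored.apiKey.isEmpty
        else { return nil }
        return stored.apiKey
    }

    func deleteAPIKey() {
        try? FileManager.default.removeItem(at: fileURL)
    }
}
```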
130 changes: 130 additions & 0 deletions Examples/Sources/ChatbotExample/Services/OpenAIService.swift
@@ -0,0 +1,130 @@
import Foundation
import SwiftCrossUI
import OpenAI

// MARK: - OpenAI Service

class OpenAIService {
private var openAI: OpenAI?

func configure(apiKey: String) {
openAI = OpenAI(apiToken: apiKey)
print("🔑 OpenAI client configured successfully")
}

// MARK: - Thread-based Chat

func sendMessageToThread(_ message: String, threadMessages: [ThreadMessage], model: LLM) async throws -> String {
guard let openAI = openAI else {
print("❌ OpenAI client not configured")
throw ChatError.missingAPIKey
}

print("📤 Sending message to thread conversation")
print("📝 Message length: \(message.count) characters")
print("🤖 Model: \(model)")
print("📚 Context messages: \(threadMessages.count)")

// Build conversation messages array from thread history
var allMessages = threadMessages.compactMap { threadMessage in
if threadMessage.isUser {
return ChatQuery.ChatCompletionMessageParam(role: .user, content: threadMessage.content)
} else {
return ChatQuery.ChatCompletionMessageParam(role: .assistant, content: threadMessage.content)
}
}

// Add the new user message
if let newMessage = ChatQuery.ChatCompletionMessageParam(role: .user, content: message) {
allMessages.append(newMessage)
}

let query = ChatQuery(
messages: allMessages,
model: model
)

do {
let result = try await openAI.chats(query: query)
let response = result.choices.first?.message.content ?? "No response"
print("✅ Successfully received response from thread conversation")
return response
} catch let error as APIError {
print("❌ OpenAI API Error: \(error)")
throw ChatError.invalidResponse
} catch {
print("❌ Network error: \(error)")
throw ChatError.invalidResponse
}
}

func fetchAvailableModels() async throws -> [Model] {
guard let openAI = openAI else {
print("❌ OpenAI client not configured")
throw ChatError.missingAPIKey
}

print("📋 Fetching available models from OpenAI API")

do {
let modelsResponse = try await openAI.models()
let availableModels = modelsResponse.data.compactMap { modelData -> Model? in
// Filter for the 5 core chat models we support
let id = modelData.id
switch id {
case "gpt-4o":
return .gpt4_o
case "gpt-4o-mini":
return .gpt4_o_mini
case "gpt-4-turbo":
return .gpt4_turbo
case "gpt-4":
return .gpt4
case "gpt-3.5-turbo":
return .gpt3_5Turbo
default:
return nil
}
}

print("✅ Found \(availableModels.count) available chat models")
return Array(Set(availableModels)) // Remove duplicates
Owner: Is the API actually returning duplicates? If not, this round-trip through Set can be avoided.

} catch {
print("❌ Failed to fetch models: \(error)")
// Return default models as fallback
return [.gpt4_o, .gpt4_o_mini, .gpt4_turbo, .gpt4, .gpt3_5Turbo]
}
}

func sendMessage(_ message: String, model: LLM) async throws -> String {
guard let openAI = openAI else {
print("❌ OpenAI client not configured")
throw ChatError.missingAPIKey
}

print("📤 Sending message to OpenAI API")
print("📝 Message length: \(message.count) characters")
print("🤖 Model: \(model)")

let query = ChatQuery(
messages: [
.user(.init(content: .string(message)))
],
model: model
)

do {
let result = try await openAI.chats(query: query)
let response = result.choices.first?.message.content ?? "No response"
print("✅ Successfully received response (\(response.count) characters)")
return response
} catch let error as APIError {
print("❌ OpenAI API Error: \(error)")
// Handle different API errors
throw ChatError.invalidResponse
} catch {
print("❌ Network error: \(error)")
throw ChatError.invalidResponse
}
}
}
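If the models endpoint does not in fact return duplicate IDs, the switch plus Set round-trip flagged above could collapse into a dictionary lookup — a sketch using the same MacPaw OpenAI `Model` cases as the PR:

```swift
import OpenAI

// Supported chat models, keyed by the IDs the models endpoint reports.
let supportedModels: [String: Model] = [
    "gpt-4o": .gpt4_o,
    "gpt-4o-mini": .gpt4_o_mini,
    "gpt-4-turbo": .gpt4_turbo,
    "gpt-4": .gpt4,
    "gpt-3.5-turbo": .gpt3_5Turbo,
]

func chatModels(fromIDs ids: [String]) -> [Model] {
    // compactMap drops unsupported IDs; no Set needed if IDs are unique.
    ids.compactMap { supportedModels[$0] }
}
```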