Merge remote-tracking branch 'origin/main' into fe-unit-test
rikaa15 committed Nov 11, 2023
2 parents 1c9e022 + 881d22a commit 56d3ffd
Showing 11 changed files with 205 additions and 55 deletions.
6 changes: 6 additions & 0 deletions progress/amanda-bancroft.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,6 @@
2023-11-13 Mon:

---

2023-11-10 Fri:

2 changes: 1 addition & 1 deletion progress/sergey-karasev.md
@@ -1,4 +1,4 @@
2023-11-10 Fri: Working on [truncating the accumulating context to a maximum of 512 characters](https://github.com/harmony-one/x/pull/159) and adding unit tests for the "limiter"
2023-11-10 Fri: Working on [truncating the accumulating context to a maximum of 512 characters](https://github.com/harmony-one/x/pull/159) and adding unit tests for the "limiter", [added the same logic to harmony one bot](https://github.com/harmony-one/HarmonyOneBot/pull/343)

2023-11-09 Thu: I have added [a window to display a link to the application](https://github.com/harmony-one/x/pull/142) (share feature) after the user taps "new session" button for the seventh time, also [added a throttler](https://github.com/harmony-one/x/pull/144) to the reset session function (in order not to interrupt the greeting). Clarified the code regarding tap-to-speak and [fixed the play-pause button state](https://github.com/harmony-one/x/pull/149)

8 changes: 7 additions & 1 deletion progress/theo-fandrich.md
@@ -1,4 +1,10 @@
2023-11-07 Tue: Product tested, listed out bugs, delegated fixes to engineers, submitted to App store connect.
2023-11-10 Fri: Made a 9am push that integrated new support for upcoming in-app purchases. Still working on a few small bugs, such as long press triggering tap functionality and the overlap of "Tap to Speak" & "Press & Hold" functionality, and will improve the logic for free credits.

2023-11-09 Thu: Made a huge push for 3 initial releases to the App Store today. Download Voice AI on the App Store [here](x.country/app).

2023-11-08 Wed: Worked with engineers to fix bugs regarding UI inconsistencies and conducted product testing for rate limiting.

2023-11-07 Tue: Product tested, listed out bugs, delegated fixes to engineers, submitted to App Store Connect.

2023-11-06 Mon: Worked on preparing the App Store submission. Calculated pricing for Voice AI and documented how the backend and iOS app should communicate to support the behavior. The updated UI will be implemented by Tuesday morning.

2 changes: 1 addition & 1 deletion progress/yuriy-menkov.md
@@ -1,4 +1,4 @@
2023-11-10 Fri: Resolved issue with long press actions (Ensure long press actions do not trigger tap actions vice versa). Working on tracking active using app time and showing suggestions to share with friends and share on Twitter.
2023-11-10 Fri: [Resolved](https://github.com/harmony-one/x/pull/161) an issue with long press actions (ensuring long press actions do not trigger tap actions and vice versa). Working on tracking active app usage time and showing suggestions to share with friends and on Twitter.

2023-11-09 Thu: [Added](https://github.com/harmony-one/x/pull/148/files) the ability to repeat the current session to resolve the repeat bug (when hitting "Repeat" during the first stream, it says "Hey" while the stream is going; it should instead start again from the beginning).

22 changes: 19 additions & 3 deletions voice/voice-ai/Voice AI.xcodeproj/project.pbxproj
@@ -91,7 +91,8 @@
B36367562AFC69F2000409FC /* RandomFactTests.swift in Sources */ = {isa = PBXBuildFile; fileRef = B31CDCE62AFC644D00AB39EE /* RandomFactTests.swift */; };
B36367572AFC6A04000409FC /* MockGenerator.swift in Sources */ = {isa = PBXBuildFile; fileRef = F7F1150D2AF825F700BC191C /* MockGenerator.swift */; };
B38ADB9D2AFE0A0F006BDC93 /* AppConfig.plist in Resources */ = {isa = PBXBuildFile; fileRef = F72708412AFD672E000DE81D /* AppConfig.plist */; };
B3C083B32AF1BADB0069232C /* MessageContextTests.swift in Sources */ = {isa = PBXBuildFile; fileRef = B3C083B22AF1BADB0069232C /* MessageContextTests.swift */; };
B3C083B32AF1BADB0069232C /* OpenAITests.swift in Sources */ = {isa = PBXBuildFile; fileRef = B3C083B22AF1BADB0069232C /* OpenAITests.swift */; };
B3C0FE4B2AFEC68800B712E7 /* IAPTests.swift in Sources */ = {isa = PBXBuildFile; fileRef = B3C0FE4A2AFEC68800B712E7 /* IAPTests.swift */; };
B3D0A3442AF29B1B00E8B0DA /* MockNetworkService.swift in Sources */ = {isa = PBXBuildFile; fileRef = B3D0A3432AF29B1B00E8B0DA /* MockNetworkService.swift */; };
B91681282AFBA3A80006E463 /* ReviewRequester.swift in Sources */ = {isa = PBXBuildFile; fileRef = B9B331A02AFB803500F6A9C9 /* ReviewRequester.swift */; };
B919B7BF2AF3C3F7006335D1 /* AudioEngineAndSessionTests.swift in Sources */ = {isa = PBXBuildFile; fileRef = B919B7BE2AF3C3F7006335D1 /* AudioEngineAndSessionTests.swift */; };
@@ -233,7 +234,8 @@
B346DF772AF562020023FC87 /* ThemeManager.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ThemeManager.swift; sourceTree = "<group>"; };
B3B3BA792AFB40A300D8F8C6 /* Theme.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = Theme.swift; sourceTree = "<group>"; };
B3C083B12AF1948D0069232C /* OpenAIService.xctestplan */ = {isa = PBXFileReference; lastKnownFileType = text; name = OpenAIService.xctestplan; path = x/OpenAIService.xctestplan; sourceTree = SOURCE_ROOT; };
B3C083B22AF1BADB0069232C /* MessageContextTests.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = MessageContextTests.swift; sourceTree = "<group>"; };
B3C083B22AF1BADB0069232C /* OpenAITests.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = OpenAITests.swift; sourceTree = "<group>"; };
B3C0FE4A2AFEC68800B712E7 /* IAPTests.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = IAPTests.swift; sourceTree = "<group>"; };
B3D0A3432AF29B1B00E8B0DA /* MockNetworkService.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = MockNetworkService.swift; sourceTree = "<group>"; };
B919B7BE2AF3C3F7006335D1 /* AudioEngineAndSessionTests.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = AudioEngineAndSessionTests.swift; sourceTree = "<group>"; };
B930AADC2ADE2DE5009F9F8C /* Color.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = Color.swift; sourceTree = "<group>"; };
@@ -597,14 +599,23 @@
F7F114FB2AF8172800BC191C /* VibrationManagerTests.swift */,
CD0D134C2ADA73B400031EDD /* ActionHandlerTests.swift */,
B9C4A81E2AEE594900327529 /* MockSpeechRecognition.swift */,
B9C4A8202AEE612A00327529 /* MockTextToSpeechConverter.swift */,
B3C083B22AF1BADB0069232C /* OpenAITests.swift */,
B3D0A3432AF29B1B00E8B0DA /* MockNetworkService.swift */,
B919B7BE2AF3C3F7006335D1 /* AudioEngineAndSessionTests.swift */,
F7F1150D2AF825F700BC191C /* MockGenerator.swift */,
B31CDCE62AFC644D00AB39EE /* RandomFactTests.swift */,
6E35CB4A2AFE86130004D2D2 /* OpenAIUtilsTests.swift */,
F723CC1E2AFECE2000B2A23A /* TextToSpeechConverterTests.swift */,
F723CC202AFEFFA400B2A23A /* MockAVSpeechSynthesizer.swift */,
B3C0FE4A2AFEC68800B712E7 /* IAPTests.swift */,
);
path = xTests;
sourceTree = "<group>";
@@ -1007,11 +1018,16 @@
6E35CB4C2AFE86130004D2D2 /* OpenAIUtilsTests.swift in Sources */,
B31CDCE52AFC565800AB39EE /* Theme.swift in Sources */,
B9C4A82A2AEE95C900327529 /* SpeechRecognition.swift in Sources */,
B3C0FE4B2AFEC68800B712E7 /* IAPTests.swift in Sources */,
B9C4A8212AEE612A00327529 /* MockTextToSpeechConverter.swift in Sources */,
F7C16FE82AFC576000D11529 /* ThemeManager.swift in Sources */,
F7F115072AF8173300BC191C /* PermissionTests.swift in Sources */,
6E53AF4C2AF012760022A8F2 /* GridButton.swift in Sources */,
B9C4A82F2AEE96AE00327529 /* Choices.swift in Sources */,
B3C083B32AF1BADB0069232C /* MessageContextTests.swift in Sources */,
B3C083B32AF1BADB0069232C /* OpenAITests.swift in Sources */,
B9C4A82C2AEE967200327529 /* AudioPlayer.swift in Sources */,
6E53AF512AF012A40022A8F2 /* Color.swift in Sources */,
);

This file was deleted.

64 changes: 61 additions & 3 deletions voice/voice-ai/x/Actions/GridButton.swift
@@ -7,17 +7,74 @@ struct GridButton: View {
var foregroundColor: Color
var active: Bool = false
var isPressed: Bool = false

@State private var timeAtPress = Date()
@State private var isDragActive = false

var image: String? = nil
var colorExternalManage: Bool = false
var action: () -> Void
let buttonSize: CGFloat = 100
let imageTextSpacing: CGFloat = 40
@Environment(\.verticalSizeClass) var verticalSizeClass
@Environment(\.horizontalSizeClass) var horizontalSizeClass


func onDragEnded() {
self.isDragActive = false
}

func onDragStart() {
if !self.isDragActive {
self.isDragActive = true

self.timeAtPress = Date()
}
}

var body: some View {
Button(action: {
action()
let drag = DragGesture(minimumDistance: 0)
.onChanged({ drag in
self.onDragStart()
})
.onEnded({ drag in
self.onDragEnded()
})

let hackyPinch = MagnificationGesture(minimumScaleDelta: 0.0)
.onChanged({ delta in
self.onDragEnded()
})
.onEnded({ delta in
self.onDragEnded()
})

let hackyRotation = RotationGesture(minimumAngleDelta: Angle(degrees: 0.0))
.onChanged({ delta in
self.onDragEnded()
})
.onEnded({ delta in
self.onDragEnded()
})

let hackyPress = LongPressGesture(minimumDuration: 0.0, maximumDistance: 0.0)
.onChanged({ _ in
self.onDragEnded()
})
.onEnded({ delta in
self.onDragEnded()
})

let combinedGesture = drag
.simultaneously(with: hackyPinch)
.simultaneously(with: hackyRotation)
.exclusively(before: hackyPress)

return Button(action: {
let elapsed = Date().timeIntervalSince(self.timeAtPress)

if elapsed < 3 {
action()
}
}) {
VStack(spacing: imageTextSpacing) {
Image(pressEffectButtonImage()) // button.image)
@@ -34,6 +91,7 @@
.alignmentGuide(.bottom) { _ in 0.5 }
}
.buttonStyle(PressEffectButtonStyle(theme: currentTheme, active: active, invertColors: button.action == .speak && button.pressedLabel == nil))
.simultaneousGesture(combinedGesture)
}

private func pressEffectButtonImage() -> String {
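The press-duration guard added above — record the moment the finger goes down via a zero-distance drag gesture, then suppress the tap action once the press has lasted 3 seconds or more — can be sketched in isolation like this (names such as `TimedTapButton` are illustrative, not the app's real API):

```swift
import SwiftUI

// Minimal sketch of the press-duration guard: a drag gesture with zero
// minimum distance marks when the press began, and the button's action
// only fires if the press lasted less than 3 seconds.
struct TimedTapButton: View {
    @State private var timeAtPress = Date()
    @State private var isDragActive = false
    var action: () -> Void

    var body: some View {
        Button(action: {
            // Ignore taps that conclude a long press (>= 3 s)
            if Date().timeIntervalSince(timeAtPress) < 3 {
                action()
            }
        }) {
            Text("Tap to Speak")
        }
        .simultaneousGesture(
            DragGesture(minimumDistance: 0)
                .onChanged { _ in
                    if !isDragActive {
                        isDragActive = true
                        timeAtPress = Date() // mark press start
                    }
                }
                .onEnded { _ in isDragActive = false }
        )
    }
}
```

The "hacky" pinch, rotation, and long-press gestures in the diff exist only to reset `isDragActive` when a multi-touch gesture interrupts the drag.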
21 changes: 21 additions & 0 deletions voice/voice-ai/xTests/IAPTests.swift
@@ -0,0 +1,21 @@
import XCTest

class PersistenceTests: XCTestCase {

func testIncreaseConsumablesCount() {
// Given
let initialCreditsCount = UserDefaults.standard.integer(forKey: Persistence.creditsCountKey)
let creditsAmount = 5

// When
Persistence.increaseConsumablesCount(creditsAmount: creditsAmount)

// Then
let updatedCreditsCount = UserDefaults.standard.integer(forKey: Persistence.creditsCountKey)
XCTAssertEqual(updatedCreditsCount, initialCreditsCount + creditsAmount)
}

// Add more test cases as needed

}
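The test exercises a `Persistence.increaseConsumablesCount` helper that is not shown in this diff. Inferred only from the assertions above, a minimal sketch might look like the following (the app's real implementation may differ):

```swift
import Foundation

// Hypothetical sketch of the Persistence helper that IAPTests exercises:
// read the stored credit count, add the purchased amount, write it back.
struct Persistence {
    static let creditsCountKey = "creditsCount"

    static func increaseConsumablesCount(creditsAmount: Int) {
        let current = UserDefaults.standard.integer(forKey: creditsCountKey)
        UserDefaults.standard.set(current + creditsAmount, forKey: creditsCountKey)
    }
}
```

Note that because the test reads and writes the real `UserDefaults.standard`, it mutates persistent state between runs; a `tearDown` that removes the key would keep runs independent.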

@@ -24,6 +24,59 @@ class OpenAIServiceTests: XCTestCase {
}
}

class OpenAIResponseTests: XCTestCase {

func testInit() throws {
// Given
let json = """
{
"id": "123",
"object": "response",
"created": 1635790771,
"model": "gpt-3.5-turbo",
"choices": [
{
"message": {
"role": "user",
"content": "Hi"
},
"finish_reason": "OK",
"index": 1
}
],
"usage": {
"prompt_tokens": 10,
"completion_tokens": 50,
"total_tokens": 60
}
}
"""

// When
let jsonData = Data(json.utf8)
let response = try JSONDecoder().decode(OpenAIResponse.self, from: jsonData)

// Then
XCTAssertEqual(response.id, "123")
XCTAssertEqual(response.object, "response")
XCTAssertEqual(response.created, 1635790771)
XCTAssertEqual(response.model, "gpt-3.5-turbo")

XCTAssertEqual(response.choices?.count, 1)
XCTAssertEqual(response.choices?[0].message?.role, "user")
XCTAssertEqual(response.choices?[0].message?.content, "Hi")

XCTAssertNotNil(response.usage)
XCTAssertEqual(response.usage?.prompt_tokens, 10)
XCTAssertEqual(response.usage?.completion_tokens, 50)
XCTAssertEqual(response.usage?.total_tokens, 60)
}

// Add more test cases as needed

}
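The decoding test assumes `Codable` models shaped roughly as follows — the field names mirror the JSON keys asserted above, while the app's actual definitions may carry additional fields:

```swift
import Foundation

// Rough shape of the models OpenAIResponseTests decodes; optionals match
// the optional-chaining assertions in the test. Illustrative only.
struct OpenAIResponse: Codable {
    let id: String?
    let object: String?
    let created: Int?
    let model: String?
    let choices: [Choice]?
    let usage: Usage?
}

struct Choice: Codable {
    let message: Message?
    let finish_reason: String?
    let index: Int?
}

struct Message: Codable {
    let role: String?
    let content: String?
}

struct Usage: Codable {
    let prompt_tokens: Int?
    let completion_tokens: Int?
    let total_tokens: Int?
}
```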

class MessageTests: XCTestCase {
func testInitialization() {
// Test initializing a Message instance
33 changes: 23 additions & 10 deletions voice/voice-ai/xTests/OpenAIUtilsTests.swift
@@ -33,10 +33,11 @@ final class OpenAIUtilsTests: XCTestCase {
Voice_AI.Message(role: "assistant", content: "Please adhere to the community guidelines."),
];

let limitedConversation = Voice_AI.OpenAIUtils.limitConversationContext(conversation, charactersCount: 43)
let limitedConversation = Voice_AI.OpenAIUtils.limitConversationContext(conversation, charactersCount: 52)

XCTAssertEqual(limitedConversation.count, 2, "conversation should contain 2 messages")

XCTAssertEqual(limitedConversation[0].content, "confirmed.")
XCTAssertEqual(limitedConversation[1].content, "Please adhere to the community guidelines.")
}

func testShouldReturnAllMessages() throws {
@@ -49,33 +50,45 @@
let limitedConversation = Voice_AI.OpenAIUtils.limitConversationContext(conversation, charactersCount: 100)

XCTAssertEqual(limitedConversation.count, 3, "conversation should contain all messages")

XCTAssertEqual(limitedConversation[0].content, "Welcome to the platform!")
XCTAssertEqual(limitedConversation[1].content, "Your order has been confirmed.")
XCTAssertEqual(limitedConversation[2].content, "Please adhere to the community guidelines.")
}

func testShouldFilterEmptyConversation() throws {
let emptyConversation: [Voice_AI.Message] = [];

let limitedEmpty = Voice_AI.OpenAIUtils.limitConversationContext(emptyConversation, charactersCount: 100)


XCTAssertEqual(limitedEmpty.count, 0, "an empty conversation should produce an empty result")
}

func testShouldFilterEmptyMessages() throws {
let emptyConversation: [Voice_AI.Message] = [
let conversation: [Voice_AI.Message] = [
Voice_AI.Message(role: "assistant", content: ""),
Voice_AI.Message(role: "assistant", content: "Please adhere to the community guidelines."),
Voice_AI.Message(role: "assistant", content: ""),
Voice_AI.Message(role: "assistant", content: nil),
];

let limitedEmpty = Voice_AI.OpenAIUtils.limitConversationContext(emptyConversation, charactersCount: 100)

let cleanConversation = Voice_AI.OpenAIUtils.limitConversationContext(conversation, charactersCount: 100)

XCTAssertEqual(limitedEmpty.count, 1, "conversation should contain all messages")
XCTAssertEqual(cleanConversation.count, 1)
XCTAssertEqual(cleanConversation[0].content, "Please adhere to the community guidelines.")
}



func testShouldPreserveOrderOfMessages() throws {
let conversation: [Voice_AI.Message] = [
Voice_AI.Message(role: "assistant", content: "one"),
Voice_AI.Message(role: "assistant", content: "two"),
Voice_AI.Message(role: "assistant", content: "three"),
];

let limited = Voice_AI.OpenAIUtils.limitConversationContext(conversation, charactersCount: 100)

XCTAssertEqual(limited[0].content, "one")
XCTAssertEqual(limited[1].content, "two")
XCTAssertEqual(limited[2].content, "three")
}

}
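Taken together, the assertions pin down the expected behavior of `limitConversationContext`: drop empty or nil messages, keep messages from the newest backwards while they fit the character budget, truncate the oldest kept message to the tail that still fits (which is how "Your order has been confirmed." becomes "confirmed." under a 52-character budget), and preserve chronological order. A sketch consistent with these tests, assuming a `Message` type with an optional `content` — illustrative, not necessarily the app's real code:

```swift
// Minimal stand-in for the app's Message type (assumption).
struct Message {
    let role: String?
    let content: String?
}

// Sketch of context truncation consistent with the tests above.
func limitConversationContext(_ conversation: [Message], charactersCount: Int) -> [Message] {
    var kept: [Message] = []
    var remaining = charactersCount
    for message in conversation.reversed() {
        // Drop empty or nil messages entirely
        guard let content = message.content, !content.isEmpty else { continue }
        if remaining <= 0 { break }
        if content.count <= remaining {
            kept.insert(message, at: 0) // insert at front to preserve order
            remaining -= content.count
        } else {
            // Truncate the oldest kept message to the tail that still fits
            let tail = String(content.suffix(remaining))
            kept.insert(Message(role: message.role, content: tail), at: 0)
            remaining = 0
        }
    }
    return kept
}
```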
26 changes: 13 additions & 13 deletions voice/voice-ai/xTests/TextToSpeechConverterTests.swift
@@ -41,19 +41,19 @@ class TextToSpeechConverterTests: XCTestCase {
XCTAssertTrue(textToSpeechConverter.synthesizer.isSpeaking)
}

// func testConvertTextToSpeechNoVoiceAvailable() {
// let textToSpeechConverter = TextToSpeechConverter()
//
// // Set a preferred language that you know does not have an available voice
// let unsupportedLanguage = "xx-XX" // Replace with a language code that has no voice support
//
// // Call the method to convert text to speech
// textToSpeechConverter.convertTextToSpeech(text: "Test speech")
//
// // Now, check that the synthesizer's voice is set to the default language
// let defaultLanguage = Locale.preferredLanguages.first ?? "en-US"
// XCTAssertEqual(textToSpeechConverter.synthesizer.voice?.language, defaultLanguage)
// }
func testConvertTextToSpeechNoVoiceAvailable() {
let textToSpeechConverter = TextToSpeechConverter()

// A preferred language known to have no available voice; currently unused
// because convertTextToSpeech here takes only the text to speak
let unsupportedLanguage = "xx-XX"
_ = unsupportedLanguage // silence the unused-variable warning

// Call the method to convert text to speech
textToSpeechConverter.convertTextToSpeech(text: "Test speech")

// Now, check that the synthesizer's voice is set to the default language
let defaultLanguage = Locale.preferredLanguages.first ?? "en-US"
XCTAssertEqual(textToSpeechConverter.synthesizer.voice?.language, defaultLanguage)
}



