
Local Setup Calling function isn't Working #27

Open
Sr1v47s4n opened this issue Sep 23, 2024 · 11 comments
@Sr1v47s4n

```
task_manager = TaskManager(self.agent_config.get("agent_name", self.agent_config.get("assistant_name")),
bolna-app-1   |   File "/app/bolna/agent_manager/task_manager.py", line 58, in __init__
bolna-app-1   |     if task['tools_config']["llm_agent"] and task['tools_config']["llm_agent"]['llm_config'].get('assistant_id', None) is not None:
bolna-app-1   | KeyError: 'llm_config'
bolna-app-1   | 2024-09-23 08:41:21.133 ERROR {quickstart_server} [websocket_endpoint] error in executing 'llm_config'
```

While setting up the local server, the Bolna app throws the above error when we pass the following config: `"llm_agent": { "agent_flow_type": "streaming", "provider": "openai", "request_json": true, "model": "gpt-3.5-turbo-16k", "use_fallback": true }`. It would be really helpful if you could assist me with the setup.
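The `KeyError` above comes from indexing `llm_config` directly on a config that doesn't contain that key. A minimal sketch of a defensive lookup (function and variable names here are illustrative, not Bolna's actual code):

```python
def has_assistant_id(task: dict) -> bool:
    """Return True only if the task config carries an assistant_id."""
    llm_agent = task.get("tools_config", {}).get("llm_agent") or {}
    llm_config = llm_agent.get("llm_config") or {}
    return llm_config.get("assistant_id") is not None

# The config from this issue has no 'llm_config' key at all:
task = {"tools_config": {"llm_agent": {
    "agent_flow_type": "streaming", "provider": "openai",
    "request_json": True, "model": "gpt-3.5-turbo-16k", "use_fallback": True,
}}}
print(has_assistant_id(task))  # False, instead of raising KeyError
```

Chained `.get()` calls with `or {}` fallbacks keep the check from raising when any level of the nested config is missing or None.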

@AlexM4H
Contributor

AlexM4H commented Sep 28, 2024

What could be the reason? Unfortunately, I did not find any hint in the README file.

@prateeksachan
Member

@Sr1v47s4n, @AlexM4H: this is now fixed in the latest release.

@Sr1v47s4n
Author

Thanks for getting back, @AlexM4H. Sorry I couldn't respond earlier; I missed the notification. The local setup is working now, thanks!

@Sr1v47s4n Sr1v47s4n reopened this Oct 1, 2024
@Sr1v47s4n
Author

```
bolna-app-1   | 2024-10-01 03:29:47.547 INFO {quickstart_server} [create_agent] Data for DB {'agent_name': 'Alfred', 'agent_type': 'other', 'tasks': [{'tools_config': {'llm_agent': {'agent_flow_type': 'streaming', 'agent_type': 'simple_llm_agent', 'routes': None, 'llm_config': {'model': 'gpt-4o', 'max_tokens': 100, 'family': 'openai', 'temperature': 0.1, 'request_json': True, 'stop': None, 'top_k': 0, 'top_p': 0.9, 'min_p': 0.1, 'frequency_penalty': 0.0, 'presence_penalty': 0.0, 'provider': 'openai', 'base_url': None, 'routes': None, 'agent_flow_type': 'streaming', 'extraction_details': None, 'summarization_details': None}}, 'synthesizer': {'provider': 'elevenlabs', 'provider_config': {'voice': 'Daniel', 'voice_id': 'onwK4e9ZLuTAKqWW03F9', 'model': 'eleven_turbo_v2_5', 'temperature': 0.5, 'similarity_boost': 0.5}, 'stream': True, 'buffer_size': 100, 'audio_format': 'wav', 'caching': True}, 'transcriber': {'model': 'nova-2', 'language': 'en', 'stream': True, 'sampling_rate': 16000, 'encoding': 'linear16', 'endpointing': 400, 'keywords': None, 'task': 'transcribe', 'provider': 'deepgram'}, 'input': {'provider': 'twilio', 'format': 'pcm'}, 'output': {'provider': 'twilio', 'format': 'pcm'}, 'api_tools': None}, 'toolchain': {'execution': 'parallel', 'pipelines': [['transcriber', 'llm', 'synthesizer']]}, 'task_type': 'conversation', 'task_config': {'optimize_latency': True, 'hangup_after_silence': 30, 'incremental_delay': 100, 'number_of_words_for_interruption': 1, 'interruption_backoff_period': 100, 'hangup_after_LLMCall': False, 'call_cancellation_prompt': None, 'backchanneling': False, 'backchanneling_message_gap': 5, 'backchanneling_start_delay': 5, 'ambient_noise': False, 'ambient_noise_track': 'convention_hall', 'call_terminate': 90, 'use_fillers': False, 'trigger_user_online_message_after': 6, 'check_user_online_message': 'Hey, are you still there', 'check_if_user_online': True}}], 'agent_welcome_message': 'How are you doing Bruce?', 'assistant_status': 'seeding'}
bolna-app-1   | 2024-10-01 03:29:47.547 INFO {quickstart_server} [create_agent] Setting up follow up tasks
bolna-app-1   | 2024-10-01 03:29:47.548 INFO {utils} [store_file] Writing to agent_data/082f847a-d9ca-4aa9-8382-0d0c611ce80c/conversation_details.json 
bolna-app-1   | Agent type: simple_llm_agent
bolna-app-1   | Value type: <class 'dict'>
bolna-app-1   | Value: {'agent_flow_type': 'streaming', 'provider': 'openai', 'request_json': True, 'model': 'gpt-4o'}
bolna-app-1   | value deepgram, PROVIDERS ['deepgram', 'whisper', 'bodhi']
bolna-app-1   | INFO:     152.58.221.200:62395 - "POST /agent HTTP/1.1" 200 OK
twilio-app-1  | telephony_host: https://7b71-13-200-100-116.ngrok-free.app
twilio-app-1  | bolna_host: None
twilio-app-1  | INFO:     152.58.221.200:62401 - "POST /call HTTP/1.1" 200 OK
```

While I was trying to call the agent with the default setup using Twilio, I received the call from my Twilio number, but the code wasn't executed: I was asked to press any key to continue, and when I pressed it the call ended.

@llvee

llvee commented Oct 1, 2024

Is your team looking for more help resolving issues like this?

@prateeksachan
Member

prateeksachan commented Oct 1, 2024

@Sr1v47s4n your bolna_host is None. Have you set up ngrok correctly? Can you verify that both bolna_host and telephony_host are being tunneled via ngrok?

@Sr1v47s4n
Author

> Is your team looking for more help resolving issues like this?

Yes

@Sr1v47s4n
Author

> @Sr1v47s4n your bolna_host is None. Have you set up ngrok correctly? Can you verify that both bolna_host and telephony_host are being tunneled via ngrok?

Yes, I have set up ngrok correctly.

@prateeksachan
Member

@Sr1v47s4n can you share a screenshot of the http://localhost:4040/status page? It seems tunneling is not happening for the bolna_app, which is why its URL is coming up as None.

@Sr1v47s4n
Author

> @Sr1v47s4n can you share a screenshot of the http://localhost:4040/status page? It seems tunneling is not happening for the bolna_app, which is why its URL is coming up as None.

[screenshot of the ngrok status page]

@prateeksachan
Member

The telephony_host URL doesn't match your ngrok twilio-app URL. Can you give it another try and make sure the URL matches your ngrok-tunneled URL? Please also ensure that the latest Docker image is being used.
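One way to verify what ngrok is actually exposing is its local inspection API at http://localhost:4040/api/tunnels, which returns the same tunnel list as the status page as JSON. A minimal sketch of such a check (the local port in the sample payload is an assumption, not a value from this issue):

```python
import json
from urllib.request import urlopen

def tunneled_urls(payload: dict) -> dict:
    """Map each tunnel's local address to its public ngrok URL."""
    return {t["config"]["addr"]: t["public_url"] for t in payload["tunnels"]}

def live_check():
    # Requires ngrok to be running locally; same data as the 4040 status page.
    with urlopen("http://localhost:4040/api/tunnels") as resp:
        for addr, url in tunneled_urls(json.load(resp)).items():
            print(f"{addr} -> {url}")

# Shape of the API response, with the telephony URL from the log above:
sample = {"tunnels": [
    {"name": "twilio-app",
     "public_url": "https://7b71-13-200-100-116.ngrok-free.app",
     "config": {"addr": "http://localhost:8001"}},
]}
print(tunneled_urls(sample))
```

If only one tunnel shows up in the list, the missing app's host ends up unset, which matches the `bolna_host: None` line in the logs above.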
