Text-to-SQL proof of concept #5788
base: main
Conversation
Welcome!
Hello there, congrats on your first PR! We're excited to have you contributing to this project.
PR Summary
- Added "Ask AI" command to command menu
- Introduced Text-to-SQL GraphQL resolver and service
- Added langchain dependency and updated typeorm
- Integrated new feature flag IS_ASK_AI_ENABLED
- Updated core engine module to include TextToSQLModule
Resolved review threads (outdated):
packages/twenty-server/src/engine/core-modules/text-to-sql/text-to-sql.resolver.ts
packages/twenty-server/src/engine/core-modules/text-to-sql/text-to-sql.service.ts
😍 amazing! Looking forward to reviewing this first proof of concept!
That's great work! It's 100% in the right direction.
I tried it locally and the results seemed disappointing (I only got Error executing raw query for workspace ....: syntax error at or near ... for every query I tried), any idea why? I didn't dig into it / didn't even print the intermediate SQL result to investigate yet.
Resolved review threads:
packages/twenty-front/src/modules/command-menu/components/CommandMenu.tsx
packages/twenty-server/src/engine/core-modules/text-to-sql/text-to-sql.module.ts
packages/twenty-server/src/engine/core-modules/text-to-sql/text-to-sql.resolver.ts
const workspaceSchemaName =
  this.workspaceDataSourceService.getSchemaName(workspaceId);

const workspaceDataSource =
Maybe we should hook into the WorkspaceQueryRunner service, which already has similar code?
As of now WorkspaceQueryRunner forces query execution through graphql.resolve, but with a bit of refactoring maybe we could rely on SET search_path etc. and not duplicate that logic? Not 100% sure I understand every single line, but it seems similar to what you did here.
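The SET search_path idea suggested above could look roughly like this. This is a hypothetical sketch, not the actual WorkspaceQueryRunner code: quoteIdentifier and buildSearchPathStatement are assumed helper names, and the wiring comment at the end assumes a TypeORM-style query runner.

```typescript
// Hypothetical sketch: scope a raw query to a workspace schema via
// SET search_path, instead of duplicating schema-resolution logic.

// Quote the schema name so it is safe to interpolate into the statement
// (embedded double quotes are doubled, per SQL identifier quoting rules).
const quoteIdentifier = (name: string): string =>
  `"${name.replace(/"/g, '""')}"`;

const buildSearchPathStatement = (schemaName: string): string =>
  `SET search_path TO ${quoteIdentifier(schemaName)}`;

// Assumed wiring around a TypeORM query runner:
// await queryRunner.query(buildSearchPathStatement(workspaceSchemaName));
// const rows = await queryRunner.query(generatedSql);
```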
PR Summary
(updates since last review)
- Introduced 'Ask AI' feature to convert user questions into SQL queries using LLM
- Added new components and hooks for 'Ask AI' in the right drawer
- Updated command menu to include 'Ask AI' command
- Modified existing components to support new feature
- Added new environment variables and dependencies for OpenAI and Langfuse integration
Resolved review threads:
...wenty-server/src/engine/api/graphql/workspace-query-runner/workspace-query-runner.service.ts
packages/twenty-server/src/engine/core-modules/ask-ai/ask-ai.resolver.ts
packages/twenty-server/src/engine/core-modules/ask-ai/ask-ai.service.ts
...ges/twenty-front/src/modules/activities/ask-ai/right-drawer/components/RightDrawerAIChat.tsx
@@ -0,0 +1,95 @@
import { Injectable } from '@nestjs/common';

import { ChatOpenAI } from '@langchain/openai';
Do you use https://platform.openai.com/docs/guides/text-generation/json-mode and https://platform.openai.com/docs/guides/function-calling? You probably should?
I understand why we would eventually not want to use function calling, to avoid coupling too much with OpenAI. But JSON mode can only be positive?
I'll check this out if non-SQL responses from the LLM keep being an issue after iterating on the prompt!
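One practical upside of JSON mode discussed above: the model's reply can be parsed strictly instead of hoping the completion is bare SQL. A minimal sketch of the parsing side, assuming the model was asked (e.g. via response_format: { type: 'json_object' } on ChatOpenAI) to reply with a JSON object holding a sql field; parseSqlResponse and the SqlResponse shape are assumptions, not part of the PR.

```typescript
// Assumed response shape from the LLM when JSON mode is enabled.
type SqlResponse = { sql: string };

// Parse the raw JSON-mode reply and fail loudly if no SQL is present,
// instead of passing an arbitrary completion to the database.
const parseSqlResponse = (raw: string): string => {
  const parsed = JSON.parse(raw) as Partial<SqlResponse>;
  if (typeof parsed.sql !== 'string' || parsed.sql.trim() === '') {
    throw new Error('LLM response did not contain a sql field');
  }
  return parsed.sql;
};
```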
const sqlQuery = await sqlQueryGeneratorChain.invoke(
  {
    schema: await db.getTableInfo(),
As discussed via Discord, you can look into workspaceCacheVersion; the graphql and sql schemas can both be synced and use the same version / invalidation.
Great work! Some minor comments that could be addressed later, except for getRecordMetadataById, where I feel you're headed in the wrong direction (although I might not have understood it very well).
Resolved review threads (outdated):
...ges/twenty-front/src/modules/activities/ask-ai/right-drawer/components/RightDrawerAIChat.tsx
  recordMetadataById={data.getAskAI.recordMetadataById}
/>
) : (
  'Invalid SQL query.'
Resolved review threads (outdated):
packages/twenty-server/src/engine/core-modules/ask-ai/ask-ai.service.ts
packages/twenty-server/src/engine/integrations/environment/environment-variables.ts
  promptTemplateName: string,
): Promise<PromptTemplate<PromptTemplateInput<T>, any>> {
  const filePath = join(
    resolveAbsolutePath('.llm-prompt-templates'),
I understand why you've made this choice and it's a good one. But subjectively I would prefer to keep the root clean, so just put it within a /llm-prompt-template/ folder. Maybe create folders per driver (/file/file.driver.ts, /langfuse/langfuse.driver.ts) and put the templates in /llm-prompt-template/drivers/file/templates/*.txt. Thanks
I moved the prompt to a variable and deleted LLMPromptTemplateDriver for now.
It's probably not as useful as I thought to store the prompt template in a separate file or in Langfuse, because the most frequently changed and important parts of the prompt will be dynamically generated (not editable in Langfuse).
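The "prompt as a variable" approach described above could look like this: the template lives in code and the frequently-changing parts (schema, question) are interpolated at call time. The wording and the buildTextToSqlPrompt name are illustrative assumptions, not the PR's actual prompt.

```typescript
// Hypothetical prompt template kept as a plain variable in code, so the
// dynamically generated parts (schema, question) are interpolated here
// rather than edited in a separate file or in Langfuse.
const buildTextToSqlPrompt = (schema: string, question: string): string =>
  [
    'You are a PostgreSQL expert. Given the schema below, write a single',
    'SQL query that answers the question. Return only SQL.',
    '',
    'Schema:',
    schema,
    '',
    `Question: ${question}`,
  ].join('\n');
```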
Resolved review threads (outdated):
...src/engine/integrations/llm-prompt-template/interfaces/llm-prompt-template-name.interface.ts
packages/twenty-server/src/engine/core-modules/ask-ai/ask-ai.service.ts
  return recordMetadataById;
}

async query(
One challenge is pagination / how to handle a large number of results. It feels like there's a divergence between the UI and CSV/download.
Could be, for example: show the count and limit to 30 results in the view (because we enrich the data, which is costly and poorly optimized). When downloading the CSV, show all results because we don't enrich?
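The suggested UI/CSV split could be sketched as below: the view gets a total count plus a capped slice of rows (to be enriched), while CSV export keeps every raw row. The 30-row cap comes from this thread; the type and function names are assumptions.

```typescript
// Cap enriched rows shown in the UI; CSV export stays complete.
const VIEW_RESULT_LIMIT = 30;

type QueryResultPresentation<T> = {
  totalCount: number; // always reported, even when the view is truncated
  viewRows: T[]; // enriched later, hence capped at VIEW_RESULT_LIMIT
  csvRows: T[]; // full, un-enriched result set for download
};

const presentQueryResult = <T>(rows: T[]): QueryResultPresentation<T> => ({
  totalCount: rows.length,
  viewRows: rows.slice(0, VIEW_RESULT_LIMIT),
  csvRows: rows,
});
```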
Added:
No security concerns have been addressed; this is only a proof of concept and not intended to be enabled in production.
All changes are behind a feature flag called IS_ASK_AI_ENABLED.