Is it possible to run this fully locally, so sensitive PII PDFs don't leave the network? #57

Open
AIMads opened this issue Apr 24, 2024 · 1 comment

Comments


AIMads commented Apr 24, 2024

Hey, I work with PDFs containing PII data and would love to use this tool for handling them, but is it possible to run it without the PDF data leaving the network?
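For reference, one way to keep everything on the local network is to run the open-source nlm-ingestor parser yourself and point llmsherpa's LayoutPDFReader at it instead of the hosted API. A minimal sketch, assuming the nlm-ingestor Docker image is available and llmsherpa is installed (the image tag, host port, and file path below are assumptions; check the nlm-ingestor README for the exact values):

```python
# Start the parser locally first, e.g. (assumed image/tag and port mapping):
#   docker run -p 5001:5001 ghcr.io/nlmatics/nlm-ingestor:latest
from llmsherpa.readers import LayoutPDFReader

# Point llmsherpa at the locally running nlm-ingestor container rather than
# the hosted llmsherpa endpoint, so PDF bytes never leave the network.
llmsherpa_api_url = "http://localhost:5001/api/parseDocument?renderFormat=all"

pdf_reader = LayoutPDFReader(llmsherpa_api_url)
doc = pdf_reader.read_pdf("path/to/sensitive.pdf")  # hypothetical local file

# The parsed layout tree is returned by the local server; downstream chunking
# also happens in-process.
for chunk in doc.chunks():
    print(chunk.to_context_text())
```

With this setup the only network traffic is between the application and the local container, which addresses the PII concern without changing llmsherpa itself.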

@pratiksinghchauhan

@AIMads Do you mean removing the Flask server in between, merging the nlm-ingestor and llmsherpa, and integrating it directly into the application without any additional server? I am thinking along similar lines and feel it is possible here.
