
OutOfMemoryException on Large Files #8

Closed
AlexVPerl opened this issue Mar 21, 2024 · 3 comments
AlexVPerl commented Mar 21, 2024

Hello - thank you for creating this library, it's very useful.

I'm getting an OutOfMemoryException when trying to scan large files (500 MB and bigger). Sharing the details below:

I took a look at the CopyToByteArray() method implementation, but I'm not sure what should be modified. I'm able to successfully open the same big file using File.OpenRead (returns a FileStream) and File.ReadAllBytes (returns a byte array). I wonder if CopyToByteArray() could be modified to handle large files. Is this something that could be addressed?

Thanks in advance!

System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
        at MalwareScan.AMSI.MalwareScanner.CopyToByteArray(Stream stream)
        at MalwareScan.AMSI.MalwareScanner.HasVirus(Stream stream, String filename)
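
For reference, here is a minimal sketch (illustration only, not the library's actual code) of the kind of copy routine that grows its result array on every read; for a 500 MB stream this allocates thousands of ever-larger temporary arrays and is a classic way to end up with an OutOfMemoryException even though the file itself fits in memory:

    using System;
    using System.IO;

    static class NaiveCopyIllustration
    {
        // Illustration only: the result array is reallocated and fully
        // copied on every read, so a 500 MB stream produces thousands of
        // ever-larger temporary arrays on the large object heap.
        public static byte[] CopyToByteArrayNaive(Stream stream)
        {
            var result = new byte[0];
            var buffer = new byte[4096];
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                var grown = new byte[result.Length + read];
                Buffer.BlockCopy(result, 0, grown, 0, result.Length);
                Buffer.BlockCopy(buffer, 0, grown, result.Length, read);
                result = grown;
            }
            return result;
        }
    }
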
flytzen self-assigned this Jul 29, 2024

flytzen commented Jul 29, 2024

Apologies @AlexVPerl, I missed this.
Yes, looking at that code, it is horribly inefficient and will do way too many memory allocations. It would be much easier with more modern .NET versions.
I am no longer actively maintaining this and I don't have the relevant frameworks on my machine.
The code change should be as simple as doing this: https://stackoverflow.com/a/33611922/11534

(Ironically, the code currently in the library is from a different answer on that same SO question :)).
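
A minimal sketch of the CopyTo-into-MemoryStream approach that the linked answer appears to describe (the extension-method shape is an assumption; only the CopyToByteArray name comes from the stack trace above):

    using System.IO;

    static class StreamExtensions
    {
        // MemoryStream grows its internal buffer geometrically, so the
        // copy avoids reallocating the whole result on every read.
        public static byte[] CopyToByteArray(this Stream stream)
        {
            using (var memoryStream = new MemoryStream())
            {
                stream.CopyTo(memoryStream);
                return memoryStream.ToArray();
            }
        }
    }

Note that ToArray() still produces one final copy, so peak memory is roughly twice the stream length rather than the many-fold overhead of per-read reallocation.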

flytzen commented Jul 30, 2024

Hi @AlexVPerl. I have fixed this in v1.2... The memory usage for a 500 MB file is now about a third of what it was.
Happy scanning.
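
A caller-side sketch of scanning a large file after upgrading; only the HasVirus(Stream, string) signature comes from the stack trace in this issue, while the scanner construction, the bool return type, and the file path are assumptions:

    using System;
    using System.IO;
    using MalwareScan.AMSI;

    class ScanLargeFile
    {
        static void Main()
        {
            var path = @"C:\temp\large-download.iso";  // hypothetical 500 MB+ file
            var scanner = new MalwareScanner();        // assumed constructor

            // Pass a FileStream instead of File.ReadAllBytes so the caller
            // never holds its own full copy of the file contents.
            using (FileStream stream = File.OpenRead(path))
            {
                bool infected = scanner.HasVirus(stream, Path.GetFileName(path));
                Console.WriteLine(infected ? "Threat detected" : "Clean");
            }
        }
    }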

flytzen closed this as completed Jul 30, 2024
AlexVPerl (Author) commented

@flytzen thank you so much for your response and for adding the fix! Will take a look and try it out. Cheers!
