-
Hmm, I don't have a large data file to test this with. If it is possible to share yours, you can email me a zip of the project folder and I can test it to see what could be done (if anything). If it is a very large text file, can you edit it, i.e. clean it with other software, before importing it into QualCoder?
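In case it helps, here is a minimal pre-cleaning sketch, assuming the trouble comes from stray control characters or long runs of blank lines; the file names are placeholders, and you would adjust the patterns to your own data:

```python
import re
from pathlib import Path

# Placeholder file names; point these at your own text file.
src = Path("interview_raw.txt")
dst = Path("interview_clean.txt")

text = src.read_text(encoding="utf-8", errors="replace")
# Drop control characters except tab, newline, and carriage return.
text = re.sub(r"[\x00-\x08\x0b\x0c\x0e-\x1f]", "", text)
# Collapse runs of three or more newlines into a single blank line.
text = re.sub(r"\n{3,}", "\n\n", text)
dst.write_text(text, encoding="utf-8")
```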
-
OK, I made one file update in the latest code (3.6) which might help a little. The main obstacle to faster editing is this: whenever a character is changed (added or deleted), QualCoder uses the difflib module to work out what changed, and difflib is slow for large files.
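For anyone curious, here is a minimal sketch of why a per-keystroke difflib comparison gets expensive; these are not QualCoder's actual call sites, just an assumption about the shape of the problem:

```python
import difflib
import time

# Simulate one keystroke in a large document: the old and new texts
# differ by a single inserted character.
old_text = "The quick brown fox jumps over the lazy dog. " * 200
new_text = old_text[:100] + "X" + old_text[100:]

start = time.perf_counter()
matcher = difflib.SequenceMatcher(None, old_text, new_text, autojunk=False)
opcodes = matcher.get_opcodes()  # walks both full strings to find changes
elapsed = time.perf_counter() - start

print(f"difflib took {elapsed:.3f}s for {len(old_text)} characters")
# Increase the multiplier above and the time grows sharply, which is
# what made per-keystroke editing feel slow on large files.
```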
-
OK, I have updated the code to use the diff-match-patch module instead of the difflib module. This makes editing text roughly 20x faster, with no visible slowness.
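A minimal sketch of the new approach, assuming the PyPI package diff-match-patch (`pip install diff-match-patch`); the text sizes here are made up for illustration:

```python
import time
from diff_match_patch import diff_match_patch

# A large document with a single inserted character, as after one keystroke.
old_text = "The quick brown fox jumps over the lazy dog. " * 20000
new_text = old_text[:100] + "X" + old_text[100:]

dmp = diff_match_patch()
start = time.perf_counter()
diffs = dmp.diff_main(old_text, new_text)  # trims common prefix/suffix first
elapsed = time.perf_counter() - start

dmp.diff_cleanupSemantic(diffs)  # merge trivial fragments for readability
for op, segment in diffs:
    if op != dmp.DIFF_EQUAL:
        label = "insert" if op == dmp.DIFF_INSERT else "delete"
        print(label, repr(segment))

print(f"diff-match-patch took {elapsed:.3f}s for {len(old_text)} characters")
```

Because diff_main strips the common prefix and suffix before diffing, a single-character edit in a very large text reduces to a tiny comparison, which is where the speedup comes from.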
-
Hello,
I've been encountering an issue where large files repeatedly cause QualCoder to crash during data cleaning. These files seem to exceed the program's capacity to manage memory effectively, especially with extensive datasets and multiple coding processes.
I’ve tried reducing the file size, but this is not feasible for my workflow. Is there a workaround for handling larger datasets or a solution that could help improve memory management?
Any assistance or guidance would be greatly appreciated.
Thanks!