I have 200+ GB of chat history that I haven't deleted in ten years, and I want to export all of it. The export started out fast but later slowed way down, even though everything is on an SSD. The bottleneck seems to be converting images to base64: after more than ten days it has only produced about 120 GB, just a few GB per day. Is there a faster way?
It really is that slow. I'm not sure whether your 200 GB refers to the Msg3.0.db file alone or includes the Image directory as well. My db file is 12 GB and the image directory is about 210 GB; the exported MHT came out at 264 GB (which doesn't quite match the pure base64 expansion ratio, so the image directory probably contains a lot of invalid images). On a Celeron G3930T machine the export took about 13 days, and judging by the load pattern it is single-threaded and dominated by base64 encoding.
For faster, more advanced extraction, I'd suggest parsing Msg3.0.db directly; see #5 (comment).
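The "pure base64 expansion ratio" mentioned above is easy to check: base64 turns every 3 input bytes into 4 output characters, so a lossless encode inflates data by roughly 4/3. A minimal sketch (the GB figures in the comments are used only for illustration):

```python
import base64

# base64 maps every 3 input bytes to 4 output characters,
# so encoding inflates data by a factor of 4/3 (~33%).
raw = bytes(300)                      # 300 bytes of sample data
encoded = base64.b64encode(raw)       # -> 400 characters
ratio = len(encoded) / len(raw)
print(ratio)                          # 1.333...

# Applied to the numbers above: ~210 GB of images would encode to
# roughly 210 * 4/3 ≈ 280 GB of base64 text before adding the db
# contents, so a 264 GB MHT is consistent with some images in the
# directory being missing or unreadable.
print(210 * 4 / 3)                    # 280.0
```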
Thanks for the answer. My 200+ GB is the total of all files. I'll look into it.