
use cached_hash also to generate all-zero replacement chunks

at least for large numbers of fixed-size replacement chunks,
this will be much faster, with less memory management overhead.
Thomas Waldmann 4 years ago
commit ef19d937ed
1 changed file with 2 additions and 2 deletions

+ 2 - 2
src/borg/archive.py

@@ -1662,8 +1662,8 @@ class ArchiveChecker:
             If a previously missing file chunk re-appears, the replacement chunk is replaced by the correct one.
             """
             def replacement_chunk(size):
-                data = bytes(size)
-                chunk_id = self.key.id_hash(data)
+                chunk = Chunk(None, allocation=CH_ALLOC, size=size)
+                chunk_id, data = cached_hash(chunk, self.key.id_hash)
                 cdata = self.key.encrypt(data)
                 csize = len(cdata)
                 return chunk_id, size, csize, cdata
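
The point of the change is that the id-hash of an all-zero chunk only depends on its size, so it can be computed once per size and reused instead of allocating and hashing a fresh zero buffer for every missing chunk. Below is a minimal, hypothetical sketch of that caching idea; it is not borg's actual `cached_hash` implementation, and the names `cached_zero_hash` and `_zero_cache` are made up for illustration:

```python
import hashlib

# Hypothetical sketch of the caching idea behind cached_hash for all-zero
# (CH_ALLOC) chunks: cache (chunk_id, data) keyed by size, so fixed-size
# replacement chunks do not re-allocate and re-hash the same zero bytes.
_zero_cache = {}  # size -> (chunk_id, data)

def cached_zero_hash(size, id_hash=lambda d: hashlib.sha256(d).digest()):
    """Return (chunk_id, data) for an all-zero chunk of the given size."""
    try:
        return _zero_cache[size]
    except KeyError:
        data = bytes(size)              # all-zero buffer, allocated once per size
        entry = (id_hash(data), data)   # hash computed once per size
        _zero_cache[size] = entry
        return entry

# Usage: repeated calls for the same size reuse the cached hash and buffer.
chunk_id, data = cached_zero_hash(4096)
assert cached_zero_hash(4096)[0] == chunk_id
```

Since the fixed-size chunker produces many replacement chunks of identical size, almost every call after the first becomes a dictionary lookup rather than a hash over megabytes of zeros.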