
use cached_hash also to generate all-zero replacement chunks

At least for large numbers of fixed-size replacement chunks,
this will be much faster. Also less memory management overhead.
Thomas Waldmann, 4 years ago · commit ef19d937ed
1 changed file with 2 additions and 2 deletions

src/borg/archive.py

@@ -1662,8 +1662,8 @@ class ArchiveChecker:
             If a previously missing file chunk re-appears, the replacement chunk is replaced by the correct one.
             """
             def replacement_chunk(size):
-                data = bytes(size)
-                chunk_id = self.key.id_hash(data)
+                chunk = Chunk(None, allocation=CH_ALLOC, size=size)
+                chunk_id, data = cached_hash(chunk, self.key.id_hash)
                 cdata = self.key.encrypt(data)
                 csize = len(cdata)
                 return chunk_id, size, csize, cdata
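
The idea behind the change can be sketched outside of Borg: an all-zero ("allocated") replacement chunk is fully determined by its size, so its buffer and hash can be memoized per size instead of allocating and hashing a fresh `bytes(size)` on every call. A minimal illustration, using `functools.lru_cache` and SHA-256 as stand-ins for Borg's `cached_hash` and `self.key.id_hash` (the real implementation caches on the chunk's allocation type and size):

```python
import hashlib
from functools import lru_cache

@lru_cache(maxsize=None)
def zero_chunk(size):
    """Return (chunk_id, data) for an all-zero chunk of the given size.

    Memoized: repeated requests for the same size reuse the same
    buffer and hash instead of re-allocating and re-hashing.
    """
    data = bytes(size)  # all-zero buffer of the requested size
    chunk_id = hashlib.sha256(data).digest()  # stand-in for key.id_hash
    return chunk_id, data

def replacement_chunk(size):
    # Same shape as the patched method, minus encryption:
    # the expensive hash is now computed once per distinct size.
    chunk_id, data = zero_chunk(size)
    return chunk_id, size, data
```

Since repair typically emits many replacement chunks of the same (fixed) chunk size, almost every call after the first becomes a cache hit, which is where the speedup in the commit message comes from.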