Difficulty comparing generated and google cloud storage provided CRC32c checksums

I am attempting to compute a CRC32c checksum of a local file so I can compare it to the blob.crc32c value provided by the gcloud library. Google's documentation says to use the crcmod module to actually calculate CRC32c hashes of my data.

modifiedFile.txt has already been downloaded from a Google Cloud Storage bucket onto my local filesystem.

The goal is to set should_download to True only if modifiedFile.txt has a different CRC32c locally than on the remote server. How do I get the two sides to produce matching CRC32c values when my local file and my gcloud Blob have the same content?

from crcmod import PredefinedCrc
from gcloud import storage

# blob is a gcloud Blob object

should_download = True

with open('modifiedFile.txt') as f:
  hasher = PredefinedCrc('crc-32c')
  hasher.update(f.read())
  crc32c = hasher.digest()
  print crc32c # \207\245.\240
  print blob.crc32c # CJKo0A==
  should_download = crc32c != blob.crc32c

Unfortunately, the comparison currently always fails, because I don't know how to compare the checksum I build with crcmod against the attribute I see on the matching Blob object.

Evert asked 21/5, 2016 at 20:38

Here are example md5 and crc32c hashes for the gsutil public tarball:

$ gsutil ls -L gs://pub/gsutil.tar.gz | grep Hash
    Hash (crc32c):      vHI6Bw==
    Hash (md5):     ph7W3cCoEgMQWvA45Z9y9Q==

I'll copy it locally to work with:

$ gsutil cp gs://pub/gsutil.tar.gz /tmp/
Copying gs://pub/gsutil.tar.gz...
Downloading file:///tmp/gsutil.tar.gz:                           2.59 MiB/2.59 MiB    

CRC values are usually displayed as unsigned 32-bit integers. To convert the base64-encoded value reported by gsutil into that form:

>>> import base64
>>> import struct
>>> struct.unpack('>I', base64.b64decode('vHI6Bw=='))
(3161602567,)

To obtain the same from the crcmod library:

>>> file_bytes = open('/tmp/gsutil.tar.gz', 'rb').read()
>>> import crcmod
>>> crc32c = crcmod.predefined.Crc('crc-32c')
>>> crc32c.update(file_bytes)
>>> crc32c.crcValue
3161602567L

If you want to convert the value from crcmod to the same base64 format used by gcloud/gsutil:

>>> base64.b64encode(crc32c.digest()).decode('utf-8')
'vHI6Bw=='
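
Putting those pieces together, here is a minimal sketch of the comparison the question is after (assuming blob is a Blob whose metadata has already been loaded, and reading the local file in binary mode):

import base64
import crcmod

with open('modifiedFile.txt', 'rb') as f:
    hasher = crcmod.predefined.Crc('crc-32c')
    hasher.update(f.read())

# Encode the local digest the same way Cloud Storage encodes blob.crc32c
# (base64 of the big-endian digest bytes) so the two strings are comparable.
local_crc32c = base64.b64encode(hasher.digest()).decode('utf-8')
should_download = local_crc32c != blob.crc32c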
Disprove answered 23/5, 2016 at 15:41

In 2022 I still had trouble finding a definitive answer. Here's what I came up with; it seems to work well even with large files.

import base64
import collections

import google_crc32c


def generate_file_crc32c(path, blocksize=2**20):
    """
    Generate a base64-encoded CRC32c checksum for a file, for comparison
    with the value Google Cloud Storage reports.

    Returns a string like "4jvPnQ==".

    Compare with a google storage blob instance:
      blob.crc32c == generate_file_crc32c("path/to/local/file.txt")
    """
    crc = google_crc32c.Checksum()
    with open(path, "rb") as read_stream:
        # Checksum.consume() is a generator that updates the checksum one
        # chunk at a time; draining it into a zero-length deque avoids
        # holding more than `blocksize` bytes of the file in memory.
        collections.deque(crc.consume(read_stream, blocksize), maxlen=0)
    return base64.b64encode(crc.digest()).decode("utf-8")
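
A minimal usage sketch, assuming blob is a google.cloud.storage.Blob whose metadata has already been loaded (for example via bucket.get_blob()), mirrors the comparison from the question:

should_download = blob.crc32c != generate_file_crc32c("modifiedFile.txt")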


Kuska answered 6/11, 2022 at 0:51

From the linked documentation: "CRC32c checksum, as described in RFC 4960, Appendix B; encoded using base64 in big-endian byte order"

It looks like you are not decoding the base64 string.

If you are on a Windows machine, you would need to open the text file in binary mode.
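
In other words, a sketch of the fix against the question's code (with the file opened in binary mode) decodes the stored value before comparing raw digest bytes:

import base64
should_download = hasher.digest() != base64.b64decode(blob.crc32c)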

Unbuckle answered 22/5, 2016 at 0:59