Colab offers free TPUs. It's easy to see how many cores you get, but I was wondering if it's possible to see how much memory is available per core?
Also, in Google Colab, is there a way to check which TPU version is running?
As far as I know there isn't a TensorFlow op (or similar) for accessing TPU memory info, though XRT does expose it. In the meantime, would something like the following snippet work?
import os
from tensorflow.python.profiler import profiler_client

# The Colab TPU address uses port 8470; the profiler service listens on 8466.
tpu_profile_service_address = os.environ['COLAB_TPU_ADDR'].replace('8470', '8466')
# monitor(service_addr, duration_ms, monitoring_level) returns a text report.
print(profiler_client.monitor(tpu_profile_service_address, 100, 2))
Output looks like:
Timestamp: 22:23:03
TPU type: TPU v2
Utilization of TPU Matrix Units (higher is better): 0.000%
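To answer the version question directly, the "TPU type" line of that report is what you want. A minimal sketch (reusing tpu_profile_service_address from above, and assuming the report keeps the line format shown in the sample):
# Pull the "TPU type: ..." line out of the monitor report.
# Assumes the report format shown above; inspect the full text if it differs.
report = profiler_client.monitor(tpu_profile_service_address, 100, 2)
tpu_type = next(line for line in report.splitlines() if line.startswith('TPU type'))
print(tpu_type)  # e.g. "TPU type: TPU v2"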
TPU v2 has 8 GB of HBM per core and TPU v3 has 16 GB per core (https://cloud.google.com/tpu).
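If you'd rather confirm the per-core HBM at runtime instead of relying on the spec sheet, recent JAX versions expose per-device memory statistics. A hedged sketch (memory_stats() may return None on some runtimes, and the exact keys such as 'bytes_limit' are backend-dependent):
import jax

# Query memory statistics for the first TPU core; keys like 'bytes_limit'
# and 'bytes_in_use' are an assumption about the backend's reporting.
dev = jax.devices()[0]
stats = dev.memory_stats()
if stats is not None:
    print(dev.device_kind, stats.get('bytes_limit'), 'bytes of HBM on this core')
else:
    print('memory_stats() not supported on this runtime')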
How do you check the number of available TPU cores? – Secondclass
If you're using JAX, you can use jax.devices() to get the number of TPU cores (or devices, more generally). – Melodrama
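For completeness, a short sketch of the JAX route from the comment above; device_kind also answers the version question:
import jax

devices = jax.devices()            # one entry per TPU core / device
print(len(devices), 'TPU cores')   # equivalent to jax.device_count()
print(devices[0].device_kind)      # e.g. 'TPU v2'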