I want to load data from a directory that contains around 5000 images (type 'png'), but it returns an error saying there are no images, when there obviously are. This code:
width = int(wb - wa)
height = int(hb - ha)
directory = '/content/drive/My Drive/Colab Notebooks/Hair/Images'
train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    directory, labels=densitat, label_mode='int',
    color_mode='rgb', batch_size=32, image_size=(width, height),
    shuffle=True, seed=1,
    validation_split=0.2, subset='training', follow_links=False)
returns:
ValueError: Expected the lengths of `labels` to match the number of files in the target directory. len(labels) is 5588 while we found 0 files in /content/drive/My Drive/Colab Notebooks/Hair/Images.
I can see the images in the folder: (screenshot: Colab view of the folder structure with the images)
Where is the problem? I need to use this function to load the data in batches, since I have a large dataset.
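One way to narrow this down (a hypothetical diagnostic, not part of the original code) is to list what Python itself sees under the directory: `image_dataset_from_directory` can report 0 files when the extensions don't match case-sensitively (e.g. `.PNG` vs `.png`) or when the files sit at an unexpected nesting level. The helper name `count_images` below is made up for illustration:

```python
from pathlib import Path
import tempfile

def count_images(directory, extensions=(".png", ".jpg", ".jpeg")):
    """Count image files under `directory`, case-insensitively, at any depth."""
    root = Path(directory)
    return sum(
        1
        for p in root.glob("**/*")
        if p.is_file() and p.suffix.lower() in extensions
    )

# Demo with a temporary directory standing in for the Drive folder:
with tempfile.TemporaryDirectory() as tmp:
    for name in ("a.png", "b.PNG", "sub/c.png"):
        path = Path(tmp) / name
        path.parent.mkdir(parents=True, exist_ok=True)
        path.touch()
    print(count_images(tmp))  # 3
```

If this count disagrees with what the Keras loader reports, the mismatch is in how the files are named or nested rather than in the `labels` list.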
I tried parent = ltifs_lib[0].parents[0].parents[0].parents[0].__str__() (where ltifs_lib[0] is the path to the first tif as given by .glob("**/*.tif") from pathlib), applying .parents[0] between 0 and 3 times. Doesn't help. Moreover, I provided a list with the labels. – Loudhailer