Google Colab can't access drive content
Even though I mounted my Google Drive (with my dataset in it) in Google Colab, when I run my code I get this error: FileNotFoundError: [Errno 2] No such file or directory: 'content/drive/My Drive/....

I can see and browse my Drive files from Colab, but running the following code still produces the error:

from keras.models import Sequential
from keras.layers import Convolution2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten
from keras.layers import Dense
model=Sequential()
model.add(Convolution2D(32,3,3,input_shape=(64,64,3),activation='relu'))
model.add(MaxPooling2D(pool_size=(2,2)))
model.add(Convolution2D(32,3,3,activation='relu'))
model.add(MaxPooling2D(pool_size=(2,2)))
model.add(Flatten())
model.add(Dense(output_dim=128,activation='relu'))
model.add(Dense(output_dim=1,activation='sigmoid'))
model.compile(optimizer='adam',loss='binary_crossentropy',metrics=['accuracy'])

from keras.preprocessing.image import ImageDataGenerator
train_datagen=ImageDataGenerator(
    rescale=1./255,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True)
test_datagen=ImageDataGenerator(rescale=1./255)

training_set=train_datagen.flow_from_directory(
    directory='content/drive/My Drive/Convolutional_Neural_Networks/dataset/training_set',
    target_size=(64,64),
    batch_size=32,
    class_mode='binary')
test_set=test_datagen.flow_from_directory(
    directory='content/drive/My Drive/Convolutional_Neural_Networks/dataset/test_set',
    target_size=(64,64),
    batch_size=32,
    class_mode='binary')

#train
model.fit_generator(
    training_set,
    samples_per_epoch=8000,
    nb_epoch=2,
    validation_data=test_set,
    nb_val_samples=1000)

import numpy as np
from keras.preprocessing import image
test_image=image.load_img('sunn.jpg',target_size=(64,64))
test_image=image.img_to_array(test_image)
test_image=np.expand_dims(test_image,axis=0)
result=model.predict(test_image)
training_set.class_indices
if result[0][0] >= 0.5:
    prediction='dog'
else:
    prediction='cat'
print(prediction)
Millicent answered 19/1, 2019 at 16:24
After mounting, move into the dataset folder.

cd content/drive/My Drive/Convolutional_Neural_Networks/dataset/

Don't use the !. Then set your directory as ./training_set
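An alternative that sidesteps the shell-magic question entirely is `os.chdir`, which changes the working directory from plain Python in any cell. A minimal sketch, using `/tmp` as a stand-in for the Drive folder:

```python
import os

# Changing the working directory from Python works in any cell,
# with no need for the ! or % shell magics.
# '/tmp' is a stand-in for '/content/drive/My Drive/...' on Colab.
os.chdir('/tmp')
print(os.getcwd())
```

After this, relative paths like `./training_set` are resolved from the new directory.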

Mahmud answered 9/6, 2019 at 9:4
Can you explain why this works? I thought ! was necessary in order to do bash commands like cd? – Incised
I was having the same trouble, so I tried removing the !, which worked. If you have noticed, you can put only one bash command in a cell. I believe that is how Colab's Jupyter notebook works. – Mahmud
Yes, you can use one bash command per cell without using !. – Mahmud
I think you are missing a leading / in your /content/drive... path.

It's typical to mount your Drive files via

from google.colab import drive
drive.mount('/content/drive')

https://colab.research.google.com/notebooks/io.ipynb#scrollTo=u22w3BFiOveA
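The leading slash matters because a path without it is resolved relative to the current working directory, not the filesystem root. A minimal illustration:

```python
import os

# Without a leading '/', the path is relative to the current working
# directory (usually /content in Colab); with it, absolute from the root.
print(os.path.isabs('content/drive/My Drive/dataset'))   # False
print(os.path.isabs('/content/drive/My Drive/dataset'))  # True
```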

Transilient answered 19/1, 2019 at 16:55
I have been trying this, and for those curious: it was not possible for me to use flow_from_directory with a folder inside Google Drive. The Colab file environment did not read the path and gave a "Folder does not exist" error. Similar questions have been posted on Stack Overflow (e.g. "Google collaborative" and "Deep learning on Google Colab: loading large image dataset is very long, how to accelerate the process?") with no effective solution and, for some reason, many downvotes for those who ask.

The only solution I found for reading 20k images in Google Colab was uploading them and then processing them, wasting two sad hours to do so. It makes sense: Google identifies things inside Drive by ids, while flow_from_directory requires both the dataset and the classes to be identified by absolute folder paths, which is not compatible with Drive's identification method. An alternative might be paying for a Google Cloud environment instead, I suppose; we are getting quite a lot for free as it is. This is my novice understanding of the situation, please correct me if I am wrong.

edit1: I was able to use flow_from_directory on Google Colab after all; Google does identify things by path too. The catch is that os.getcwd() does not behave as expected: it reports the current working directory as "/content", when in truth it is "/content/drive/My Drive/foldersinsideyourdrive/...../folderthathasyourcolabnotebook/". If you change the path in the train generator so that it includes this prefix, and ignore os.getcwd(), it works. I still had problems with RAM even when using flow_from_directory, and could not train my CNN anyway, but that might be something that just happens to me.
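One way to avoid any confusion with the working directory is to build every path from the absolute mount point with `os.path.join` instead of relying on `os.getcwd()`. A sketch, assuming the folder layout from the question:

```python
import os

# Build paths from the absolute mount point instead of trusting os.getcwd().
DRIVE_ROOT = '/content/drive/My Drive'  # Colab's default Drive mount point
train_dir = os.path.join(DRIVE_ROOT, 'Convolutional_Neural_Networks',
                         'dataset', 'training_set')
print(train_dir)
# /content/drive/My Drive/Convolutional_Neural_Networks/dataset/training_set
```

A path built this way can be passed to flow_from_directory regardless of which directory the notebook cell happens to be running in.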

Thin answered 14/4, 2019 at 12:20
I had something similar. I can read a csv file in that Colab location using pandas, so I know the file can be read. However, the load_dataset function does not work on exactly the same file location, so I read it with pandas and then use 'dataset = Dataset.from_pandas(df)'. – Tarttan
from google.colab import drive
drive.mount('/content/drive')

Using the code above you can mount your Drive in Colab. When loading images, use:

directory='drive/My Drive/Convolutional_Neural_Networks/dataset/test_set',

not this:

directory='content/drive/My Drive/Convolutional_Neural_Networks/dataset/test_set',

For the Keras ImageDataGenerator, the dataset structure needs one subfolder per class:

dataset/
    training_set/
        cats/
        dogs/
    test_set/
        cats/
        dogs/

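flow_from_directory infers the class labels from subfolder names, so each class needs its own folder. A minimal sketch that builds such a tree in a temporary directory (a stand-in for the Drive folder):

```python
import os
import tempfile

# Each class gets its own subfolder; flow_from_directory uses the folder
# names as labels. Build a tiny stand-in tree to illustrate the layout.
root = tempfile.mkdtemp()
for split in ('training_set', 'test_set'):
    for cls in ('cats', 'dogs'):
        os.makedirs(os.path.join(root, 'dataset', split, cls))

# The classes a generator would discover for the training split:
print(sorted(os.listdir(os.path.join(root, 'dataset', 'training_set'))))
# ['cats', 'dogs']
```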
Lurleen answered 6/10, 2019 at 17:8
Actually directory='/content/drive/MyDrive/Convolutional_Neural_Networks/dataset/test_set' works well. – Suannesuarez
So, I started with Colab's default mounting commands

from google.colab import drive
drive.mount('/gdrive', force_remount=True)

The main changes I made were here:

img_width, img_height = 64, 64
train_data_dir = '/gdrive/My Drive/Colab Notebooks/dataset/training_set'
validation_data_dir = '/gdrive/My Drive/Colab Notebooks/dataset/test_set'

from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(
    rescale=1./255,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True)

test_datagen = ImageDataGenerator(rescale=1./255)

train_generator = train_datagen.flow_from_directory(
    train_data_dir,
    target_size=(64, 64),
    batch_size=32,
    class_mode='binary')

validation_generator = test_datagen.flow_from_directory(
    validation_data_dir,
    target_size=(64, 64),
    batch_size=32,
    class_mode='binary')

classifier.fit_generator(
    train_generator,
    steps_per_epoch=8000,  # number of batches per epoch, not images
    epochs=25,
    validation_data=validation_generator,
    validation_steps=2000)

This worked for me, and I hope it helps someone.

Eirene answered 12/11, 2019 at 20:44
After mounting at /content/drive:

from google.colab import drive
drive.mount('/content/drive')

# Change working directory to the folder created previously
cd '/content/drive/My Drive/PLANT DISEASE RECOGNITION'

This gave me an error saying the directory cannot be changed. To solve it, use the %cd magic instead:

%cd /content/drive/My\ Drive/PLANT DISEASE RECOGNITION
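Spaces in Drive folder names are what make shell cd commands fragile here. `os.chdir` takes the path as a single Python string, so no escaping is needed at all; a sketch using a temporary folder as a stand-in for the Drive path:

```python
import os
import tempfile

# os.chdir receives the path as one string, so spaces need no
# backslash escaping or quoting, unlike the shell's cd.
target = os.path.join(tempfile.mkdtemp(), 'PLANT DISEASE RECOGNITION')
os.makedirs(target)
os.chdir(target)
print(os.getcwd().endswith('PLANT DISEASE RECOGNITION'))  # True
```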
Detention answered 10/4, 2022 at 1:51
For some reason you have to %cd into your Google Drive folder and then execute your code in order to access files from your Drive or write files there.

First mount your Google Drive:

from google.colab import drive
drive.mount('/gdrive', force_remount=True)

Then cd into the Drive folder (note the path must match the mount point above) and run your code:

%cd /gdrive/My\ Drive/
directory='./Convolutional_Neural_Networks/dataset/training_set'
Willis answered 20/6, 2020 at 18:17
Try removing "content" from the start of the path; after an hour of troubleshooting, this worked for me:

cd drive/My Drive/dissertation
Ayo answered 17/7, 2021 at 19:8
After following the mount drive advice:

from google.colab import drive
drive.mount('/content/drive', force_remount=True)

I realised that referencing the dataset file directly, by name, didn't work, but loading its parent directory did.

This didn't work:

dataset = load_dataset("/content/drive/MyDrive/my_filename.json")

This did work:

dataset = load_dataset("/content/drive/MyDrive")
Tarttan answered 2/1, 2023 at 11:4

© 2022 - 2024 — McMap. All rights reserved.