Which Google Cloud Platform service is the easiest for running Tensorflow?

While working on the Udacity Deep Learning assignments, I ran into a memory problem and need to switch to a cloud platform. I have worked with AWS EC2 before, but now I would like to try Google Cloud Platform (GCP). I will need at least 8 GB of memory. I know how to use Docker locally but have never tried it in the cloud.

  1. Is there any ready-made solution for running Tensorflow on GCP?
  2. If not, which service (Compute Engine or Container Engine) would make it easier to get started?
  3. Any other tips are also appreciated!
Jeroboam answered 28/4, 2016 at 13:49 Comment(8)
AWS has GPU instances, which should be a much better fit for TensorFlow. By like two orders of magnitude. – Comose
See Cloud Datalab: cloud.google.com/datalab/docs/quickstart – Legging
@MattW. Thanks for the tip. I will definitely consider it for serious projects. – Jeroboam
@Legging Could you elaborate a bit more on that? It seemed a bit complicated for my use case. – Jeroboam
I am not sure what could make it simpler. Enable billing on an account, enable App Engine and Storage, then go to the Datalab launcher and it will start a Google-managed Compute instance which you access through Jupyter notebooks in a web page. It is TensorFlow in the cloud with one click. – Legging
@Legging It worked very easily. I couldn't change the machine type though! – Jeroboam
You can add a query string in the browser to launch a customised instance. That allows you to change the number of CPUs, RAM, disk storage... – Legging
That's very convenient. One more question: how can I install Python packages? I needed to install the 'pillow' package but couldn't figure out how. – Jeroboam

Summing up the answers:

Instructions to manually run TensorFlow on Compute Engine:

  1. Create a project
  2. Open the Cloud Shell (a button at the top)
  3. List machine types: gcloud compute machine-types list. You can change the machine type I used in the next command.
  4. Create an instance:
gcloud compute instances create tf \
  --image container-vm \
  --zone europe-west1-c \
  --machine-type n1-standard-2
  5. Run sudo docker run -d -p 8888:8888 --name tf b.gcr.io/tensorflow-udacity/assignments:0.5.0 (change the image name to the desired one)
  6. Find your instance in the dashboard and edit the default network: add a firewall rule to allow your IP as well as protocol and port tcp:8888 (a gcloud alternative is sketched after this list).
  7. Find the External IP of the instance from the dashboard. Open IP:8888 in your browser. Done!
  8. When you are finished, delete the created instance to avoid charges.
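
If you prefer to stay in the Cloud Shell, steps 6 and 8 can also be done with gcloud. A minimal sketch, assuming the instance and zone from step 4; the rule name allow-tf-8888 is just an example and YOUR_IP stands for your own public address:

# Step 6: open tcp:8888 on the default network to your IP only
gcloud compute firewall-rules create allow-tf-8888 \
  --network default \
  --allow tcp:8888 \
  --source-ranges YOUR_IP/32

# Step 8: delete the instance when you are done to avoid charges
gcloud compute instances delete tf --zone europe-west1-c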

This is how I did it and it worked. I am sure there is an easier way to do it.

Good to know

  • "The contents of your Cloud Shell home directory persist across projects between all Cloud Shell sessions, even after the virtual machine terminates and is restarted"
  • To list all available image versions: gcloud compute images list --project google-containers

Thanks to @user728291, @MattW, @CJCullen, and @zain-rizvi

Jeroboam answered 29/4, 2016 at 0:36 Comment(7)
In the step-by-step instructions to run TensorFlow on Compute Engine, number 6, what do you call your IP? – Boiled
The IP of your computer :) – Jeroboam
Should this: i.imgur.com/5hgPeO2.png be enough, 192.168.1.14 being my IP? Because I still can't get access... – Boiled
That's your local IP. A friendly piece of advice though: if you don't know what 192.168 is, you shouldn't work in the cloud. – Jeroboam
... Thank you anyway. It's true, I don't know the first thing about computer networking and firewalls, but I am planning to learn, and without that knowledge I already succeeded in running an IPython notebook on an AWS EC2 instance. Now it is all for the purpose of the Udacity assignment, but I guess switching to Linux would make things simpler. Thank you again for your time. – Boiled
I didn't mean to discourage you, just for the sake of security :S. Cheers and good luck :) – Jeroboam
Note that, as of the end of September, the Google Cloud Machine Learning service is open to all users as a beta product. – Coucal

Google Cloud Machine Learning is open to the world in Beta form today. It provides TensorFlow as a Service so you don't have to manage machines and other raw resources. As part of the Beta release, Datalab has been updated to provide commands and utilities for machine learning. Check it out at: http://cloud.google.com/ml.
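
In practice, using the service amounts to packaging your TensorFlow trainer as a Python module and submitting it as a job. A rough sketch using the later gcloud ml-engine surface (the Beta originally shipped under gcloud beta ml); the job name, trainer package, region, and bucket below are placeholders:

gcloud ml-engine jobs submit training my_tf_job \
  --module-name trainer.task \
  --package-path ./trainer \
  --region us-central1 \
  --job-dir gs://my-bucket/my_tf_job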

Conciliate answered 29/9, 2016 at 17:24 Comment(0)

Google has a Cloud ML platform in a limited Alpha.

Here is a blog post and a tutorial about running TensorFlow on Kubernetes/Google Container Engine.

If those aren't what you want, the TensorFlow tutorials should all be able to run on either AWS EC2 or Google Compute Engine.
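
If you try the Kubernetes/Container Engine route, the rough shape is: create a small cluster, run the TensorFlow image, and expose the notebook port. A sketch, assuming the Udacity image used in the answer above and the older kubectl behaviour where run creates a Deployment; the cluster name and zone are placeholders:

gcloud container clusters create tf-cluster \
  --num-nodes 1 \
  --machine-type n1-standard-2 \
  --zone europe-west1-c

kubectl run tf --image=b.gcr.io/tensorflow-udacity/assignments:0.5.0 --port=8888
kubectl expose deployment tf --type=LoadBalancer --port=8888
kubectl get service tf    # note the EXTERNAL-IP, then open IP:8888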

Cytochemistry answered 28/4, 2016 at 20:30 Comment(1)
It is now in open beta: cloud.google.com/blog/big-data/2016/09/… – Touzle

You can now also use the pre-configured Deep Learning images. They have everything that is required for TensorFlow.
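
A sketch of how that looks from the Cloud Shell; the deeplearning-platform-release project is where these images are published, and tf-latest-cpu is an example image family to verify against the list:

# See which Deep Learning images are currently available
gcloud compute images list --project deeplearning-platform-release

# Create a VM from a TensorFlow image family (family name is an example; check the list above)
gcloud compute instances create tf-dl-vm \
  --zone europe-west1-c \
  --machine-type n1-standard-2 \
  --image-family tf-latest-cpu \
  --image-project deeplearning-platform-release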

Mascon answered 9/8, 2018 at 21:30 Comment(0)

This is an old question, but there are new, even easier options now:

If you want to run TensorFlow with Jupyter Lab

Use GCP AI Platform Notebooks, which gives you one-click access to a JupyterLab notebook with TensorFlow pre-installed (you can also use PyTorch, R, or a few other frameworks instead if you prefer).
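
Besides the one-click console flow, these notebooks can also be created from the command line. A rough sketch, assuming the gcloud notebooks command group (older SDKs expose it under gcloud beta notebooks) and a TensorFlow 2 image family; the instance name, image family, and location are examples to check against current docs:

gcloud notebooks instances create my-tf-notebook \
  --vm-image-project=deeplearning-platform-release \
  --vm-image-family=tf2-latest-cpu \
  --machine-type=n1-standard-4 \
  --location=us-central1-a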

If you just want to use a raw VM

If you don't care about Jupyter Lab and just want a raw VM with TensorFlow pre-installed, you can instead create a VM using GCP's Deep Learning VM Image. These DLVM images give you a VM with TensorFlow pre-installed and are all set up to use GPUs if you want. (The AI Platform Notebooks use these DLVM images under the hood.)
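
For example, a GPU-ready VM from one of these images can be created roughly as below; the image family, GPU type, and zone are assumptions to adjust for your project (the install-nvidia-driver metadata key asks the image to install the driver on first boot):

gcloud compute instances create tf-gpu-vm \
  --zone us-west1-b \
  --machine-type n1-standard-8 \
  --image-family tf-latest-gpu \
  --image-project deeplearning-platform-release \
  --maintenance-policy TERMINATE \
  --accelerator type=nvidia-tesla-k80,count=1 \
  --metadata install-nvidia-driver=True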

If you'd like to run it on both your laptop and the cloud

Finally, if you want to be able to run TensorFlow both on your personal laptop and in the cloud and are comfortable using Docker, you can use GCP's Deep Learning Container Images. They contain the exact same setup as the DLVM images, but packaged as a container instead, so you can launch them anywhere you like.

Extra benefit: If you're running this container image on your laptop, it's 100% free :D
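
Locally, launching one of these containers is a single docker command. A sketch, assuming a TensorFlow CPU image from the public deeplearning-platform-release registry (list the current names first, since the exact tags change) and JupyterLab served on port 8080:

# See which container images are currently published
gcloud container images list --repository=gcr.io/deeplearning-platform-release

# Run a TensorFlow CPU container, mounting the current directory as the notebook home
docker run -d -p 8080:8080 \
  -v "$(pwd)":/home/jupyter \
  gcr.io/deeplearning-platform-release/tf2-cpu

# Then open http://localhost:8080 in your browser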

Lobo answered 2/8, 2019 at 22:52 Comment(0)

I'm not sure there is a need for you to stay on Google Cloud Platform. If you are able to use other products, you might save a lot of time, and some money.

If you are using TensorFlow, I would recommend a platform called TensorPort. It is exclusively for TensorFlow and is the easiest platform I am aware of. Code and data are loaded with Git, and they provide a Python module for automatically toggling paths between the remote machine and your local machine. They also provide some boilerplate code for setting up distributed computing if you need it. Hope this helps.

Arum answered 2/8, 2017 at 18:18 Comment(0)
