While looking through the pricing of some cloud computing hosting services like Google App Engine, Amazon, etc., I see terms like $0.0x per instance per hour. What exactly does that mean? Does an instance equal X page views, or is there some other way to estimate how many instances I would need?
Generally 1 instance == 1 machine/server (often a virtual machine).
See e.g. http://aws.amazon.com/ec2/instance-types/ and https://developers.google.com/appengine/docs/adminconsole/instances
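Since pricing is per instance-hour, a rough monthly estimate is just instances × hours × hourly rate. An instance is not a fixed number of page views; how much traffic one instance can serve depends entirely on your app, so you measure that (e.g. with a load test) and work backwards. A minimal back-of-envelope sketch in Python, where the hourly rate, traffic level, and per-instance throughput are all placeholder numbers you would replace with your provider's pricing and your own measurements:

```python
import math

# All numbers below are placeholders, not real prices or benchmarks --
# substitute your provider's rate and your app's measured throughput.

HOURLY_RATE = 0.05     # $ per instance per hour (see your provider's pricing page)
HOURS_PER_MONTH = 730  # average hours in a month

peak_requests_per_sec = 40          # your expected peak traffic
requests_per_sec_per_instance = 25  # what one instance handles (load-test to find out)

# Enough instances to cover peak load, rounded up to a whole instance.
instances_needed = math.ceil(peak_requests_per_sec / requests_per_sec_per_instance)
monthly_cost = instances_needed * HOURS_PER_MONTH * HOURLY_RATE

print(f"instances needed: {instances_needed}")         # -> 2
print(f"estimated monthly cost: ${monthly_cost:.2f}")  # -> $73.00
```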
The hierarchy is: cloud -> data centers -> host computers -> instances (virtual machines).
Consider an example to understand each term. Take a public cloud like AWS or Google App Engine: each public cloud has many data centers in different geographical locations, and each data center houses the servers, storage disks, networking gear, and other hardware required to provide the cloud services.
Within each data center there are groups (clusters) of dedicated hardware that provide specialized services or processes; these machines are known as hosts.
The instance type determines the hardware of the host computer used for your instance. Each instance type offers different compute, memory, and storage capabilities, and instance types are grouped into instance families based on these capabilities. An instance is a kind of virtual environment used for running the user's processes or applications.
Whenever a user wants to use a particular service or deploy a certain kind of app on the cloud, the user creates an instance of the appropriate type.
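For example, on AWS this "create an instance of a particular type" step maps to EC2's RunInstances API. A minimal sketch using boto3 (the AMI ID below is a placeholder, and your AWS credentials/region are assumed to be configured already):

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch one instance of a chosen type; the instance type selects
# the hardware profile (CPU, memory, storage) described above.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI -- use a real image ID for your region
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print("Launched instance:", instance_id)
```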
For further information, refer to the following links.
For AWS: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instance-types.html
For Google Cloud Platform: https://cloud.google.com/appengine/docs/java/how-instances-are-managed