We recently changed some of the system requirements for a lightweight application (it is essentially a thin GUI client that connects to a "mainframe" running IBM UniVerse). We didn't change our minimum requirements at all, but we changed our recommended requirements to match those of Windows 7 and Vista (since those are the machines we run on).
Some system requirements are fairly easy to determine (e.g., network card, hard drive space, etc.), but CPU and RAM are harder to nail down.
Our current minimum requirements for CPU and RAM both state that you must meet the minimums for your operating system. That seems fairly reasonable to us, since the app uses only about 15 MB of active memory and very little CPU (it's a simple GUI, in this case), and no one complains about it.
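(For context, by "active memory" and CPU usage I mean figures you could get by just watching the process. A rough sketch of that kind of measurement is below; it assumes Python with the third-party psutil package, and "ThinClient.exe" is only a hypothetical stand-in for the real process name.)

```python
# Rough sketch: sample the client's memory and CPU usage while it runs.
# Assumes the psutil package is installed (pip install psutil) and that
# "ThinClient.exe" is the process name -- substitute your own.
import psutil

def sample_usage(process_name="ThinClient.exe"):
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            rss_mb = proc.memory_info().rss / (1024 * 1024)  # resident set size in MB
            cpu_pct = proc.cpu_percent(interval=1.0)         # % of one core over a 1 s window
            print(f"{process_name}: {rss_mb:.1f} MB RSS, {cpu_pct:.1f}% CPU")

sample_usage()
```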
When it comes to recommended requirements, though, we've run into trouble nailing down specifics, especially nowadays, when saying "minimum 1.6 GHz" (or similar) can mean almost anything once you start talking about multi-core processors, Atom processors, and so on. The thin client is also starting to do more intensive work (for example, it now contains an embedded web browser to display more user-friendly HTML pages).
- What would be a good way to go about determining recommended values for CPU and RAM?
- Do you take the recommended values for the OS and add your own usage on top (so would we then say 1 GB for Vista machines)? A rough sketch of that arithmetic follows these questions.
- Is there a better way to do so?
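To illustrate the "OS recommendation plus our usage" idea from the second question, here is one possible way to turn it into a number. This is only a Python sketch; the OS figure, the app's peak usage, and the headroom factor are placeholder assumptions for illustration, not measured values.

```python
# One way to derive a recommended RAM figure: take the OS's own recommended
# RAM, add the app's measured peak usage with some headroom for growth, and
# round up to the next common step so the result reads like a spec sheet.
def recommended_ram_mb(os_recommended_mb, app_peak_mb, headroom=1.5):
    raw = os_recommended_mb + app_peak_mb * headroom
    step = 512  # round up to the next 512 MB
    return ((int(raw) + step - 1) // step) * step

# Example with assumed numbers: Vista recommends 1024 MB, and suppose the
# client peaks at ~60 MB with the embedded browser open.
print(recommended_ram_mb(1024, 60))  # -> 1536
```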
(Note: this is similar in nature to the server question here, but from an application standpoint instead.)