In general, my indexing strategy would be something like this (I'm using SQL Server exclusively for now - adapt to your own database system as needed):
- Pick a good clustering key - not a GUID, not a VARCHAR(250) or the like. A good clustering key is narrow, unique, stable, and ever-increasing - something like an INT IDENTITY is perfect. Make this your clustered primary key -> gives you your first index on the table.
- For any column that is being used as a foreign key into another table, add an index. It can be either a single-column index or a compound index - whatever works best for your case. It's important that the foreign key column be the first column in that index (if you're using a compound index) - otherwise, the benefits for JOINs or for checking referential integrity won't be available to your system. (Both points are sketched in the example right after this list.)
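As a rough sketch of those two starting points - the table and column names here (dbo.Customers, dbo.Orders, etc.) are purely illustrative, not taken from your schema:

```sql
-- Parent table: narrow, unique, stable, ever-increasing clustering key
CREATE TABLE dbo.Customers
(
    CustomerID INT IDENTITY(1,1) NOT NULL,
    Name       NVARCHAR(100)     NOT NULL,
    CONSTRAINT PK_Customers PRIMARY KEY CLUSTERED (CustomerID)
);

-- Child table: same pattern for its own clustered PK, plus a nonclustered
-- index on the foreign key column (FK column first, if you make it compound)
CREATE TABLE dbo.Orders
(
    OrderID     INT IDENTITY(1,1) NOT NULL,
    CustomerID  INT               NOT NULL,
    OrderDate   DATETIME2(0)      NOT NULL,
    TotalAmount DECIMAL(18,2)     NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID),
    CONSTRAINT FK_Orders_Customers FOREIGN KEY (CustomerID)
        REFERENCES dbo.Customers (CustomerID)
);

CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
    ON dbo.Orders (CustomerID);
```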
And that's it for now.
Then: run your system - observe, and measure - establish a baseline. Is the app fast enough? If yes -> you're done - go home and enjoy your spare time.
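One lightweight way to establish that baseline is to capture timing and I/O statistics for a few representative queries - a sketch, again with made-up table and column names:

```sql
-- Measure a representative query; note the logical reads and CPU/elapsed
-- time reported in the Messages tab - that's the baseline to compare
-- against after each index change. (dbo.Orders / CustomerID are placeholders.)
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT OrderID, OrderDate, TotalAmount
FROM dbo.Orders
WHERE CustomerID = 42;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```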
If not: start collecting data and indications as to why the app isn't fast enough. Look at things like the DMVs in SQL Server that tell you about the worst-performing queries, or the missing-index DMVs (example queries below). Analyze those. See what you could improve. Add one index at a time and again: observe, measure, compare to your baseline.
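For instance, the "worst queries" and "missing index" DMVs can be queried roughly like this - a sketch; the column choices and ordering are just one reasonable starting point, and the missing-index suggestions should be treated as hints, not commands:

```sql
-- Top 10 statements by total CPU time since the last restart / plan eviction
SELECT TOP (10)
    qs.execution_count,
    qs.total_worker_time / qs.execution_count AS avg_cpu_time,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;

-- Missing-index suggestions, roughly ranked by potential impact
SELECT
    mid.statement AS table_name,
    mid.equality_columns,
    mid.inequality_columns,
    mid.included_columns,
    migs.user_seeks,
    migs.avg_total_user_cost,
    migs.avg_user_impact
FROM sys.dm_db_missing_index_details AS mid
JOIN sys.dm_db_missing_index_groups AS mig
    ON mig.index_handle = mid.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs
    ON migs.group_handle = mig.index_group_handle
ORDER BY migs.user_seeks * migs.avg_total_user_cost * migs.avg_user_impact DESC;
```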
If you see an improvement -> leave that index in place, and that measurement becomes your new baseline. Rinse and repeat until you (and your users) are happy with the app's performance (and then go home and enjoy your time off).
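The "one index at a time" step might look like this - again a sketch with hypothetical names; the actual key and included columns come out of your analysis of the DMV data:

```sql
-- Add exactly one nonclustered index suggested by the analysis above
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_OrderDate
    ON dbo.Orders (CustomerID, OrderDate)
    INCLUDE (TotalAmount);

-- Re-run the representative queries and compare to the baseline.
-- No improvement (or things got worse)? Drop it again:
-- DROP INDEX IX_Orders_CustomerID_OrderDate ON dbo.Orders;
```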
Over-indexing in SQL Server can be worse than not having any indexes. Don't start out with too many indexes! Only establish a good clustered PK and the nonclustered foreign key indexes - that's all - then observe, measure, optimize & repeat that cycle.