I agree with the general answer: use a search engine like Sphinx (and limit the number of results returned); they are designed to do exactly what you want.
However, while millions of records may sound like a lot, you should first determine what is actually taking a long time. I have major love for Sphinx and Thinking Sphinx -- they take what is a rather complex process and make it pretty simple and easy. But, in the end, a search engine is another system to manage, configure, learn, and know. If you don't have to go there, it's easier not to, right?
It might be the query itself, or it might be time spent returning the data (LIMIT is your friend!). Or it might be that you're getting hundreds of requests per second because the delay on the auto-complete is too short: if a lookup fires on every character, fast typists or multiple users can easily flood the server with queries that provide no value to the user.
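Both cheap fixes above can be sketched in a few lines. This is a minimal, self-contained illustration assuming a simple prefix match over an in-memory list; in a Rails app the body would be a scoped ActiveRecord query instead, and all the names here (`NAMES`, `MIN_CHARS`, `autocomplete`) are made up for the example:

```ruby
# Hypothetical data set standing in for a database table.
NAMES = %w[apple apricot avocado banana blueberry cherry].freeze

# Don't fire a lookup on every keystroke; wait until the prefix is
# long enough to return a useful, small result set.
MIN_CHARS = 3

def autocomplete(prefix, limit: 10)
  return [] if prefix.length < MIN_CHARS
  # `first(limit)` plays the role of SQL's LIMIT: cap what you send back.
  NAMES.select { |name| name.start_with?(prefix) }.first(limit)
end
```

The same two guards (minimum prefix length, hard result cap) carry over directly to the real query, whatever backend ends up serving it.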
Watch the Rails logs and see what's really going on. If it's a query performance issue (a complicated full-text search, say), then, yeah, that's going to be slow and Sphinx is going to be worth the effort. Your database has an EXPLAIN tool that, with some work, can help you understand what the database is doing to produce the result. It's not uncommon to find that an index isn't being used at all.
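As a hedged illustration (MySQL syntax; the table and column names are hypothetical), running the autocomplete query under EXPLAIN looks like this:

```sql
-- Inspect the query plan instead of guessing:
EXPLAIN SELECT name FROM products WHERE name LIKE 'app%' LIMIT 10;
-- In the output, a NULL in the `key` column means no index was used.
-- Note that a leading-wildcard pattern like '%app%' generally cannot
-- use a normal B-tree index at all, which is exactly the case where a
-- full-text engine such as Sphinx starts to pay for itself.
```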
What about caching? Memcached is a fantastic tool. Or it may be as simple as raising the database's buffer size settings, letting it use more memory for its own caching.
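The read-through caching pattern is worth a sketch. In Rails this is what `Rails.cache.fetch` does for you (backed by Memcached via a client like Dalli); the plain-Ruby version below is a deliberately simplified stand-in with an illustrative in-process store and TTL handling:

```ruby
# Hypothetical in-process store; a real app would use Memcached here.
CACHE = {}

# Read-through fetch: return the cached value if it is still fresh,
# otherwise run the (expensive) block and remember the result.
def fetch(key, ttl: 60)
  entry = CACHE[key]
  return entry[:value] if entry && (Time.now.to_f - entry[:at]) < ttl
  value = yield
  CACHE[key] = { value: value, at: Time.now.to_f }
  value
end
```

For autocomplete, keying the cache on the typed prefix means the hundred people all typing "app" this minute hit the database once, not a hundred times.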