We have some queries that return even more data, in some cases well over 6 million records. With the larger datasets, though, I am careful to include only the bare minimum of fields needed. Excluding text fields in particular boosts performance, as they can be quite large. Reducing the selected columns is what makes those queries practical to run at all.
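To make that concrete, here is a minimal sketch assuming a hypothetical `orders` table with a wide `notes` text column; none of these names come from a real schema:

```sql
-- Hypothetical wide table:
--   orders(id, customer_id, total, status, created_at, notes)
-- Listing only the needed columns keeps the large "notes" text
-- field out of the result set entirely:
SELECT id, customer_id, total
FROM orders;

-- By contrast, this drags the big text column across the network
-- for every one of those millions of rows:
-- SELECT * FROM orders;
```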
You can basically reduce the result size in two ways: pare down the list of columns you select, or cut the number of rows by filtering in your WHERE clause. If you ask for all of the data you will eventually get it, but it will take longer. I limit all of our queries to just what we need so they run faster; doing a `SELECT *` on some tables takes a very long time. Time spent tuning your query to return the minimal dataset will yield better results than any server changes, because at some point you are constrained by raw network speed rather than server performance.
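Putting both reductions together, a hedged sketch against the same hypothetical `orders` table, trimming the columns and the rows at once (the `status` and `created_at` filter columns are assumptions, not anything from a real schema):

```sql
-- A narrow column list plus a WHERE filter, so only the rows and
-- columns actually needed ever leave the server:
SELECT id, customer_id, total
FROM orders
WHERE status = 'open'              -- hypothetical status filter
  AND created_at >= '2024-01-01';  -- hypothetical date filter
```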