# suiteanalytics
scottvonduhn
@KRISHNA are you doing a select * from that table? We are on a shared server, but I am able to get half a million records back from one table in 3 minutes, but I am doing a targeted select of just the fields I need. If there are large text fields in the table, it can seriously slow down the response time.
KRISHNA
@scottvonduhn I have more than 6 million transaction lines. The thing is, I have worked with large datasets before that had more than 10 million rows, yet pulling a year of sales data through this ODBC connection still takes a long time.
So, when your results are more than a million rows, how are you able to run the query from your end, given that you are on a shared server?
scottvonduhn
We have some queries that return even more data, in some cases well over 6 million records. However, with the larger datasets I am careful to include only the bare minimum fields needed. Excluding text fields in particular boosts performance, as they can be quite large. We are able to run these queries by reducing the selected columns. You can reduce the result size in two main ways: pare down the list of columns you select, or reduce the rows by filtering in your WHERE clause. If you ask for all of the data, you will eventually get it, but it will take longer. I limit all of our queries to just what we need, which makes them run faster. Doing a select * on some tables does take a very long time. Time spent tuning your query to return the minimal dataset will yield better results than any server changes, since at some point you are constrained by raw network speed rather than server performance.
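To illustrate the two reduction techniques above, here is a minimal sketch. It uses Python's stdlib sqlite3 with an in-memory table standing in for a transaction-lines table; the real SuiteAnalytics connection would go through an ODBC driver instead, and the table and column names here are illustrative assumptions, not NetSuite's actual schema.

```python
import sqlite3

# In-memory demo table standing in for a transaction-lines table.
# (A real SuiteAnalytics query would run over an ODBC connection;
# names here are hypothetical, not NetSuite's schema.)
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE transaction_lines (
        id       INTEGER PRIMARY KEY,
        trandate TEXT,
        item     TEXT,
        amount   REAL,
        memo     TEXT   -- large free-text field we want to avoid pulling
    )
""")
conn.executemany(
    "INSERT INTO transaction_lines (trandate, item, amount, memo) "
    "VALUES (?, ?, ?, ?)",
    [("2023-01-15", "WIDGET", 19.99, "x" * 1000),
     ("2023-06-02", "GADGET", 5.50,  "y" * 1000),
     ("2024-02-10", "WIDGET", 19.99, "z" * 1000)],
)

# Instead of SELECT * (which drags the large memo column over the wire),
# select only the needed columns and filter the rows to one year.
rows = conn.execute("""
    SELECT trandate, item, amount
    FROM transaction_lines
    WHERE trandate >= '2023-01-01' AND trandate < '2024-01-01'
""").fetchall()
print(rows)  # only 2023 rows, 3 narrow columns each
```

The same shape applies over ODBC: every column you drop (especially wide text fields) and every row the WHERE clause excludes is data that never has to cross the network.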
KRISHNA
Thanks for sharing the info. We are upgrading to t3 not just because of ODBC performance but also to handle larger transaction volumes and customizations.