Monday, February 13, 2012

********* How to settle this... *********

I have a big database to test my report; the report is built with Reporting
Services.
But when the query for the report returns too much data, such as more than
100,000 rows in the dataset, my computer goes down because it runs out of memory.
What can I do?

If you are on SQL Server, you can add TOP 20 or so to the query while you test. Like:
SELECT TOP 20 col1, col2
FROM ...
...
But it might be better to tune the query: look at the execution plan, and perhaps create indexes to
support the query, etc. It depends on where the bottleneck is. Perhaps it is the Report Designer (RD)
environment that becomes sluggish with that many rows, and tweaking in SQL Server wouldn't help in that case.
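For example, something like this (the table, column, and index names below are
made up, just to show the idea):

-- Display the estimated plan without executing the query:
SET SHOWPLAN_XML ON;
GO
SELECT col1, col2
FROM dbo.BigTable
WHERE col3 = 42;
GO
SET SHOWPLAN_XML OFF;
GO
-- If the plan shows a table scan, an index on the filtered column may help.
-- INCLUDE makes it a covering index, so the query is answered from the
-- index alone without touching the base table:
CREATE NONCLUSTERED INDEX IX_BigTable_col3
ON dbo.BigTable (col3)
INCLUDE (col1, col2);
GO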
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"Cao tt" <Cao tt@.discussions.microsoft.com> wrote in message
news:4F14BC17-A381-4219-B124-EC2D6D5B3256@.microsoft.com...
> I have a big database to test my report, the report is made by reporting
> service.
> But when the query for the report get too many data such as more than
> 100,000 rows for the dataset, My computer is down because of lacking memory.

Are you wanting the report to return a 100,000-row result set? If so, you are
going to find that there will be problems with RS. It does certain actions
in memory, and that large a result set is a problem. Note that it is the
number of rows returned that matters, not the size of the table on the
back end. I have 20-million-row tables I am going against. The main issue
when going against large tables is: do not use report filters. Filters return
all the data and then filter it. Use query parameters instead, and have only
the data you need downloaded (see the sketch below). If you have aggregates,
have those computed in the query as well. Another useful technique is to use
drill-through instead of drill-down. With drill-down you can end up with huge
datasets; by using drill-through instead you can limit the data.
This issue (a 100,000-row result set) is not really specific to RS. It is
always best to limit the data going to the client. No human is going to look
at 100,000 rows of data.
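For example, with a query parameter the filtering and the aggregation both
happen on the server (the Sales table, columns, and @Region parameter below
are made up, just to illustrate the idea):

-- RS maps the report parameter to @Region, so only the requested
-- region's rows ever leave the server. A report filter, by contrast,
-- would download every row and then discard most of them in memory.
SELECT Region, SUM(SaleAmount) AS TotalSales
FROM dbo.Sales
WHERE Region = @Region
GROUP BY Region;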
Bruce L-C
"Cao tt" <Cao tt@.discussions.microsoft.com> wrote in message
news:4F14BC17-A381-4219-B124-EC2D6D5B3256@.microsoft.com...
> I have a big database to test my report, the report is made by reporting
> service.
> But when the query for the report get too many data such as more than
> 100,000 rows for the dataset, My computer is down because of lacking
memory.
> What can I do ?
