Stephane Eyskens's blog

SharePoint 2010 - Overcome the 10000 default search results limit


I don't know if you've ever noticed, but since SharePoint 2010, search results are limited by default to a maximum of 10,000 results per query.

I had never noticed it before, but for a project, a fellow SharePoint developer, Jeroen Van Bastelaere, and I had to cache all the user profiles, along with a few properties, in memory and make use of that cache later on.

For performance and scalability reasons, we decided to rely on the search engine to build the cached object, but we didn't anticipate the limit of 10,000 items per query. Since we had around 20,000 user profiles, we quickly hit it.

So, basically, it means that if you run this piece of code against a SharePoint 2010 Search Service Application:

FullTextSqlQuery PeopleQuery = new FullTextSqlQuery(ServerContext.Current);
PeopleQuery.QueryText = "SELECT AccountName FROM Scope() WHERE \"Scope\"='People'";
PeopleQuery.ResultTypes = ResultType.RelevantResults;
PeopleQuery.RowLimit = 10001; // deliberately above the default 10,000 cap
ResultTableCollection PeopleQueryResults = PeopleQuery.Execute();
ResultTable PeopleQueryResultsTable = PeopleQueryResults[ResultType.RelevantResults];

You'll end up with an error which is admittedly not very meaningful. It is caused by the RowLimit being above 10,000; indeed, running the same code with a RowLimit of 10,000 (or less) works correctly. Also, trying to specify a StartRow won't help...
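Staying at or below the cap, here is a sketch of how the results could be consumed to build an in-memory cache like the one mentioned above (the cache shape and the DataTable round-trip are my own illustration, not taken from the original project):

```csharp
// Sketch: build an in-memory account-name cache from People search results.
// Same query as above, but with RowLimit within the default 10,000 cap.
using System;
using System.Collections.Generic;
using System.Data;
using Microsoft.Office.Server;
using Microsoft.Office.Server.Search.Query;

class ProfileCacheBuilder
{
    public static List<string> BuildCache()
    {
        FullTextSqlQuery PeopleQuery = new FullTextSqlQuery(ServerContext.Current);
        PeopleQuery.QueryText = "SELECT AccountName FROM Scope() WHERE \"Scope\"='People'";
        PeopleQuery.ResultTypes = ResultType.RelevantResults;
        PeopleQuery.RowLimit = 10000; // at or below the default cap, so no error

        ResultTableCollection results = PeopleQuery.Execute();
        ResultTable table = results[ResultType.RelevantResults];

        // ResultTable implements IDataReader, so it can be loaded into a DataTable.
        DataTable dt = new DataTable();
        dt.Load(table, LoadOption.OverwriteChanges);

        List<string> cache = new List<string>();
        foreach (DataRow row in dt.Rows)
            cache.Add((string)row["AccountName"]);
        return cache;
    }
}
```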

This limit is new in SharePoint 2010; its primary purpose is to prevent SQL Server from executing expensive queries and to protect against potential attacks...

Here is how you can overcome it, but you should of course do so with caution, and only within a fully controlled environment, to avoid putting your environment at risk.

Open a SharePoint PowerShell console and execute these commands:

$ssa = Get-SPEnterpriseSearchServiceApplication
$ssa.UpdateSetting("Config:qp_MaxResultsReturned", 20000)

Specify a limit other than 10,000 (here, 20,000 for instance).

You must also perform an IISRESET on the server hosting the Search Service Application for the change to take effect.
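If you prefer the object model over PowerShell, the same change can presumably be made in C# through UpdateSetting on the SearchServiceApplication object. The way the service application is located below is an assumption on my part; adapt the lookup to your farm:

```csharp
// Sketch: raise the result cap through the object model instead of PowerShell.
// The service/application lookup is an assumption; adjust the name to your farm.
using Microsoft.SharePoint.Administration;
using Microsoft.Office.Server.Search.Administration;

class RaiseResultCap
{
    static void Main()
    {
        SearchService searchService = SPFarm.Local.Services.GetValue<SearchService>();
        SearchServiceApplication searchApp = (SearchServiceApplication)
            searchService.Applications.GetValue("Search Service Application");

        // Same setting as the PowerShell tip; an IISRESET is still required afterwards.
        searchApp.UpdateSetting("Config:qp_MaxResultsReturned", 20000);
        searchApp.Update(); // persist the change
    }
}
```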

Digging a bit deeper into the darkness of SharePoint Search, I found other config properties of type SearchSettingDatabaseType.Config that can probably be adjusted to specific needs. I wrote a small console program that lists them all along with their values:

static void Main(string[] args)
{
    // The config properties listed in the output below.
    string[] properties = new string[] {
        "Config:qp_SecTrimCacheSize", "Config:qp_MaxResultsReturned",
        "Config:qp_SecTrimMultiplier", "Config:qp_NearDupMultiplier",
        "Config:qp_JoinMultiplier", "Config:qp_JoinSdidMultiplier",
        "Config:qp_UseSqlFirstJoinStrategy", "Config:qp_AvoidSqlOuterJoins",
        "Config:ForceAliasNormalisationEnabled", "Config:UseDefaultDecimalPlaces" };

    SPFarm farm = SPFarm.Local;
    SearchService searchService = farm.Services.GetValue<SearchService>();
    SearchServiceApplication searchApp = (SearchServiceApplication)
        searchService.Applications.GetValue("Search Service Application");
    foreach (string property in properties)
        Console.WriteLine("{0} : {1}", property, searchApp.GetSetting(property));
}

Which outputs this:

Config:qp_SecTrimCacheSize : 10000
Config:qp_MaxResultsReturned : 20000 => adjusted via the tip above (default is 10000)
Config:qp_SecTrimMultiplier : 2.6
Config:qp_NearDupMultiplier : 1.8
Config:qp_JoinMultiplier : 10
Config:qp_JoinSdidMultiplier : 1.01
Config:qp_UseSqlFirstJoinStrategy : False
Config:qp_AvoidSqlOuterJoins : False
Config:ForceAliasNormalisationEnabled : False
Config:UseDefaultDecimalPlaces : False

I would of course strongly advise against changing any of these properties without first making an impact analysis, but it's worth knowing that there are ways to fine-tune and adjust some of these parameters!

Happy Coding & Configuring!


not all the words are searchable

I'm running SharePoint 2010 Foundation. I've uploaded a file into Shared Documents containing all English words (about 160,000). Some of the words cannot be found; the message from Search is "We did not find any results for...". In fact, it looks like only about the first 70,000 of these words are searchable, but I cannot be completely sure about that. Is that because of the limited result size, or does it have to do with some indexing problem?


Search limits


That is not linked to the search result limits but to indexing indeed; it has always been the case :). Here is a link describing that behavior:


Best Regards

file not completely indexed

Thanks for the answer. The problem is, my file is only 1.7 MB, far below the 16 MB boundary, and yet fewer than half of the words are searchable. Any idea why?

File not indexed correctly


I'd still have a look at the crawl logs... I know you can increase that limit, but you can also decrease it. I suppose that's not the case in your environment, but you might consider having a look.

Best Regards

Other ways to overcome the result limit

Is it possible to page through more than 10,000 items in steps of, say, 2,000, thereby avoiding the 10,000 result limit?



Well, there are two ways:

1) You get all the results back in the datatable from search, even if there are more than 10,000, and you apply paging logic on your datatable. The drawback is that you bring back everything, which consumes memory and possibly bandwidth.

2) If your context permits, you can issue several queries one after the other to retrieve batches of, say, 200 records, but that implies having conditions on indexed values, making queries such as "WHERE index > 0 AND index < 201", etc.
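A sketch of option 2, assuming a managed property (here called "UserIndex", a purely hypothetical name) that carries a roughly sequential value you can range over; substitute a property that actually exists in your search schema:

```csharp
// Sketch of option 2: issue successive queries, each constrained to a range of
// an indexed property, so that no single query exceeds the 10,000 result cap.
// "UserIndex" is a hypothetical managed property name; substitute your own.
using System;
using System.Collections.Generic;
using System.Data;
using Microsoft.Office.Server;
using Microsoft.Office.Server.Search.Query;

class BatchedPeopleSearch
{
    const int BatchSize = 200;

    public static List<string> GetAllAccountNames(int totalExpected)
    {
        List<string> accounts = new List<string>();
        for (int lower = 0; lower < totalExpected; lower += BatchSize)
        {
            FullTextSqlQuery q = new FullTextSqlQuery(ServerContext.Current);
            q.QueryText = string.Format(
                "SELECT AccountName FROM Scope() WHERE \"Scope\"='People' " +
                "AND UserIndex > {0} AND UserIndex < {1}",
                lower, lower + BatchSize + 1);
            q.ResultTypes = ResultType.RelevantResults;
            q.RowLimit = BatchSize; // each batch stays well under the cap

            ResultTable table = q.Execute()[ResultType.RelevantResults];
            DataTable dt = new DataTable();
            dt.Load(table, LoadOption.OverwriteChanges);
            foreach (DataRow row in dt.Rows)
                accounts.Add((string)row["AccountName"]);
        }
        return accounts;
    }
}
```

The design choice here is to trade one large query for many cheap ones; each batch is independent, so a transient failure only costs you one range rather than the whole result set.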

Best Regards