Do not keep searching for recent

If userA has a lot of recent files but only shares one file with userB
(who has no files at all), we could keep searching until we run out of
recent files for userA.

Now assume the inactive userB has 20 incoming shares like that from
different users. getRecent then basically keeps consuming huge amounts
of resources, and with each iteration the load on the DB increases
(because of the growing offset).

This makes sure we do not search through more than 3 times the limit we
are asked for, or run more than 5 queries.

This means we might miss some recent entries, but we should fix that
separately. This change is just to make sure the load on the DB stays
sane.

Signed-off-by: Roeland Jago Douma <roeland@famdouma.nl>
rullzer authored and Backportbot committed Jul 17, 2019
commit 7bd081ff8417981a673f635acf154aa72947c2d7
7 changes: 6 additions & 1 deletion lib/private/Files/Node/Folder.php

```diff
@@ -383,6 +383,8 @@ public function getRecent($limit, $offset = 0) {
 		// Search in batches of 500 entries
 		$searchLimit = 500;
 		$results = [];
+		$searchResultCount = 0;
+		$count = 0;
 		do {
 			$searchResult = $this->recentSearch($searchLimit, $offset, $storageIds, $folderMimetype);
 
@@ -391,14 +393,17 @@ public function getRecent($limit, $offset = 0) {
 				break;
 			}
 
+			$searchResultCount += count($searchResult);
+
 			$parseResult = $this->recentParse($searchResult, $mountMap, $mimetypeLoader);
 
 			foreach ($parseResult as $result) {
 				$results[] = $result;
 			}
 
 			$offset += $searchLimit;
-		} while (count($results) < $limit);
+			$count++;
+		} while (count($results) < $limit && ($searchResultCount < (3 * $limit) || $count < 5));
 
 		return array_slice($results, 0, $limit);
 	}
```
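The effect of the new loop bound can be illustrated with a small self-contained sketch. The `recentSearch()`/`recentParse()` stand-ins below are hypothetical simplifications (a plain array instead of the filecache query, a `visible` flag instead of mount resolution); only the `do/while` condition mirrors the actual change. It simulates the worst case from the commit message: a backend with many rows of which none are visible to the querying user.

```php
<?php
// Hypothetical stand-in for the batched DB query: page through $rows.
function recentSearch(int $limit, int $offset, array $rows): array {
	return array_slice($rows, $offset, $limit);
}

// Hypothetical stand-in for recentParse(): keep only rows the user may
// see. In the worst case (inactive user with only incoming shares),
// this filters out almost everything.
function recentParse(array $batch): array {
	return array_filter($batch, fn ($row) => $row['visible']);
}

// Bounded variant of the loop, mirroring the condition from the diff:
// stop once we have enough results, or once we have both scanned
// 3 * $limit rows and issued 5 queries.
function getRecentBounded(int $limit, array $rows): array {
	$searchLimit = 500;
	$offset = 0;
	$results = [];
	$searchResultCount = 0;
	$count = 0;
	do {
		$searchResult = recentSearch($searchLimit, $offset, $rows);
		if (empty($searchResult)) {
			break;
		}
		$searchResultCount += count($searchResult);
		foreach (recentParse($searchResult) as $result) {
			$results[] = $result;
		}
		$offset += $searchLimit;
		$count++;
	} while (count($results) < $limit && ($searchResultCount < (3 * $limit) || $count < 5));
	return [array_slice($results, 0, $limit), $count];
}

// 100000 rows, none visible: the unbounded loop would issue 200
// queries with ever-growing offsets. With $limit = 25, the bound
// (2500 rows scanned >= 3 * 25, 5 queries issued) stops it after 5.
$rows = array_fill(0, 100000, ['visible' => false]);
[$results, $queries] = getRecentBounded(25, $rows);
```

Note that with the `||` in the condition, the loop only stops once *both* bounds are exceeded, which is why the worst case here still costs 5 queries rather than 1.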