FIX: Reduce input of to_tsvector to follow limits (#13806)

Long posts may have `cooked` fields that produce tsvectors longer than
the maximum size of 1MiB (1,048,576 bytes). This commit uses just the
first million characters of the scrubbed cooked text for indexing.

Truncating the input to exactly 1 MiB (1_048_576 characters) would not be
enough, because the output tsvector can sometimes be slightly longer than
its input. Capping the input at 1,000,000 characters leaves some breathing
room below the limit.
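
As a minimal sketch of the idea (the constant names and helper below are
illustrative only, not Discourse's actual API), the fix simply slices the
scrubbed text before it is handed to `to_tsvector`:

    # Hypothetical illustration of the truncation; names are not from Discourse.
    MAX_TSVECTOR_BYTES = 1_048_576   # hard PostgreSQL limit on a tsvector
    INPUT_CHAR_LIMIT   = 1_000_000   # input cap, leaving headroom for the output

    def truncate_for_indexing(scrubbed_text)
      # Ruby ranges are inclusive, so this keeps at most
      # INPUT_CHAR_LIMIT + 1 characters of the scrubbed text.
      scrubbed_text[0..INPUT_CHAR_LIMIT]
    end

    text = "word " * 300_000                 # ~1.5 million characters
    puts truncate_for_indexing(text).length  # => 1000001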
Dan Ungureanu 2021-07-28 18:25:14 +03:00 committed by GitHub
parent b673fee946
commit 823c3f09d4
1 changed file with 5 additions and 1 deletion


@@ -120,7 +120,11 @@ class SearchIndexer
         a_weight: topic_title,
         b_weight: category_name,
         c_weight: topic_tags,
-        d_weight: scrub_html_for_search(cooked)
+        # Length of a tsvector must be less than 1_048_576 bytes.
+        # The difference between the max output limit and the imposed input limit
+        # accounts for the fact that sometimes the output tsvector may be
+        # slightly longer than the input.
+        d_weight: scrub_html_for_search(cooked)[0..1_000_000]
       ) do |params|
         params["private_message"] = private_message
       end