From a9daa5cb903b6adb64ecd5b4268c0c02c5751940 Mon Sep 17 00:00:00 2001
From: Jim Ferenczi
Date: Fri, 5 Oct 2018 15:42:00 +0200
Subject: [PATCH] =?UTF-8?q?[DOCS]=C2=A0Remove=20beta=20label=20from=20norm?=
 =?UTF-8?q?alizers=20(#34326)?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/reference/analysis/normalizers.asciidoc | 2 --
 1 file changed, 2 deletions(-)

diff --git a/docs/reference/analysis/normalizers.asciidoc b/docs/reference/analysis/normalizers.asciidoc
index e4bd710900c..1d4d6e74213 100644
--- a/docs/reference/analysis/normalizers.asciidoc
+++ b/docs/reference/analysis/normalizers.asciidoc
@@ -1,8 +1,6 @@
 [[analysis-normalizers]]
 == Normalizers
 
-beta[]
-
 Normalizers are similar to analyzers except that they may only emit a single
 token. As a consequence, they do not have a tokenizer and only accept a subset
 of the available char filters and token filters. Only the filters that work on