PERF: limits use of redis cache while building emojis list (#19013)

We were doing a Redis GET twice for each emoji while building the custom/standard/all lists, which resulted in ~3710 Redis calls. Given that the emoji DB file is loaded in memory while we build/cache the emoji lists, this was unnecessary and slow.

As a simplification, here is pseudocode for what we were doing:

```ruby
emojis.each do |emoji_name|
  aliases = get_aliases_from_redis_cache(emoji_name)
  is_tonable = get_is_tonable_from_redis_cache(emoji_name)
  build_emoji(emoji_name, aliases, is_tonable)
end
```

The two Redis calls are now replaced by a simple hash access: `@db[emoji_name]`.
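For illustration, here is a minimal sketch of the new shape of the loop, with a made-up in-memory `db` hash standing in for the loaded emoji DB file (the names and structure below are illustrative, not the exact Discourse internals):

```ruby
# Illustrative sketch: the emoji DB is a plain Ruby hash already loaded
# in memory, so each per-emoji lookup becomes a hash access instead of
# two Redis round-trips. The hash contents are a made-up stand-in.
db = {
  "wave"  => { "aliases" => ["hello"], "tonable" => true },
  "smile" => { "aliases" => [],        "tonable" => false },
}

# Build the emoji list with zero Redis calls: one hash access per emoji.
emojis = db.map do |emoji_name, entry|
  { name: emoji_name, aliases: entry["aliases"], tonable: entry["tonable"] }
end
```

The point of the change is that the cost per emoji drops from two network round-trips to one in-process hash lookup, which is what turns ~3710 Redis calls into zero.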
Joffrey JAFFEUX 2022-11-14 13:38:50 +01:00 committed by GitHub
parent c6949a26c5
commit 6493ddce17
1 changed file with 4 additions and 4 deletions


```diff
@@ -29,11 +29,11 @@ class Emoji
   end

   def self.aliases
-    Discourse.cache.fetch(cache_key("aliases_emojis")) { db['aliases'] }
+    db['aliases']
   end

   def self.search_aliases
-    Discourse.cache.fetch(cache_key("search_aliases_emojis")) { db['searchAliases'] }
+    db['searchAliases']
   end

   def self.translations
@@ -45,7 +45,7 @@ class Emoji
   end

   def self.tonable_emojis
-    Discourse.cache.fetch(cache_key("tonable_emojis")) { db['tonableEmojis'] }
+    db['tonableEmojis']
   end

   def self.custom?(name)
@@ -118,7 +118,7 @@ class Emoji
   end

   def self.clear_cache
-    %w{custom standard aliases search_aliases translations all tonable}.each do |key|
+    %w{custom standard translations all}.each do |key|
       Discourse.cache.delete(cache_key("#{key}_emojis"))
     end
     global_emoji_cache.clear
```