I have a full-text index on the text column of a table with about 11 million rows.
Table structure:
CREATE TABLE `review` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `comments` text COLLATE utf8mb4_unicode_ci,
  `title` varchar(1000) COLLATE utf8mb4_unicode_ci DEFAULT NULL,
  PRIMARY KEY (`id`),
  KEY `reviewer_id` (`reviewer_id`),
  FULLTEXT KEY `comments` (`comments`)
) ENGINE=InnoDB AUTO_INCREMENT=273001866 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci ROW_FORMAT=COMPRESSED;
I tried searching like this:
SELECT
    id
FROM
    review
WHERE MATCH (comments) AGAINST ('"This is review is only for Campus tours and not for University itself as no one can write a review on University"' IN BOOLEAN MODE)
This throws the following error:
ERROR 188 (HY000): FTS query exceeds result cache limit
Googling suggests this is a bug that was already fixed in MySQL 5.7, and I am running 5.7.19. Is there any way to work around this? Copy-pasting the full-text variables:
mysql> show global variables like 'innodb_ft%';
+---------------------------------+--------------------+
| Variable_name                   | Value              |
+---------------------------------+--------------------+
| innodb_ft_aux_table             |                    |
| innodb_ft_cache_size            | 8000000            |
| innodb_ft_enable_diag_print     | OFF                |
| innodb_ft_enable_stopword       | ON                 |
| innodb_ft_max_token_size        | 84                 |
| innodb_ft_min_token_size        | 3                  |
| innodb_ft_num_word_optimize     | 2000               |
| innodb_ft_result_cache_limit    | 2000000000         |
| innodb_ft_server_stopword_table | local/my_stopwords |
| innodb_ft_sort_pll_degree       | 2                  |
| innodb_ft_total_cache_size      | 640000000          |
| innodb_ft_user_stopword_table   |                    |
+---------------------------------+--------------------+
12 rows in set (0.00 sec)
Posted on 2019-10-23 13:46:59
I ran into this problem after adding a full-text index to a text column of a large table (10 million rows). Previously I worked around it by restarting the server, but this time I could not restart it because it was busy with some computations.
The problem was fixed by adjusting this setting (twice the default value):
SET GLOBAL innodb_ft_result_cache_limit = 4000000000;
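Note that a value set with SET GLOBAL only lasts until the next server restart; to keep the larger limit permanently it also has to be set under [mysqld] in the option file (my.cnf). A quick sanity check that the running server picked up the new value, using only the variable already shown above:

-- Confirm the new result cache limit is in effect for the running server.
-- (On MySQL 5.7, SET GLOBAL changes are not persisted across restarts.)
SHOW GLOBAL VARIABLES LIKE 'innodb_ft_result_cache_limit';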
https://stackoverflow.com/questions/46172668