I've been trying to put together a fairly complex SQL query (or maybe it's simple?) to collapse a table containing duplicate information. I'm using MySQL 5.7.14 through SequelPro. I'm an SQL novice with a basic understanding of joins, unions, and so on. I suspect this needs a subquery with some GROUP BYs, but I don't know the best way to do it. The table below is a simplified example of what I'm trying to do:
col_1 col_2 col_3 col_4 col_5
a 45 50 1 0
a 50 61 6 0
a 20 45 0 5
b 31 50 0 1
b 12 15 5 0
c 100 200 3 2
c 150 300 1 2
For each repeated entry in col_1, I'd like to collapse rows into a single entry whenever the ranges defined by col_2 and col_3 (the start and end of the range, respectively) overlap. For col_4 and col_5, the maximum value among the rows falling in that range should be reported. In the example above, 'a' has three overlapping ranges in col_1, which I'd like to collapse into one row with the minimum of col_2, the maximum of col_3, and the maxima of col_4 and col_5. For 'b' in col_1 there are two non-overlapping ranges (31-50 and 12-15), so those two rows should be returned as-is. For 'c', a single row should be returned with the range 100-300 and the values 3 and 2 for col_4 and col_5 respectively. The full desired result for this example looks like this:
col_1 col_2 col_3 col_4 col_5
a 20 60 6 5
b 31 50 0 1
b 12 15 5 0
c 100 300 3 2
I should add that there are NULL values in some places, which should be treated as zeroes. Does anyone know the best and simplest way to do this? Thanks in advance!!
UPDATE: I've tried the range-based query that was suggested, but I'm getting an error. The query is:
WITH a AS (SELECT range
, lower(col_2) AS startdate
, max(upper(col_3)) OVER (ORDER BY range) AS `end`
FROM `combine`
)
, b AS (
SELECT *, lag(`end`) OVER (ORDER BY range) < `start` OR NULL AS step
FROM a
)
, c AS (
SELECT *, count(step) OVER (ORDER BY range) AS grp
FROM b
)
SELECT daterange(min(`start`), max(`end`)) AS range
FROM c
GROUP BY grp
ORDER BY 1;
The error I get is: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'a AS (SELECT range, lower(col_2) AS startdate, max(upper(col_3)) OVE' at line 1
Posted on 2018-09-06 05:54:52
This is not trivial, but it can be done in a single query.
The hard part is combining a set of intervals into the largest possible contiguous intervals. The solution is detailed in this post.
Based on your sample values, these merged intervals would be:
col_1 lower_bound upper_bound
a 20 60
b 12 15
b 31 50
c 100 300
To get the result you want, you then need to associate each of these large intervals with the rows of your_table. Each row can match only one such interval, so let's use an INNER JOIN:
SELECT my_table.*, large_intervals.lower_bound, large_intervals.upper_bound
FROM my_table
INNER JOIN (my_awesome_query(your_table)) large_intervals
        ON large_intervals.col1 = my_table.col1
       AND large_intervals.lower_bound <= my_table.col2
       AND large_intervals.upper_bound >= my_table.col3
You'll get:
col1 col2 col3 col4 col5 lower_bound upper_bound
a 45 50 1 0 20 60
a 50 61 6 0 20 60
a 20 45 0 5 20 60
b 31 50 0 1 31 50
b 12 15 5 0 12 15
c 100 200 3 2 100 300
c 150 300 1 2 100 300
Then it's simply a matter of grouping by col1, lower_bound, and upper_bound:
SELECT col1, lower_bound AS col2, upper_bound AS col3,
       MAX(col4) AS col4, MAX(col5) AS col5
FROM (query above) decorated_table
GROUP BY col1, lower_bound, upper_bound
and you get exactly the result you're looking for.
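One wrinkle from the question: some values are NULL and should count as zero. MAX() already skips NULLs, but it returns NULL when every value in a group is NULL, so wrapping the aggregate in COALESCE covers that last case. A minimal sketch of the behavior — using SQLite through Python's sqlite3 rather than MySQL, and a made-up two-column mini-table, purely for illustration:

```python
import sqlite3

# Hypothetical mini-table: group 'b' has only NULL values in col4.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (col1 TEXT, col4 INTEGER)")
con.executemany("INSERT INTO t VALUES (?, ?)",
                [("a", 3), ("a", None), ("b", None), ("b", None)])

# MAX() ignores NULLs within a group, but an all-NULL group yields NULL;
# COALESCE turns that remaining NULL into 0, as the question requires.
rows = con.execute("""
    SELECT col1, COALESCE(MAX(col4), 0) AS col4
    FROM t
    GROUP BY col1
    ORDER BY col1
""").fetchall()
print(rows)  # [('a', 3), ('b', 0)]
```

The same `COALESCE(MAX(col4), 0)` wrapper can be applied to the final SELECT of the query below.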
Back to the hard part: the post mentioned above presents a PostgreSQL solution. MySQL has no range type, but the solution can be adapted. For instance, use the lower bound col2 directly instead of lower(range), and the upper bound col3 instead of upper(range). The solution also uses the window functions lag and lead, which are supported by MySQL with the same syntax, so no problem there. (Note that CTEs and window functions require MySQL 8.0; on 5.7 the WITH clause itself is a syntax error, which is exactly the error you're seeing.) Also note that the original uses COALESCE(upper(range), 'infinity') to guard against unbounded ranges; since your ranges are all finite, you don't need to worry about that and can use the upper bound, i.e. col3, directly. Here is the adaptation:
WITH a AS (
SELECT
col2,
col3,
col2 AS lower_bound,
MAX(col3) OVER (ORDER BY col2, col3) AS upper_bound
FROM combine
)
, b AS (
SELECT *, lag(upper_bound) OVER (ORDER BY col2, col3) < lower_bound OR NULL AS step
FROM a
)
, c AS (
SELECT *, count(step) OVER (ORDER BY col2, col3) AS grp
FROM b
)
SELECT
MIN(lower_bound) AS lower_bound,
MAX(upper_bound) AS upper_bound
FROM c
GROUP BY grp
ORDER BY 1;
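As a quick sanity check, this single-group query runs essentially unchanged on any engine with the same window-function syntax. Here is a sketch using SQLite (3.25+ for window functions) through Python's sqlite3, fed only the two overlapping 'c' ranges from the example:

```python
import sqlite3

con = sqlite3.connect(":memory:")  # needs SQLite >= 3.25 for window functions
con.execute("CREATE TABLE combine (col2 INTEGER, col3 INTEGER)")
# The two overlapping ranges of group 'c' from the example: 100-200 and 150-300.
con.executemany("INSERT INTO combine VALUES (?, ?)", [(100, 200), (150, 300)])

merged = con.execute("""
    WITH a AS (
      SELECT col2, col3,
             col2 AS lower_bound,
             MAX(col3) OVER (ORDER BY col2, col3) AS upper_bound
      FROM combine
    ), b AS (
      -- step is 1 when a gap starts (previous interval ends before this one begins), else NULL
      SELECT *, LAG(upper_bound) OVER (ORDER BY col2, col3) < lower_bound OR NULL AS step
      FROM a
    ), c AS (
      -- running count of non-NULL steps numbers each contiguous island
      SELECT *, COUNT(step) OVER (ORDER BY col2, col3) AS grp
      FROM b
    )
    SELECT MIN(lower_bound) AS lower_bound, MAX(upper_bound) AS upper_bound
    FROM c
    GROUP BY grp
    ORDER BY 1
""").fetchall()
print(merged)  # [(100, 300)]
```

The two overlapping rows collapse into the single interval 100-300, as expected.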
That works for a single group. To get the ranges per col1, adapt it like this:
WITH a AS (
SELECT
col1,
col2,
col3,
col2 AS lower_bound,
MAX(col3) OVER (PARTITION BY col1 ORDER BY col2, col3) AS upper_bound
FROM combine
)
, b AS (
SELECT *, lag(upper_bound) OVER (PARTITION BY col1 ORDER BY col2, col3) < lower_bound OR NULL AS step
FROM a
)
, c AS (
SELECT *, count(step) OVER (PARTITION BY col1 ORDER BY col2, col3) AS grp
FROM b
)
SELECT
MIN(lower_bound) AS lower_bound,
MAX(upper_bound) AS upper_bound
FROM c
GROUP BY col1, grp
ORDER BY 1;
Putting it all together, we get the following (tested against the sample you provided), which returns exactly the output you expect:
WITH a AS (
SELECT
col1,
col2,
col3,
col2 AS lower_bound,
MAX(col3) OVER (PARTITION BY col1 ORDER BY col2, col3) AS upper_bound
FROM combine
)
, b AS (
SELECT *, lag(upper_bound) OVER (PARTITION BY col1 ORDER BY col2, col3) < lower_bound OR NULL AS step
FROM a
)
, c AS (
SELECT *, count(step) OVER (PARTITION BY col1 ORDER BY col2, col3) AS grp
FROM b
)
, large_intervals AS (
SELECT
col1,
MIN(lower_bound) AS lower_bound,
MAX(upper_bound) AS upper_bound
FROM c
GROUP BY col1, grp
)
, combine_with_large_interval AS (
SELECT
combine.*,
large_intervals.lower_bound,
large_intervals.upper_bound
FROM combine
INNER JOIN large_intervals
ON large_intervals.col1 = combine.col1
AND large_intervals.lower_bound <= combine.col2
AND large_intervals.upper_bound >= combine.col3
)
SELECT
col1,
lower_bound AS col2,
upper_bound AS col3,
MAX(col4) AS col4,
MAX(col5) AS col5
FROM combine_with_large_interval
GROUP BY col1, lower_bound, upper_bound
ORDER BY col1, col2, col3;
Voilà!
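For anyone who wants to verify the combined query end to end, here is a sketch that runs it nearly verbatim on SQLite through Python's sqlite3 (SQLite's window-function syntax matches MySQL 8.0 for everything used here), loaded with the 'b' and 'c' rows from the example:

```python
import sqlite3

con = sqlite3.connect(":memory:")  # needs SQLite >= 3.25 for window functions
con.execute("CREATE TABLE combine (col1 TEXT, col2 INT, col3 INT, col4 INT, col5 INT)")
con.executemany("INSERT INTO combine VALUES (?, ?, ?, ?, ?)", [
    ("b", 31, 50, 0, 1), ("b", 12, 15, 5, 0),      # two disjoint ranges
    ("c", 100, 200, 3, 2), ("c", 150, 300, 1, 2),  # two overlapping ranges
])

result = con.execute("""
    WITH a AS (
      SELECT col1, col2, col3, col2 AS lower_bound,
             MAX(col3) OVER (PARTITION BY col1 ORDER BY col2, col3) AS upper_bound
      FROM combine
    ), b AS (
      SELECT *, LAG(upper_bound) OVER (PARTITION BY col1 ORDER BY col2, col3)
                  < lower_bound OR NULL AS step
      FROM a
    ), c AS (
      SELECT *, COUNT(step) OVER (PARTITION BY col1 ORDER BY col2, col3) AS grp
      FROM b
    ), large_intervals AS (
      SELECT col1, MIN(lower_bound) AS lower_bound, MAX(upper_bound) AS upper_bound
      FROM c
      GROUP BY col1, grp
    ), combine_with_large_interval AS (
      SELECT combine.*, li.lower_bound, li.upper_bound
      FROM combine
      JOIN large_intervals li
        ON li.col1 = combine.col1
       AND li.lower_bound <= combine.col2
       AND li.upper_bound >= combine.col3
    )
    SELECT col1, lower_bound AS col2, upper_bound AS col3,
           MAX(col4) AS col4, MAX(col5) AS col5
    FROM combine_with_large_interval
    GROUP BY col1, lower_bound, upper_bound
    ORDER BY col1, col2, col3
""").fetchall()
print(result)  # [('b', 12, 15, 5, 0), ('b', 31, 50, 0, 1), ('c', 100, 300, 3, 2)]
```

The two disjoint 'b' ranges come back unchanged while the overlapping 'c' ranges merge into 100-300 with the per-interval maxima of col4 and col5, matching the expected output above.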
https://stackoverflow.com/questions/52190794