Spark: expression 'xxx' is neither present in the group by
The same SQL statement runs fine in MySQL, but fails when executed on the big-data platform (Spark SQL) with:
org.apache.spark.sql.AnalysisException: expression 'xxx' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first(or first_value)...
The error message itself points to the fix: either add xxx to the GROUP BY clause, or wrap it in first() (or first_value()).
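For illustration, a minimal sketch of the rewrite; the table orders and the columns user_id, user_name, amount are hypothetical:

```sql
-- Fails in Spark SQL: user_name is neither in the GROUP BY nor aggregated
SELECT user_id, user_name, SUM(amount) AS total
FROM orders
GROUP BY user_id;

-- Works: wrap the non-grouped column in first()
SELECT user_id, first(user_name) AS user_name, SUM(amount) AS total
FROM orders
GROUP BY user_id;
```

Adding user_name to the GROUP BY also works, provided it does not split the groups you actually want (e.g. it is functionally dependent on user_id).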
As for the reason: MySQL, with the ONLY_FULL_GROUP_BY SQL mode disabled (the default before 5.7), allows selecting columns that are neither grouped nor aggregated and silently returns an arbitrary value from each group. Spark SQL follows the SQL standard strictly and rejects such queries at analysis time, so every selected expression must either appear in the GROUP BY or be wrapped in an aggregate function such as first().