Differences and pitfalls of insertInto vs. saveAsTable when Spark writes to Hive
insertInto resolves columns by position, not by name: the source DataFrame's column order must match the Hive table's column order exactly, otherwise values silently land in the wrong columns. If the column counts differ, it throws an error.
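A minimal sketch of the positional-matching hazard, assuming a hypothetical Hive table `db.target(id INT, name STRING)` already exists:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("insertInto-demo")
  .enableHiveSupport()
  .getOrCreate()
import spark.implicits._

// Column order here is (name, id) -- reversed relative to the table.
val df = Seq(("alice", 1), ("bob", 2)).toDF("name", "id")

// Hazard: insertInto matches by POSITION, so "alice"/"bob" would be cast
// into the id column -- wrong data, typically with no error raised.
// df.write.insertInto("db.target")

// Safe: explicitly reorder the DataFrame to the table's column order first.
df.select("id", "name").write.insertInto("db.target")
```

Selecting columns in the table's declared order before every insertInto is the usual defensive pattern.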
saveAsTable in append mode throws an error if the column counts don't match the existing table; in overwrite mode it drops and recreates the table, so the original table definition is replaced by the DataFrame's schema.
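The two modes side by side, using the same hypothetical `df` and `db.target` as above:

```scala
// Append: schema is checked against the existing table;
// a column-count mismatch raises an AnalysisException.
df.write.mode("append").saveAsTable("db.target")

// Overwrite: drops db.target and recreates it with df's schema --
// the table's original definition (and its data) is gone.
df.write.mode("overwrite").saveAsTable("db.target")
```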
If the target is an external table whose location already contains data, saveAsTable fails with:
org.apache.spark.sql.AnalysisException: Can not create the managed table
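Two common ways around this error, sketched under the assumption of Spark 2.4.x and the same hypothetical table names as above:

```scala
// 1) Write into the existing external table with insertInto instead of
//    saveAsTable -- insertInto does not try to (re)create the table.
df.select("id", "name").write.mode("overwrite").insertInto("db.ext_table")

// 2) Spark 2.4 legacy flag: permit creating a managed table over a
//    non-empty location (removed in later versions; use with care).
spark.conf.set(
  "spark.sql.legacy.allowCreatingManagedTableUsingNonemptyLocation", "true")
```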