My Hive Metastore version is 2.1.0, but when I start Spark, it updates the version to 1.2.0.
17/06/11 12:04:03 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/root/spark-2.1.1-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/root/spark/jars/datanucleus-core-3.2.10.jar."
17/06/11 12:04:07 ERROR metastore.ObjectStore: Version information found in metastore differs 2.1.0 from expected schema version 1.2.0. Schema verififcation is disabled hive.metastore.schema.verification so setting version.
17/06/11 12:04:09 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
This caused my Hive to stop working. I tried setting ... 2.1.0 in spark-defaults.conf, and then my spark-shell would not work. Please help me with this.
Posted on 2018-01-09 07:07:07
You should be able to disable version verification by updating hive-site.xml:
<property>
  <name>hive.metastore.schema.verification</name>
  <!-- <value>true</value> -->
  <value>false</value>
  <description>
    Enforce metastore schema version consistency.
    True: Verify that version information stored in metastore is compatible with one from Hive jars. Also disable automatic
    schema migration attempt. Users are required to manually migrate schema after Hive upgrade which ensures
    proper metastore schema migration. (Default)
    False: Warn if the version information stored in metastore doesn't match with one from in Hive jars.
  </description>
</property>
Source: https://stackoverflow.com/questions/44470270
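Alternatively, rather than disabling verification, you can tell Spark to talk to the 2.1.0 metastore with a matching client instead of its built-in Hive 1.2 client. A minimal sketch of the relevant spark-defaults.conf entries (the jars path is an assumption; point it at your own Hive 2.1.0 installation's lib directory):

```properties
# Sketch: make Spark use a Hive 2.1.0 metastore client instead of the
# bundled 1.2 client, so it stops rewriting the schema version.
spark.sql.hive.metastore.version  2.1.0
# Path below is an assumption; replace with your Hive 2.1.0 lib directory.
spark.sql.hive.metastore.jars     /opt/hive-2.1.0/lib/*
```

With these set, the version check should pass against the 2.1.0 schema and `hive.metastore.schema.verification` can stay `true`.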