I am using SBT 1.8.0 for my Spark Scala project in the IntelliJ 2017.1.6 IDE. I want to create a parent project and its child project modules. This is what I have in my build.sbt so far:
lazy val parent = Project("spark-etl-parent", file("."))
  .settings(
    name := "spark-etl-parent_1.0",
    scalaVersion := "2.11.1",
    libraryDependencies ++= Seq(
      // sparkVersion is assumed to be defined elsewhere in this build
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
      "org.apache.spark" %% "spark-hive" % sparkVersion % "provided")
  )
lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
lazy val redshiftBasin = Project("spark-etl-redshiftBasin", file("redshiftBasin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-redshiftBasin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
lazy val s3Basin = Project("spark-etl-s3Basin", file("s3Basin"))
  .dependsOn(parent)
  .settings(
    name := "spark-etl-s3Basin_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
Now I can import any class from the spark-streaming or spark-hive library dependencies in the parent module, but not in any of the child modules. I can only use them in a child module if I explicitly declare them as library dependencies of that child module.
Please help me fix this.
Posted on 2018-11-23 13:25:38
My multi-module project uses the parent project only to build everything, and delegates run to the "server" project:
lazy val petstoreRoot = project.in(file("."))
  .aggregate(sharedJvm, sharedJs, server, client)
  .settings(organizationSettings)
  .settings(
    publish := {}
    , publishLocal := {}
    , publishArtifact := false
    , isSnapshot := true
    , run := {
        (run in server in Compile).evaluated
      }
  )
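With this in place, running "run" at the root delegates to the server module's run task, while the root project itself publishes nothing.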
I group the settings (e.g. the dependencies) in another file, e.g.:
lazy val sharedDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
"org.julienrf" %%% "play-json-derived-codecs" % "4.0.0"
...
, "org.scalatest" %%% "scalatest" % scalaTestV % Test
))
Now each sub-module just adds what it needs, e.g.:
lazy val server = (project in file("server"))
.settings(scalaJSProjects := Seq(client))
.settings(sharedSettings(Some("server"))) // shared dependencies used by all
.settings(serverSettings)
.settings(serverDependencies)
.settings(jvmSettings)
.enablePlugins(PlayScala, BuildInfoPlugin)
.dependsOn(sharedJvm)
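The referenced vals (sharedSettings, serverSettings, serverDependencies, jvmSettings) are defined in project/Settings of the repo linked below. As a purely hypothetical sketch of the shape such a grouping takes:

lazy val serverDependencies: Seq[Def.Setting[_]] = Def.settings(libraryDependencies ++= Seq(
  // hypothetical entry; the real list lives in project/Settings of the linked repo
  "com.typesafe.play" %% "play-json" % "2.6.10"
))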
You can find the whole project here: https://github.com/pme123/scala-adapters
See the project/Settings file for the dependencies.
Posted on 2021-05-06 07:58:57
Using dependsOn with provided->provided helped me fix a similar problem.
So, something like:
lazy val etl = Project("spark-etl-etl", file("etl"))
  .dependsOn(parent % "compile->compile;test->test;provided->provided")
  .settings(
    name := "spark-etl-etl_1.0",
    version := "1.0",
    scalaVersion := "2.11.1"
  )
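The configuration mapping means the child's compile configuration depends on the parent's compile, its test on the parent's test, and its provided on the parent's provided, so the parent's Provided-scoped Spark jars also become visible on the child's compile classpath (while staying off its runtime classpath). An alternative, as a sketch of my own rather than part of the answer, is to factor the Spark dependencies into a shared settings value and add it to every module that needs them:

// Sketch: share the Provided Spark dependencies across modules instead of relying on dependsOn
val sparkVersion = "2.3.1" // hypothetical version, adjust to your build

lazy val sparkDependencies: Seq[Def.Setting[_]] = Seq(
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
    "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
  )
)

lazy val etl = Project("spark-etl-etl", file("etl"))
  .settings(sparkDependencies)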
https://stackoverflow.com/questions/53446212