Update Build.sbt to include shaded jars (#27)
* clean up and formatting
* formatting testcases
* upgraded to scala 2.11.8
* update readme
* Clean up .gitignore file, remove tf folder inside test folder
* Rename spark-tf-core to core, and update all references
* Remove core module, add License file and make pom changes
* Renaming namespace, update all files with new namespace
* Fix custom schema, correct pom
* add sbt build files
* Add conversion from mvn to sbt (#15)
* Add classifier to bring in correct shaded jar and class (#16)
* Add travis.yml file
* Refactor travis file
* Update README.md
* Add Travis support to sbt branch (#17)
* Remove central1 dependency in sbt and sudo requirement from travis.yml (#18)
* Cleanup
* SBT working, Cleaned up (#19)
* Clean up for sbt
* Add exclude jars to build.sbt and update readme
* Refactor to use filterNot (#20)
* use filterNot
* Fix sbt-spark-package issue (#22)
* Add sbt-spark-package plugin support
* Update builds.sbt for sbt spark packaging (#23)
* Removed environment variable from path in README example (#14)
* SBT Support (#21)
* Remove github personal token details
* Add TensorFlow hadoop jar to Spark package (#24)
* Update credentials in build.sbt (#25)
* Update build.sbt (#26)
* Remove extra license tag
karthikvadla authored Feb 18, 2017
1 parent e7d123e commit e6173c6
Showing 2 changed files with 58 additions and 17 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -46,7 +46,7 @@ $SPARK_HOME/bin/spark-shell --jars target/spark-tensorflow-connector-1.0-SNAPSHO
 
 SBT Jars
 ```sh
-$SPARK_HOME/bin/spark-shell --jars target/scala-2.11/spark-tensorflow-connector-assembly-1.0-SNAPSHOT.jar
+$SPARK_HOME/bin/spark-shell --jars target/scala-2.11/spark-tensorflow-connector-assembly-1.0.0.jar
```
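The README change simply tracks the build metadata: sbt-assembly's default jar name is `<name>-assembly-<version>.jar`, written under `target/scala-<binary version>/`, so bumping `version` from `1.0-SNAPSHOT` to `1.0.0` changes the path users pass to `--jars`. A minimal sketch of the settings behind the path above; the explicit `assemblyJarName` override at the end is shown only as a hypothetical way to pin the default, not something this project sets:

```scala
// Settings mirrored from the build.sbt diff below. With sbt-assembly's
// default naming they yield:
//   target/scala-2.11/spark-tensorflow-connector-assembly-1.0.0.jar
name := "spark-tensorflow-connector"
version := "1.0.0"
scalaVersion := "2.11.8"

// Hypothetical: pinning the default sbt-assembly name explicitly.
assemblyJarName in assembly := s"${name.value}-assembly-${version.value}.jar"
```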
73 changes: 57 additions & 16 deletions build.sbt
@@ -1,5 +1,17 @@
 name := "spark-tensorflow-connector"
 
+organization := "org.trustedanalytics"
+
+scalaVersion in Global := "2.11.8"
+
+spName := "trustedanalytics/spark-tensorflow-connector"
+
+sparkVersion := "2.1.0"
+
+sparkComponents ++= Seq("sql", "mllib")
+
+version := "1.0.0"
+
 def ProjectName(name: String,path:String): Project = Project(name, file(path))
 
 resolvers in Global ++= Seq("https://tap.jfrog.io/tap/public" at "https://tap.jfrog.io/tap/public" ,
@@ -20,21 +32,6 @@ val `org.scalatest_scalatest_2.11` = "org.scalatest" % "scalatest_2.11" % "2.2.6"
 
 val `org.tensorflow_tensorflow-hadoop` = "org.tensorflow" % "tensorflow-hadoop" % "1.0-01232017-SNAPSHOT"
 
-
-spName := "spark-tensorflow-connector"
-
-sparkVersion := "2.1.0"
-
-sparkComponents ++= Seq("sql", "mllib")
-
-spIgnoreProvided := true
-
-version := "1.0-SNAPSHOT"
-
-name := "spark-tensorflow-connector"
-
-organization := "org.trustedanalytics"
-
 libraryDependencies in Global ++= Seq(`org.tensorflow_tensorflow-hadoop` classifier "shaded-protobuf",
 `org.scalatest_scalatest_2.11` % "test" ,
 `org.apache.spark_spark-sql_2.11` % "provided" ,
@@ -49,4 +46,48 @@ assemblyExcludedJars in assembly := {
 "tensorflow-hadoop-1.0-01232017-SNAPSHOT-shaded-protobuf.jar").contains(x.data.getName)}
 }
 
-licenses := Seq("Apache License 2.0" -> url("http://www.apache.org/licenses/LICENSE-2.0.html"))
+/********************
+ * Release settings *
+ ********************/
+
+spIgnoreProvided := true
+
+spAppendScalaVersion := true
+
+// If you published your package to Maven Central for this release (must be done prior to spPublish)
+spIncludeMaven := false
+
+publishMavenStyle := true
+
+licenses += ("Apache-2.0", url("http://www.apache.org/licenses/LICENSE-2.0"))
+
+pomExtra :=
+<url>https://github.com/tapanalyticstoolkit/spark-tensorflow-connector</url>
+<scm>
+<url>git@github.com:tapanalyticstoolkit/spark-tensorflow-connector.git</url>
+<connection>scm:git:git@github.com:tapanalyticstoolkit/spark-tensorflow-connector.git</connection>
+</scm>
+<developers>
+<developer>
+<id>karthikvadla</id>
+<name>Karthik Vadla</name>
+<url>https://github.com/karthikvadla</url>
+</developer>
+<developer>
+<id>skavulya</id>
+<name>Soila Kavulya</name>
+<url>https://github.com/skavulya</url>
+</developer>
+<developer>
+<id>joyeshmishra</id>
+<name>Joyesh Mishra</name>
+<url>https://github.com/joyeshmishra</url>
+</developer>
+</developers>
+
+credentials += Credentials(Path.userHome / ".ivy2" / ".sbtcredentials") // A file containing credentials
+
+// Add assembly jar to Spark package
+test in assembly := {}
+
+spShade := true
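The `assemblyExcludedJars` block whose tail appears in the hunk above is the heart of the "shaded jars" change: rather than enumerating jars to drop, it uses `filterNot` (per the "use filterNot" commits in #19/#20) to keep only a whitelist in the assembly, so the protobuf-shaded `tensorflow-hadoop` jar ships with the connector while Spark-provided dependencies stay out. A minimal sketch of the pattern, assuming the elided head of the block builds the classpath in the usual sbt-assembly way; the keep-list beyond the shaded jar named in the diff is an assumption:

```scala
// Sketch only: jars returned by assemblyExcludedJars are EXCLUDED from
// the assembly, so filterNot over a keep-list excludes everything else.
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  // Assumed keep-list: only the shaded tensorflow-hadoop jar visible in
  // the diff above survives into the assembly jar.
  cp filterNot { x =>
    Seq("tensorflow-hadoop-1.0-01232017-SNAPSHOT-shaded-protobuf.jar")
      .contains(x.data.getName)
  }
}
```

With `spShade := true`, sbt-spark-package should then publish this shaded assembly rather than the plain jar, which is what the commit title means by including shaded jars in the Spark package.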
