[Release] v0.3.8 (#269)

* Remove unused code (#141)

* Revert "Setting version to 0.3.4-SNAPSHOT"

This reverts commit 2f1d7be.

* README: update to 0.3.3

* README: fix javadoc badge

* remove unused param

* [sbt] version updates

* [sbt] disable build for scala 2.12

* [conf] allow not_analyzed string fields (#145)

* [not-analyzed-fields] do not analyze fields ending with _notanalyzed

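For context on #145: string fields whose names end in the suffix `_notanalyzed` are indexed verbatim instead of being tokenized. A minimal sketch of the convention (assumes a spark-shell-style ambient `spark` session; `LuceneRDD.apply` and `termQuery` are the library's usual entry points):

```
import org.zouzias.spark.lucenerdd.LuceneRDD

// Columns whose names end in "_notanalyzed" are indexed as-is,
// so exact-match term queries see the full, untokenized value.
val df = spark.createDataFrame(Seq(
  ("SKU-1234", "wireless mouse"),
  ("SKU-5678", "mechanical keyboard")
)).toDF("sku_notanalyzed", "description")

val luceneRDD = LuceneRDD(df)
val hits = luceneRDD.termQuery("sku_notanalyzed", "SKU-1234", 10)
```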

* Revert "Revert "Setting version to 0.3.5-SNAPSHOT""

This reverts commit a6da0af.

* [build] update Lucene to 7.7.0

* Hotfix: issue 150 (#151)

* [hotfix] fixes issue 150

* [tests] issue 150

* fix typo

* [blockEntityLinkage] drop queryPartColumns

* [sbt] version updates

* [scripts] fix shell

* Block linkage: allow a block linker with Row to Query (#154)

* [linkage] block linker with Row => Query

* [linkage] block linker is Row => Query

* remove Query analyzer on methods
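
The point of #154 is that the user-supplied block linker now maps a Spark `Row` directly to a Lucene `Query`. A hedged sketch: the `Row => Query` linker type and the `blockEntityLinkage` method name come from the changelog, while the exact parameter list below is an assumption, not the verified API.

```
import org.apache.lucene.index.Term
import org.apache.lucene.search.{Query, TermQuery}
import org.apache.spark.sql.{DataFrame, Row}
import org.zouzias.spark.lucenerdd.LuceneRDD

// Two small DataFrames to link (ambient `spark` session assumed).
val corpus: DataFrame = spark.createDataFrame(
  Seq(("ACME Corp", "Zurich"), ("Foobar Ltd", "Bern"))).toDF("name", "city")
val queries: DataFrame = spark.createDataFrame(
  Seq(("ACME Corp", "Zurich"))).toDF("name", "city")

// The linker turns each query-side Row into a Lucene Query (exact term match here).
val linker: Row => Query = row =>
  new TermQuery(new Term("name", row.getString(row.fieldIndex("name"))))

// Hypothetical call shape: block on equal "city" values, keep top-3 candidates.
val linked = LuceneRDD.blockEntityLinkage(queries, corpus, linker, Array("city"), 3)
```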

* [sbt] set version to 0.3.6-SNAPSHOT

* Feature: allow custom analyzers during compile time (#160)

* [analyzers] custom analyzer

* test return null

* [travis] travis_wait 1 min

* Revert "[travis] travis_wait 1 min"

This reverts commit c79456e.

* use lucene examples

* custom analyzer return null

* fix java reflection

* add docs
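
#160 loads user-supplied Lucene analyzers by class name via Java reflection (hence the "fix java reflection" commit above). The analyzer itself is plain Lucene 8 API; how its class name gets registered in spark-lucenerdd's configuration is not shown here. A sketch:

```
import org.apache.lucene.analysis.{Analyzer, LowerCaseFilter}
import org.apache.lucene.analysis.Analyzer.TokenStreamComponents
import org.apache.lucene.analysis.standard.StandardTokenizer

// A custom analyzer: standard tokenization followed by lowercasing.
// A zero-argument constructor is required if the class is to be
// instantiated reflectively from its name.
class MyLowercaseAnalyzer extends Analyzer {
  override def createComponents(fieldName: String): TokenStreamComponents = {
    val source = new StandardTokenizer()
    new TokenStreamComponents(source, new LowerCaseFilter(source))
  }
}
```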

* Update to Lucene 8 (#161)

* [lucene] upgrade to version 8.0.0

* [lucene] remove ngram analyzer

* delete ngram analyzer

* minor fix

* add scaladoc

* LuceneRDDResponseSpec.collect() should work when no results are found - Issue #166 (#168)

* [sbt] update scalatest 3.0.7

* [sbt] update spark 2.4.1

* [build.sbt] add credentials file

* [plugins] update versions

* [sbt] update to 0.13.18

* Allow Lucene Analyzers per field (#164)

* [issue_163] per field analysis

* [sbt] update scalatest to 3.0.7

* [issue_163] fix docs; order of arguments

* fixes on ShapeLuceneRDD

* [issue_163] fix test

* issue_163: minor fix

* introduce LuceneRDDParams case class

* fix apply in LuceneRDDParams

* [issue_163] remove duplicate apply defn

* add extra LuceneRDD.apply
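
#164 lets index-time and query-time analysis vary per field (the extra `LuceneRDD.apply` overloads above belong to the same change). The sketch below is hypothetical in its parameter names; a map from field name to analyzer name is the shape the changelog describes:

```
import org.zouzias.spark.lucenerdd.LuceneRDD

// Hypothetical parameter names: English analysis for "title",
// whitespace analysis for "tags"; unlisted fields use the default analyzer.
val luceneRDD = LuceneRDD(
  df,
  indexAnalyzerPerField = Map("title" -> "en", "tags" -> "whitespace"),
  queryAnalyzerPerField = Map("title" -> "en")
)
```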

* [issue_165] throw runtime exception; use traversable trait (#170)

[issue_165] throw runtime exception; handle multi-valued fields in DataFrames

* [config] refactor; add environment variables in config (#173)

* [refactor] configuration loading

* [travis] code hygiene
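
#173 routes settings through Typesafe Config so they can be overridden from the environment. The standard HOCON mechanism for this is a `${?ENV_VAR}` substitution in the config file; the key below is an assumed example, not a verified spark-lucenerdd setting:

```
import com.typesafe.config.ConfigFactory

// load() merges reference.conf, application.conf and system properties;
// entries written as `key = ${?SOME_ENV_VAR}` pick up environment overrides.
val config = ConfigFactory.load()
val analyzerName =
  if (config.hasPath("lucenerdd.analyzer.name")) config.getString("lucenerdd.analyzer.name")
  else "standard"
```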

* Make LuceneRDDResponse to extend RDD[Row] (#175)

* WIP

* fix tests

* remove SparkDoc class

* make test compile

* use GenericRowWithSchema

* tests: getDouble score

* score is a float

* fix casting issue with Seq[String]

* tests: LuceneDocToSparkRowpec

* tests: LuceneDocToSparkRowpec

* more tests

* LuceneDocToSparkRowpec: more tests

* LuceneDocToSparkRowpec: fix tests

* LuceneDocToSparkRow: fix Number type inference

* LuceneDocToSparkRowpec: fix tests

* implicits: remove StoredField for Numeric types

* implicits: revert remove StoredField for Numeric types

* fix more tests

* fix more tests

* [tests] fix LuceneRDDResponse .toDF()

* fix multivalued fields

* fix score type issue

* minor

* stored fields for numerics

* hotfix: TextField must be stored using StoredField

* hotfix: stringToDocument implicit

* link issue 179

* fix tests

* remove _.toRow() calls

* fix compile issue

* [sbt] update to spark 2.4.2

* [travis] use spark 2.4.2

* [build] minor updates
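
The upshot of #175: a query response is now an ordinary `RDD[Row]` with a schema (built on `GenericRowWithSchema`), so results flow into DataFrames without the removed `SparkDoc` wrapper. A sketch, reusing `luceneRDD` from the earlier example; the `ScoreField` constant appears in the test diff further down, and the `Float` score type is stated by the "score is a float" commit:

```
import org.zouzias.spark.lucenerdd.models.SparkScoreDoc.ScoreField

val response = luceneRDD.termQuery("description", "mouse", 10)

response.collect().foreach { row =>
  // Per the changelog above, the relevance score is a Float.
  val score = row.getAs[Float](ScoreField)
  println(s"score=$score row=${row.mkString(", ")}")
}

// "[tests] fix LuceneRDDResponse .toDF()" implies direct DataFrame conversion:
val resultsDF = response.toDF()
```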

* Remove sbt-spark-package plugin (#181)

* [sbt] remove sbt-spark-package

* WIP

* [sbt] add spark-mllib

* [sbt] make spark provided

* update to sbt to 1.X.X (#182)

* [wip] update to sbt 1.X.X

* [travis] fix script

* [sbt] update to 1.2.8

* [sbt] update all plugins

* [sbt] spark update v2.4.3 (#183)

* [sbt] spark update v2.4.3

* minor update joda-time

* [sbt] update spark-testing

* [sbt] lucene 8.1.0 update (#184)

* [sbt] lucene update 8.1.1 (#185)

* [scalatest] update to 3.0.8

* [sbt] joda-time patch update

* [release-info] add sonatype credentials

* [sbt] lucene 8.2.0 update (#187)

* [sbt] update plugins

* [sbt] update spark 2.4.4 (#188)

* [sbt] update joda to 2.10.4

* [sbt] update joda / typesafe config (#189)

* [sbt] update Lucene 8.3.0 (#191)

* [sbt] version updates (#194)

* Update Lucene to version 8.3.1
* Update Twitter algebird to version 0.13.6
* Update scalatest/scalactic to version 3.1.0

* [github-actions] add scala.yml (#193)

* [github-actions] add scala.yml

* [sbt] update to version 1.3.3 (#195)

* [plugins] update sbt plugins (#196)

* [lucene] update version 8.4.0 (#197)

* fix version to SNAPSHOT

* code hygiene

* [sbt] update sbt-sonatype to 3.8.1

* [sbt] update sbt-scoverage to 1.6.1

* Patch updates (#222)

* patch updates

* [sbt] sbt-release update to 1.0.13

* Update spark version to 2.4.5 (#226)

* [sbt] update spark to 2.4.5

* [sbt] sbt update to 1.3.8

* [sbt] scalatest 3.1.1 (#231)

* Update spark-testing-base to 2.4.3_0.14.0 (#235)

* Revert "Setting version to 0.3.9-SNAPSHOT"

This reverts commit 82ed3b6.

* Update spark-testing-base to 2.4.3_0.14.0

Co-authored-by: Anastasios Zouzias <[email protected]>

* Update sbt to 1.3.9 (#236)

* Revert "Setting version to 0.3.9-SNAPSHOT"

This reverts commit 82ed3b6.

* Update sbt to 1.3.9

* Update version.sbt

Co-authored-by: Anastasios Zouzias <[email protected]>
Co-authored-by: Anastasios Zouzias <[email protected]>

* Update to Spark 2.4.6 (#246)

* [sbt] update to spark 2.4.6

* fix version

* [sbt] update deps (#247)

* update deps (#248)

* [maintenance] minor updates (#252)

Update:
* jts-core to 1.17.0
* sbt to 1.3.13
* sbt-assembly to 1.15.0

* [sbt] update scalatest 3.x; minor updates

* [sbt] minor updates

* update sbt version

* update tests

* update sbt plugins

* update spark

* [sbt] minor updates (#268)

Changelog:
    Fixes #266
    Fixes #259

* update README

Co-authored-by: Linh Nguyen <[email protected]>
Co-authored-by: Anastasios Zouzias <[email protected]>
Co-authored-by: Yeikel <[email protected]>
Co-authored-by: Scala Steward <[email protected]>
5 people authored Nov 29, 2020
1 parent f94b842 commit 6aa7c16
Showing 27 changed files with 153 additions and 66 deletions.
10 changes: 5 additions & 5 deletions README.md
@@ -48,7 +48,7 @@ You can link against this library (for Spark 1.4+) in your program at the follow
Using SBT:

```
-libraryDependencies += "org.zouzias" %% "spark-lucenerdd" % "0.3.7"
+libraryDependencies += "org.zouzias" %% "spark-lucenerdd" % "0.3.8"
```

Using Maven:
@@ -57,15 +57,15 @@ Using Maven:
<dependency>
<groupId>org.zouzias</groupId>
<artifactId>spark-lucenerdd_2.11</artifactId>
-<version>0.3.7</version>
+<version>0.3.8</version>
</dependency>
```

This library can also be added to Spark jobs launched through `spark-shell` or `spark-submit` by using the `--packages` command line option.
For example, to include it when starting the spark shell:

```
-$ bin/spark-shell --packages org.zouzias:spark-lucenerdd_2.11:0.3.7
+$ bin/spark-shell --packages org.zouzias:spark-lucenerdd_2.11:0.3.8
```

Unlike using `--jars`, using `--packages` ensures that this library and its dependencies will be added to the classpath.
@@ -76,9 +76,9 @@ The project has the following compatibility with Apache Spark:

Artifact | Release Date | Spark compatibility | Notes | Status
------------------------- | --------------- | -------------------------- | ----- | ----
-0.3.8-SNAPSHOT | | >= 2.4.2, JVM 8 | [develop](https://github.com/zouzias/spark-lucenerdd/tree/develop) | Under Development
+0.3.9-SNAPSHOT | | >= 3.x, JVM 8 | [develop](https://github.com/zouzias/spark-lucenerdd/tree/develop) | Under Development
+0.3.8 | 2020-11-30 | >= 2.4.7, JVM 8 | [tag v.0.3.8](https://github.com/zouzias/spark-lucenerdd/tree/v0.3.8) | Released
0.3.7 | 2019-04-26 | >= 2.4.2, JVM 8 | [tag v.0.3.7](https://github.com/zouzias/spark-lucenerdd/tree/v0.3.7) | Released
0.3.6 | 2019-03-11 | >= 2.4.0, JVM 8 | [tag v0.3.6](https://github.com/zouzias/spark-lucenerdd/tree/v0.3.6) | Released
0.2.8 | 2017-05-30 | 2.1.x, JVM 7 | [tag v0.2.8](https://github.com/zouzias/spark-lucenerdd/tree/v0.2.8) | Released
0.1.0 | 2016-09-26 | 1.4.x, 1.5.x, 1.6.x| [tag v0.1.0](https://github.com/zouzias/spark-lucenerdd/tree/v0.1.0) | Cross-released with 2.10/2.11

16 changes: 8 additions & 8 deletions build.sbt
@@ -79,16 +79,16 @@ pomExtra := <scm>

credentials += Credentials(Path.userHome / ".sbt" / ".credentials")

-val luceneV = "8.4.0"
-val sparkVersion = "2.4.4"
+val luceneV = "8.4.1"
+val sparkVersion = "2.4.7"


// scalastyle:off
-val scalactic = "org.scalactic" %% "scalactic" % "3.1.0"
-val scalatest = "org.scalatest" %% "scalatest" % "3.1.0" % "test"
+val scalactic = "org.scalactic" %% "scalactic" % "3.2.3"
+val scalatest = "org.scalatest" %% "scalatest" % "3.2.3" % "test"

-val joda_time = "joda-time" % "joda-time" % "2.10.5"
-val algebird = "com.twitter" %% "algebird-core" % "0.13.6"
+val joda_time = "joda-time" % "joda-time" % "2.10.8"
+val algebird = "com.twitter" %% "algebird-core" % "0.13.7"
val joda_convert = "org.joda" % "joda-convert" % "2.2.1"
val spatial4j = "org.locationtech.spatial4j" % "spatial4j" % "0.7"

@@ -101,7 +101,7 @@ val lucene_expressions = "org.apache.lucene" % "lucene-expre
val lucene_spatial = "org.apache.lucene" % "lucene-spatial" % luceneV
val lucene_spatial_extras = "org.apache.lucene" % "lucene-spatial-extras" % luceneV

-val jts = "org.locationtech.jts" % "jts-core" % "1.16.1"
+val jts = "org.locationtech.jts" % "jts-core" % "1.17.1"
// scalastyle:on


@@ -126,7 +126,7 @@ libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"org.apache.spark" %% "spark-mllib" % sparkVersion % "provided",
"com.holdenkarau" %% "spark-testing-base" % s"2.4.3_0.12.0" % "test" intransitive(),
"com.holdenkarau" %% "spark-testing-base" % s"2.4.5_0.14.0" % "test" intransitive(),
"org.scala-lang" % "scala-library" % scalaVersion.value % "compile"
)

2 changes: 1 addition & 1 deletion project/build.properties
@@ -1 +1 @@
-sbt.version=1.3.6
+sbt.version=1.4.3
14 changes: 7 additions & 7 deletions project/plugins.sbt
@@ -17,20 +17,20 @@

resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"

addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.9.0")
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.10.0")

addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.5.0")
addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.5.1")

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.15.0")

addSbtPlugin("com.github.gseitz" % "sbt-release" % "1.0.12")
addSbtPlugin("com.github.gseitz" % "sbt-release" % "1.0.13")

addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "1.0.0")

addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.5.1")
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.6.1")

addSbtPlugin("org.scoverage" % "sbt-coveralls" % "1.2.7")

addSbtPlugin("org.xerial.sbt" % "sbt-sonatype" % "2.5")
addSbtPlugin("org.xerial.sbt" % "sbt-sonatype" % "3.8.1")

addSbtPlugin("com.jsuereth" % "sbt-pgp" % "1.1.2")
addSbtPlugin("com.jsuereth" % "sbt-pgp" % "1.1.2-1")
2 changes: 1 addition & 1 deletion spark-shell.sh
@@ -6,7 +6,7 @@ CURRENT_DIR=`pwd`
SPARK_LUCENERDD_VERSION=`cat version.sbt | awk '{print $5}' | xargs`

# You should have downloaded this spark version under your ${HOME}
SPARK_VERSION="2.4.4"
SPARK_VERSION="2.4.6"

echo "==============================================="
echo "Loading LuceneRDD with version ${SPARK_LUCENERDD_VERSION}"
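
The recurring change in the test files that follow is the scalatest 3.x migration: `org.scalatest.FlatSpec` and `Matchers` became `org.scalatest.flatspec.AnyFlatSpec` and `org.scalatest.matchers.should.Matchers`. A minimal post-migration skeleton for reference:

```
import org.scalatest.BeforeAndAfterEach
import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

class ExampleSpec extends AnyFlatSpec with Matchers with BeforeAndAfterEach {
  "An example" should "still pass after the migration" in {
    (1 + 1) shouldEqual 2
  }
}
```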
@@ -21,10 +21,14 @@ import org.apache.lucene.index.Term
import org.apache.lucene.search.{Query, TermQuery}
import org.apache.spark.SparkConf
import org.apache.spark.sql.{Row, SparkSession}
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._

import org.zouzias.spark.lucenerdd.testing.Person

-class BlockingDedupSpec extends FlatSpec
+class BlockingDedupSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with SharedSparkContext {
@@ -21,10 +21,13 @@ import org.apache.lucene.index.Term
import org.apache.lucene.search.{Query, TermQuery}
import org.apache.spark.SparkConf
import org.apache.spark.sql.{Row, SparkSession}
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._
import org.zouzias.spark.lucenerdd.testing.Person

-class BlockingLinkageSpec extends FlatSpec
+class BlockingLinkageSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with SharedSparkContext {
@@ -19,13 +19,18 @@ package org.zouzias.spark.lucenerdd
import java.io.{Reader, StringReader}

import org.apache.lucene.document.{Document, DoublePoint, Field, FloatPoint, IntPoint, LongPoint, StoredField, TextField}
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
import org.zouzias.spark.lucenerdd.models.SparkScoreDoc
import org.zouzias.spark.lucenerdd.models.SparkScoreDoc.{DocIdField, ScoreField, ShardField}

+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._


import scala.collection.JavaConverters._

-class LuceneDocToSparkRowpec extends FlatSpec
+class LuceneDocToSparkRowpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach {

@@ -18,9 +18,13 @@ package org.zouzias.spark.lucenerdd

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._

-class LucenePrimitiveTypesSpec extends FlatSpec with Matchers
+class LucenePrimitiveTypesSpec extends AnyFlatSpec with Matchers
with BeforeAndAfterEach
with SharedSparkContext {

@@ -18,10 +18,14 @@ package org.zouzias.spark.lucenerdd

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
import org.zouzias.spark.lucenerdd.testing.Person
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._

-class LuceneRDDCustomCaseClassImplicitsSpec extends FlatSpec

+class LuceneRDDCustomCaseClassImplicitsSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with SharedSparkContext {
@@ -21,10 +21,14 @@ import java.util
import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
import org.zouzias.spark.lucenerdd.testing.{FavoriteCaseClass, MultivalueFavoriteCaseClass}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._

-class LuceneRDDDataFrameImplicitsSpec extends FlatSpec

+class LuceneRDDDataFrameImplicitsSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with SharedSparkContext {
@@ -19,11 +19,15 @@ package org.zouzias.spark.lucenerdd
import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
import scala.collection.JavaConverters._
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._


import scala.io.Source

-class LuceneRDDMoreLikeThisSpec extends FlatSpec
+class LuceneRDDMoreLikeThisSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with SharedSparkContext {
@@ -20,13 +20,17 @@ import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.lucene.index.Term
import org.apache.lucene.search.{FuzzyQuery, PrefixQuery}
import org.apache.spark.sql.{Row, SparkSession}
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._


import scala.io.Source

case class Country(name: String)

-class LuceneRDDRecordLinkageSpec extends FlatSpec
+class LuceneRDDRecordLinkageSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with SharedSparkContext {
@@ -18,10 +18,13 @@ package org.zouzias.spark.lucenerdd

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._
import org.zouzias.spark.lucenerdd.testing.LuceneRDDTestUtils

-class LuceneRDDSearchSpec extends FlatSpec
+class LuceneRDDSearchSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with LuceneRDDTestUtils
@@ -18,9 +18,11 @@ package org.zouzias.spark.lucenerdd

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._

-class LuceneRDDSpec extends FlatSpec
+class LuceneRDDSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with SharedSparkContext {
@@ -18,10 +18,14 @@ package org.zouzias.spark.lucenerdd

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
+import org.scalatest.{BeforeAndAfterEach}
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._

import org.zouzias.spark.lucenerdd.testing.LuceneRDDTestUtils

-class LuceneRDDTermVectorsSpec extends FlatSpec
+class LuceneRDDTermVectorsSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with LuceneRDDTestUtils
@@ -18,9 +18,13 @@ package org.zouzias.spark.lucenerdd

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
-import org.scalatest.{FlatSpec, Matchers}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._

-class LuceneRDDTuplesSpec extends FlatSpec with Matchers with SharedSparkContext {
+class LuceneRDDTuplesSpec extends AnyFlatSpec with Matchers with SharedSparkContext {

val First = "_1"
val Second = "_2"
@@ -19,9 +19,13 @@ package org.zouzias.spark.lucenerdd.analyzers
import org.apache.lucene.analysis.en.EnglishAnalyzer
import org.apache.lucene.analysis.el.GreekAnalyzer
import org.apache.lucene.analysis.de.GermanAnalyzer
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._

-class AnalyzersConfigurableSpec extends FlatSpec with Matchers
+class AnalyzersConfigurableSpec extends AnyFlatSpec with Matchers
with BeforeAndAfterEach
with AnalyzerConfigurable {

@@ -18,10 +18,14 @@ package org.zouzias.spark.lucenerdd.facets

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
import org.zouzias.spark.lucenerdd.{LuceneRDD, LuceneRDDKryoRegistrator}
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._

-class FacetedLuceneRDDFacetSpec extends FlatSpec
+class FacetedLuceneRDDFacetSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with SharedSparkContext {
@@ -19,11 +19,16 @@ package org.zouzias.spark.lucenerdd.facets
import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
import org.zouzias.spark.lucenerdd.testing.FavoriteCaseClass
import org.zouzias.spark.lucenerdd.{LuceneRDD, LuceneRDDKryoRegistrator}

-class FacetedLuceneRDDImplicitsSpec extends FlatSpec
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._


+class FacetedLuceneRDDImplicitsSpec extends AnyFlatSpec
with Matchers
with BeforeAndAfterEach
with SharedSparkContext {
@@ -24,14 +24,18 @@ import org.apache.lucene.facet.FacetField
import org.apache.lucene.facet.taxonomy.directory.DirectoryTaxonomyReader
import org.apache.lucene.index.DirectoryReader
import org.apache.lucene.search.IndexSearcher
-import org.scalatest.{BeforeAndAfterEach, FlatSpec, Matchers}
import org.zouzias.spark.lucenerdd.facets.FacetedLuceneRDD
import org.zouzias.spark.lucenerdd.store.IndexWithTaxonomyWriter
import scala.collection.JavaConverters._
+import org.scalatest.BeforeAndAfterEach
+import org.scalatest.flatspec.AnyFlatSpec
+import org.scalatest._
+import matchers.should._


import scala.io.Source

-class LuceneQueryHelpersSpec extends FlatSpec
+class LuceneQueryHelpersSpec extends AnyFlatSpec
with IndexWithTaxonomyWriter
with Matchers
with BeforeAndAfterEach {