
[CANNOLI-98] Adding container builder. #107

Merged: 2 commits from heuermh:container-builder into bigdatagenomics:master on Mar 22, 2018

Conversation

@heuermh (Member) commented Feb 14, 2018

Fixes #98, #114, #67, #34.
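
For context, a minimal Scala sketch of the shape of the Docker command line the builder assembles, based on the commands printed later in this thread; the helper name dockerCommand is illustrative, not the PR's actual API:

object ContainerCommandSketch {
  // Bind-mount the working directory into the container at the same path,
  // then run the wrapped tool, e.g. "bedtools intersect -a stdin -b b.bed".
  def dockerCommand(image: String, mountDir: String, cmd: Seq[String]): Seq[String] =
    Seq("docker", "run",
        "--mount", s"type=bind,source=$mountDir,target=$mountDir",
        "--rm", image) ++ cmd
}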

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/106/
Test PASSed.

@coveralls commented Feb 15, 2018


Coverage decreased (-1.3%) to 22.127% when pulling 151cf63 on heuermh:container-builder into 0ab8543 on bigdatagenomics:master.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/107/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/108/
Test PASSed.

@heuermh (Member, Author) commented Feb 19, 2018

It's not every day that you break the scalac compiler:

[INFO] Compiling 13 source files to /Users/mheuer2/working/cannoli/cli/target/scala-2.11.12/classes at 1519070046365
[ERROR] /Users/mheuer2/working/cannoli/cli/src/main/scala/org/bdgenomics/cannoli/cli/Bedtools.scala:100: error: class CommandBuilder takes type parameters
[ERROR]     val builder: CommandBuilder = CommandBuilders.create(args.useDocker, args.useSingularity)
[ERROR]                  ^
[ERROR] error:
[INFO]   Something is wrong: cannot find C in applied type org.bdgenomics.cannoli.builder.CommandBuilder
[INFO]    sought  C in CommandBuilder
[INFO]  classSym  CommandBuilder in builder
[INFO]   tparams  C in CommandBuilder
[INFO]
[INFO]      while compiling: /Users/mheuer2/working/cannoli/cli/src/main/scala/org/bdgenomics/cannoli/cli/Bedtools.scala
[INFO]         during phase: typer
[INFO]      library version: version 2.11.12
[INFO]     compiler version: version 2.11.12
[INFO]
[INFO]   last tree to typer: Ident(args)
[INFO]        tree position: line 101 of /Users/mheuer2/working/cannoli/cli/src/main/scala/org/bdgenomics/cannoli/cli/Bedtools.scala
[INFO]               symbol: <none>
[INFO]    symbol definition: <none> (a NoSymbol)
[INFO]       symbol package: <none>
[INFO]        symbol owners:
[INFO]            call site: method run in class Bedtools in package cli
[INFO]
[INFO] == Source file context for tree position ==
[INFO]
[INFO]     98
[INFO]     99     val builder: CommandBuilder = CommandBuilders.create(args.useDocker, args.useSingularity)
[INFO]    100       .setExecutable(args.executable)
[INFO]    101       .addArgument("intersect")
[INFO]    102       .add("-a")
[INFO]    103       .add(optA.getOrElse("stdin"))
[INFO]    104       .add("-b")
[ERROR] error: scala.reflect.internal.FatalError:
[INFO]   Something is wrong: cannot find C in applied type org.bdgenomics.cannoli.builder.CommandBuilder
[INFO]    sought  C in CommandBuilder
[INFO]  classSym  CommandBuilder in builder
[INFO]   tparams  C in CommandBuilder
[INFO]
[INFO]      while compiling: /Users/mheuer2/working/cannoli/cli/src/main/scala/org/bdgenomics/cannoli/cli/Bedtools.scala
[INFO]         during phase: typer
[INFO]      library version: version 2.11.12
[INFO]     compiler version: version 2.11.12
[INFO]
[INFO]   last tree to typer: Ident(args)
[INFO]        tree position: line 101 of /Users/mheuer2/working/cannoli/cli/src/main/scala/org/bdgenomics/cannoli/cli/Bedtools.scala
[INFO]               symbol: <none>
[INFO]    symbol definition: <none> (a NoSymbol)
[INFO]       symbol package: <none>
[INFO]        symbol owners:
[INFO]            call site: method run in class Bedtools in package cli
[INFO]
[INFO] == Source file context for tree position ==
[INFO]
[INFO]     98
[INFO]     99     val builder: CommandBuilder = CommandBuilders.create(args.useDocker, args.useSingularity)
[INFO]    100       .setExecutable(args.executable)
[INFO]    101       .addArgument("intersect")
[INFO]    102       .add("-a")
[INFO]    103       .add(optA.getOrElse("stdin"))
[INFO]    104       .add("-b")
[INFO] 	at scala.reflect.internal.Reporting$class.abort(Reporting.scala:59)
[INFO] 	at scala.reflect.internal.SymbolTable.abort(SymbolTable.scala:16)
[INFO] 	at scala.reflect.internal.tpe.TypeMaps$AsSeenFromMap.correspondingTypeArgument(TypeMaps.scala:565)
[INFO] 	at scala.reflect.internal.tpe.TypeMaps$AsSeenFromMap.loop$3(TypeMaps.scala:601)
[INFO] 	at scala.reflect.internal.tpe.TypeMaps$AsSeenFromMap.classParameterAsSeen(TypeMaps.scala:606)
[INFO] 	at scala.reflect.internal.tpe.TypeMaps$AsSeenFromMap.apply(TypeMaps.scala:483)
[INFO] 	at scala.reflect.internal.tpe.TypeMaps$TypeMap.mapOver(TypeMaps.scala:129)
[INFO] 	at scala.reflect.internal.tpe.TypeMaps$AsSeenFromMap.apply(TypeMaps.scala:484)
[INFO] 	at scala.reflect.internal.Types$Type.asSeenFrom(Types.scala:668)
[INFO] 	at scala.reflect.internal.Types$Type.computeMemberType(Types.scala:703)
[INFO] 	at scala.reflect.internal.Symbols$MethodSymbol.typeAsMemberOf(Symbols.scala:2967)
[INFO] 	at scala.reflect.internal.Types$Type.memberType(Types.scala:694)
[INFO] 	at scala.tools.nsc.typechecker.Infer$Inferencer.checkAccessible(Infer.scala:272)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$makeAccessible(Typers.scala:559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$109.apply(Typers.scala:4784)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$109.apply(Typers.scala:4784)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.silent(Typers.scala:693)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedSelectInternal$1(Typers.scala:4784)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedSelect$1(Typers.scala:4709)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedSelectOrSuperCall$1(Typers.scala:4848)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5371)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.silent(Typers.scala:680)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.normalTypedApply$1(Typers.scala:4558)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedApply$1(Typers.scala:4608)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5370)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5501)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5507)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedSelectOrSuperCall$1(Typers.scala:4839)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5371)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.silent(Typers.scala:680)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.normalTypedApply$1(Typers.scala:4558)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedApply$1(Typers.scala:4608)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5370)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5501)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5507)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedSelectOrSuperCall$1(Typers.scala:4839)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5371)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.silent(Typers.scala:680)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.normalTypedApply$1(Typers.scala:4558)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedApply$1(Typers.scala:4608)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5370)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5501)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5507)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedSelectOrSuperCall$1(Typers.scala:4839)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5371)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.silent(Typers.scala:680)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.normalTypedApply$1(Typers.scala:4558)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedApply$1(Typers.scala:4608)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5370)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5501)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5507)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedSelectOrSuperCall$1(Typers.scala:4839)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5371)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$102.apply(Typers.scala:4559)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.silent(Typers.scala:680)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.normalTypedApply$1(Typers.scala:4558)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedApply$1(Typers.scala:4608)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5370)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.transformedOrTyped(Typers.scala:5634)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedValDefImpl(Typers.scala:1990)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedValDef(Typers.scala:1953)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5333)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5386)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedByValueExpr(Typers.scala:5481)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedStat$1(Typers.scala:3042)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$65.apply(Typers.scala:3150)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$65.apply(Typers.scala:3150)
[INFO] 	at scala.collection.immutable.List.loop$1(List.scala:176)
[INFO] 	at scala.collection.immutable.List.mapConserve(List.scala:200)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedStats(Typers.scala:3150)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedBlock(Typers.scala:2376)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$typedOutsidePatternMode$1$1.apply(Typers.scala:5345)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$typedOutsidePatternMode$1$1.apply(Typers.scala:5345)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:5344)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5380)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.transformedOrTyped(Typers.scala:5634)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedDefDef(Typers.scala:2203)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5335)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5386)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedByValueExpr(Typers.scala:5481)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedStat$1(Typers.scala:3042)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$65.apply(Typers.scala:3150)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$65.apply(Typers.scala:3150)
[INFO] 	at scala.collection.immutable.List.loop$1(List.scala:176)
[INFO] 	at scala.collection.immutable.List.mapConserve(List.scala:200)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedStats(Typers.scala:3150)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedTemplate(Typers.scala:1916)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedClassDef(Typers.scala:1757)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5336)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5386)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedByValueExpr(Typers.scala:5481)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedStat$1(Typers.scala:3042)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$65.apply(Typers.scala:3150)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$65.apply(Typers.scala:3150)
[INFO] 	at scala.collection.immutable.List.loop$1(List.scala:176)
[INFO] 	at scala.collection.immutable.List.mapConserve(List.scala:200)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedStats(Typers.scala:3150)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedPackageDef$1(Typers.scala:5042)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5339)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5386)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
[INFO] 	at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5477)
[INFO] 	at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.apply(Analyzer.scala:102)
[INFO] 	at scala.tools.nsc.Global$GlobalPhase$$anonfun$applyPhase$1.apply$mcV$sp(Global.scala:467)
[INFO] 	at scala.tools.nsc.Global$GlobalPhase.withCurrentUnit(Global.scala:458)
[INFO] 	at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:467)
[INFO] 	at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:94)
[INFO] 	at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:93)
[INFO] 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
[INFO] 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
[INFO] 	at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.run(Analyzer.scala:93)
[INFO] 	at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1528)
[INFO] 	at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1513)
[INFO] 	at scala.tools.nsc.Global$Run.compileSources(Global.scala:1508)
[INFO] 	at scala.tools.nsc.Global$Run.compile(Global.scala:1609)
[INFO] 	at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
[INFO] 	at scala.tools.nsc.MainClass.doCompile(Main.scala:23)
[INFO] 	at scala.tools.nsc.Driver.process(Driver.scala:51)
[INFO] 	at scala.tools.nsc.Driver.main(Driver.scala:64)
[INFO] 	at scala.tools.nsc.Main.main(Main.scala)
[INFO] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[INFO] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[INFO] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[INFO] 	at java.lang.reflect.Method.invoke(Method.java:498)
[INFO] 	at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
[INFO] 	at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
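
The underlying error is annotating the val with the raw, unapplied type while CommandBuilder is generic ("class CommandBuilder takes type parameters"); rather than stopping at that error, the typer then aborts in asSeenFrom. A minimal sketch of the pattern, assuming an F-bounded declaration along these lines (the real signature in org.bdgenomics.cannoli.builder may differ):

// Assumed shape of the builder type; "C" matches the type parameter
// named in the crash report above.
abstract class CommandBuilder[C <: CommandBuilder[C]] {
  def setExecutable(executable: String): C
  def add(argument: String): C
}

// val builder: CommandBuilder    = ...  // raw type: the reported error
// val builder: CommandBuilder[_] = ...  // a wildcard annotation compiles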

@AmplabJenkins

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/109/

Build result: FAILURE

GitHub pull request #107 of commit d7520ab automatically merged.
Notifying endpoint 'HTTP:https://webhooks.gitter.im/e/ac8bb6e9f53357bc8aa8'
[EnvInject] - Loading node environment variables.
Building remotely on amp-jenkins-worker-03 (centos spark-test) in workspace /home/jenkins/workspace/cannoli-prb
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/bigdatagenomics/cannoli.git
 > /home/jenkins/git2/bin/git init /home/jenkins/workspace/cannoli-prb # timeout=10
Fetching upstream changes from https://github.com/bigdatagenomics/cannoli.git
 > /home/jenkins/git2/bin/git --version # timeout=10
 > /home/jenkins/git2/bin/git fetch --tags --progress https://github.com/bigdatagenomics/cannoli.git +refs/heads/*:refs/remotes/origin/* # timeout=15
 > /home/jenkins/git2/bin/git config remote.origin.url https://github.com/bigdatagenomics/cannoli.git # timeout=10
 > /home/jenkins/git2/bin/git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > /home/jenkins/git2/bin/git config remote.origin.url https://github.com/bigdatagenomics/cannoli.git # timeout=10
Fetching upstream changes from https://github.com/bigdatagenomics/cannoli.git
 > /home/jenkins/git2/bin/git fetch --tags --progress https://github.com/bigdatagenomics/cannoli.git +refs/pull/*:refs/remotes/origin/pr/* # timeout=15
 > /home/jenkins/git2/bin/git rev-parse origin/pr/107/merge^{commit} # timeout=10
 > /home/jenkins/git2/bin/git branch -a -v --no-abbrev --contains 75a33fc # timeout=10
Checking out Revision 75a33fc (origin/pr/107/merge)
 > /home/jenkins/git2/bin/git config core.sparsecheckout # timeout=10
 > /home/jenkins/git2/bin/git checkout -f 75a33fcd887fa654a9e9631ca5d91e993f30c3e0
First time build. Skipping changelog.
Triggering cannoli-prb » 2.7.3,2.11,2.2.1,centos
cannoli-prb » 2.7.3,2.11,2.2.1,centos completed with result FAILURE
Notifying endpoint 'HTTP:https://webhooks.gitter.im/e/ac8bb6e9f53357bc8aa8'
Test FAILed.

@heuermh changed the title from "Adding container builder." to "[CANNOLI-98] Adding container builder." on Feb 20, 2018
@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/112/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/113/
Test PASSed.

@heuermh requested a review from @jpdna on February 20, 2018
@heuermh added this to the 0.2.0 milestone on Feb 20, 2018
@heuermh (Member, Author) commented Feb 23, 2018

@fnothaft I'm trying to figure out when paths need to be absolute, when they need to be URIs, and when those URIs need to be canonicalized. The current implementation in Bwa.scala seems to be incomplete, in that filesToAdd/filesToMount are never passed to the pipe method.

For files on the local file system, this looks ok to me, except that the mount directory . needs to be absolute:

$ ./bin/cannoli-submit bedtools -b b.bed a.bed intersection.bed
Command:
[bedtools, intersect, -a, stdin, -b, b.bed]
Files:
[]

$ ./bin/cannoli-submit bedtools -add_files -b b.bed a.bed intersection.bed
Command:
[bedtools, intersect, -a, stdin, -b, $0]
Files:
[b.bed]

$ ./bin/cannoli-submit bedtools -use_docker -b b.bed a.bed intersection.bed
Command:
[docker, run, --mount, type=bind,source=.,target=., --rm, quay.io/biocontainers/bedtools:2.27.1--0, bedtools, intersect, -a, stdin, -b, b.bed]
Files:
[]

$ ./bin/cannoli-submit bedtools -add_files -use_docker -b b.bed a.bed intersection.bed
Command:
[docker, run, --mount, type=bind,source=$root,target=$root, --rm, quay.io/biocontainers/bedtools:2.27.1--0, bedtools, intersect, -a, stdin, -b, $0]
Files:
[b.bed]

$ ./bin/cannoli-submit bedtools -use_singularity -b b.bed a.bed intersection.bed
Command:
[singularity, exec, --bind, ., docker://quay.io/biocontainers/bedtools:2.27.1--0, bedtools, intersect, -a, stdin, -b, b.bed]
Files:
[]

$ ./bin/cannoli-submit bedtools -add_files -use_singularity -b b.bed a.bed intersection.bed
Command:
[singularity, exec, --bind, $root, docker://quay.io/biocontainers/bedtools:2.27.1--0, bedtools, intersect, -a, stdin, -b, $0]
Files:
[b.bed]

With regard to #34 and #50, how do we get an absolute path for mounting to Docker or Singularity? I don't know if you've noticed, but the Hadoop FS API is terrible. ;) What schemes can we support via the pipe(files = ...) SparkFiles mechanism? Which ones can we mount to Docker and Singularity?

@jpdna How do these Singularity commands look?
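
For the relative-mount issue above, a sketch of the normalization that appears to be needed before the directory is handed to docker --mount or singularity --bind (illustrative helper, not the PR's API):

import java.nio.file.Paths

object MountPathSketch {
  // Resolve "." (or any relative directory) against the current working
  // directory so the bind-mount source is absolute.
  def absoluteMountDir(dir: String): String =
    Paths.get(dir).toAbsolutePath.normalize.toString

  // absoluteMountDir(".") would yield e.g. /Users/mheuer2/working/cannoli
}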

@heuermh (Member, Author) commented Feb 24, 2018

After hacking together a way to get the absolute path for mounting to Docker or Singularity, I ran into various issues.

Apparently the path to a file on local disk, accessed after mounting its parent directory into Docker, must also be absolute:

$ ./bin/cannoli-submit bedtools -use_docker -b b.bed a.bed intersection.bed
...
18/02/23 22:34:51 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:5226+871
18/02/23 22:34:51 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:6097+871
18/02/23 22:34:51 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:1742+871
18/02/23 22:34:51 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:0+871
18/02/23 22:34:51 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:871+871
18/02/23 22:34:51 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:4355+871
18/02/23 22:34:51 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:2613+871
18/02/23 22:34:51 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:3484+871
Error: Unable to open file b.bed. Exiting.
Error: Unable to open file b.bed. Exiting.
Error: Unable to open file b.bed. Exiting.
Error: Unable to open file b.bed. Exiting.
18/02/23 22:34:53 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
18/02/23 22:34:53 ERROR Utils: Aborting task
java.lang.RuntimeException: Piped command List(docker, run, --mount, type=bind,source=/Users/mheuer2/working/cannoli,target=/Users/mheuer2/working/cannoli, --rm, quay.io/biocontainers/bedtools:2.27.1--0, bedtools, intersect, -a, stdin, -b, b.bed) exited with error code 1.
	at org.bdgenomics.adam.rdd.OutFormatterRunner.hasNext(OutFormatter.scala:67)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply$mcV$sp(PairRDDFunctions.scala:1137)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1137)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1137)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1371)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1145)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1125)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
...

With an absolute path there are no errors, but the result is an empty file:

$ ./bin/cannoli-submit bedtools -use_docker -b `pwd`/b.bed a.bed intersection.bed
...
18/02/23 22:35:09 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:6097+871
18/02/23 22:35:09 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:5226+871
18/02/23 22:35:09 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:0+871
18/02/23 22:35:09 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:1742+871
18/02/23 22:35:09 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:2613+871
18/02/23 22:35:09 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:871+871
18/02/23 22:35:09 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:4355+871
18/02/23 22:35:09 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:3484+871
18/02/23 22:35:10 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
18/02/23 22:35:11 INFO FileOutputCommitter: Saved output of task 'attempt_20180223223509_0000_m_000001_1' to file:/Users/mheuer2/working/cannoli/intersection.bed/_temporary/0/task_20180223223509_0000_m_000001
18/02/23 22:35:11 INFO SparkHadoopMapRedUtil: attempt_20180223223509_0000_m_000001_1: Committed
18/02/23 22:35:11 INFO ADAMKryoRegistrator: Did not find Spark internal class. This is expected for earlier Spark versions.
18/02/23 22:35:11 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 839 bytes result sent to driver
18/02/23 22:35:11 INFO ADAMKryoRegistrator: Did not find Spark internal class. This is expected for earlier Spark versions.
18/02/23 22:35:11 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1539 ms on localhost (executor driver) (1/8)
...
18/02/23 22:35:11 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
18/02/23 22:35:11 INFO DAGScheduler: ResultStage 0 (saveAsTextFile at TextRddWriter.scala:61) finished in 2.350 s
18/02/23 22:35:11 INFO DAGScheduler: Job 0 finished: saveAsTextFile at TextRddWriter.scala:61, took 2.463743 s
18/02/23 22:35:11 INFO Bedtools: Overall Duration: 5.53 secs

$ cat intersection.bed/part-00000 

Using Docker with -add_files also throws an error; Docker can't bind the temporary Spark directory, perhaps a permissions problem?

$ ./bin/cannoli-submit bedtools -use_docker -add_files -b `pwd`/b.bed a.bed intersection.bed
...
18/02/23 22:35:50 INFO ADAMContext: Loading a.bed as BED and converting to Features.
18/02/23 22:35:51 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 236.5 KB, free 366.1 MB)
18/02/23 22:35:51 INFO ADAMKryoRegistrator: Did not find Spark internal class. This is expected for earlier Spark versions.
18/02/23 22:35:51 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 22.9 KB, free 366.0 MB)
18/02/23 22:35:51 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.148:60276 (size: 22.9 KB, free: 366.3 MB)
18/02/23 22:35:51 INFO SparkContext: Created broadcast 0 from textFile at ADAMContext.scala:2511
18/02/23 22:35:52 INFO SparkContext: Added file /Users/mheuer2/working/cannoli/b.bed at file:/Users/mheuer2/working/cannoli/b.bed with timestamp 1519446952365
18/02/23 22:35:52 INFO Utils: Copying /Users/mheuer2/working/cannoli/b.bed to /private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-c0f0d3b8-9a28-473f-a0c8-36b0aa0e0136/userFiles-90e82d18-ee8a-4135-843d-21c566977ac5/b.bed
...
18/02/23 22:35:52 INFO Executor: Fetching file:/Users/mheuer2/working/cannoli/b.bed with timestamp 1519446952365
18/02/23 22:35:52 INFO Utils: /Users/mheuer2/working/cannoli/b.bed has been previously copied to /private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-c0f0d3b8-9a28-473f-a0c8-36b0aa0e0136/userFiles-90e82d18-ee8a-4135-843d-21c566977ac5/b.bed
...
18/02/23 22:35:52 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:871+871
18/02/23 22:35:52 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:1742+871
18/02/23 22:35:52 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:0+871
18/02/23 22:35:52 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:4355+871
18/02/23 22:35:52 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:2613+871
18/02/23 22:35:52 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:6097+871
18/02/23 22:35:52 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:3484+871
18/02/23 22:35:53 INFO HadoopRDD: Input split: file:/Users/mheuer2/working/cannoli/a.bed:5226+871
docker: Error response from daemon: invalid mount config for type "bind": bind source path does not exist.
See 'docker run --help'.
18/02/23 22:35:53 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
18/02/23 22:35:53 ERROR Utils: Aborting task
java.lang.RuntimeException: Piped command List(docker, run, --mount, type=bind,source=/private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-c0f0d3b8-9a28-473f-a0c8-36b0aa0e0136/userFiles-90e82d18-ee8a-4135-843d-21c566977ac5,target=/private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-c0f0d3b8-9a28-473f-a0c8-36b0aa0e0136/userFiles-90e82d18-ee8a-4135-843d-21c566977ac5, --rm, quay.io/biocontainers/bedtools:2.27.1--0, bedtools, intersect, -a, stdin, -b, /Users/mheuer2/working/cannoli/b.bed) exited with error code 125.

FWIW, things work fine without using Docker:

$ ./bin/cannoli-submit bedtools -b b.bed a.bed intersection.bed
$ cat intersection.bed/part-00000
1	1331345	1331536	106624	13.53	+
1	1331347	1331348	106624	13.53	+

$ ./bin/cannoli-submit bedtools -add_files -b b.bed a.bed intersection.bed
$ cat intersection.bed/part-00000
1	1331345	1331536	106624	13.53	+
1	1331347	1331348	106624	13.53	+

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/114/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/115/
Test PASSed.

@heuermh (Member, Author) commented Feb 26, 2018

After a lot of fiddling around, including this diff to ADAM:

$ git diff .
diff --git a/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicRDD.scala b/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicRDD.scala
index 98e5f242..0ffb127f 100644
--- a/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicRDD.scala
+++ b/adam-core/src/main/scala/org/bdgenomics/adam/rdd/GenomicRDD.scala
@@ -550,12 +550,13 @@ trait GenomicRDD[T, U <: GenomicRDD[T, U]] extends Logging {
     val pipedRdd = partitionedRdd.mapPartitions(iter => {
       if (iter.hasNext) {

-        // get files
-        // from SPARK-3311, SparkFiles doesn't work in local mode.
-        // so... we'll bypass that by checking if we're running in local mode.
-        // sigh!
         val locs = if (isLocal) {
-          files
+          files.map(f => {
+            // SparkFiles.getRootDirectory is set in local mode even if driverTmpDir is not
+            val root = Paths.get(SparkFiles.getRootDirectory()).toAbsolutePath.toString
+            val fileName = new Path(f).getName()
+            Paths.get(root, fileName).toString
+          })
         } else {
           files.map(f => {
             SparkFiles.get(new Path(f).getName())

the absolute paths look ok; perhaps there is still a permissions problem?

Caused by: java.lang.RuntimeException: Piped command List(docker, run, -i, -v, /private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-4ded3038-70e1-434d-b6dd-32398ea8d243/userFiles-acaf8f60-1e54-4d2a-bb3c-bae30bc0090f:/private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-4ded3038-70e1-434d-b6dd-32398ea8d243/userFiles-acaf8f60-1e54-4d2a-bb3c-bae30bc0090f, --rm, quay.io/biocontainers/bedtools:2.27.1--0, bedtools, intersect, -a, stdin, -b, /private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-4ded3038-70e1-434d-b6dd-32398ea8d243/userFiles-acaf8f60-1e54-4d2a-bb3c-bae30bc0090f/b.bed) exited with error code 1.
	at org.bdgenomics.adam.rdd.OutFormatterRunner.hasNext(OutFormatter.scala:67)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply$mcV$sp(PairRDDFunctions.scala:1137)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1137)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1137)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1371)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1145)
	at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1125)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
18/02/26 15:09:00 INFO SparkContext: Invoking stop() from shutdown hook
18/02/26 15:09:00 INFO SparkUI: Stopped Spark web UI at http://10.21.7.167:4040
18/02/26 15:09:00 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/02/26 15:09:00 INFO MemoryStore: MemoryStore cleared
18/02/26 15:09:00 INFO BlockManager: BlockManager stopped
18/02/26 15:09:00 INFO BlockManagerMaster: BlockManagerMaster stopped
18/02/26 15:09:00 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/02/26 15:09:00 INFO SparkContext: Successfully stopped SparkContext
18/02/26 15:09:00 INFO ShutdownHookManager: Shutdown hook called
18/02/26 15:09:00 INFO ShutdownHookManager: Deleting directory /private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-4ded3038-70e1-434d-b6dd-32398ea8d243
Error: Unable to open file /private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-4ded3038-70e1-434d-b6dd-32398ea8d243/userFiles-acaf8f60-1e54-4d2a-bb3c-bae30bc0090f/b.bed. Exiting.
Error: Unable to open file /private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-4ded3038-70e1-434d-b6dd-32398ea8d243/userFiles-acaf8f60-1e54-4d2a-bb3c-bae30bc0090f/b.bed. Exiting.
Error: Unable to open file /private/var/folders/7c/mskbk2mn1nn1l6n80pb0th3mxkl69f/T/spark-4ded3038-70e1-434d-b6dd-32398ea8d243/userFiles-acaf8f60-1e54-4d2a-bb3c-bae30bc0090f/b.bed. Exiting.

Or perhaps, if running in local mode is never going to work, the pipe method should throw an exception when files.nonEmpty.
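
A sketch of that guard, assuming pipe has the files and isLocal values from the diff above in scope (illustrative only):

// Fail fast instead of emitting bind-mount paths that Docker cannot see;
// SPARK-3311 is the SparkFiles limitation noted in the diff above.
if (isLocal && files.nonEmpty) {
  throw new IllegalStateException(
    "pipe(files = ...) is not supported in Spark local mode (SPARK-3311)")
}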

@heuermh (Member, Author) commented Feb 26, 2018

the absolute paths look ok; perhaps there is still a permissions problem?

After uninstalling Docker Toolbox on Mac, installing Docker for Mac, and adding /usr/local/Cellar/... in the File Sharing dialog, this works in both local and standalone mode:

$ ./bin/cannoli-submit bedtools -add_files -use_docker -b `pwd`/b.bed a.bed intersection.bed
$ head intersection.bed/part-00000
1	1331345	1331536	106624	13.53	+
1	1331347	1331348	106624	13.53	+
1	1331352	1331353	106624	13.53	+

$ ./bin/cannoli-submit --master spark://...:7077 -- bedtools -add_files -use_docker -b `pwd`/b.bed a.bed intersection.bed
$ head intersection.bed/part-00000
1	1331345	1331536	106624	13.53	+
1	1331347	1331348	106624	13.53	+
1	1331352	1331353	106624	13.53	+

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/116/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/117/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/118/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/119/
Test PASSed.

@heuermh (Member, Author) commented Mar 1, 2018

Still left to do before rebasing:

  • Test Bwa.getIndexPaths with -use_docker -add_files; it appears to return paths with a scheme (see the sketch below)
  • Search for bowtie/bowtie2 indexes to add to files
  • Reduce code duplication where possible
  • Feedback from @jpdna on Singularity commands & mounts
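
For the first item, a sketch of stripping the scheme from paths like file:/... before they are used as mount sources (hypothetical helper; getIndexPaths itself is not shown in this thread):

import java.net.URI

object SchemeSketch {
  // Drop a URI scheme such as "file:" so only the local path remains;
  // bare paths without a scheme pass through unchanged.
  def stripScheme(path: String): String = {
    val uri = new URI(path)
    if (uri.getScheme == null) path else uri.getPath
  }
}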

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/120/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/121/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/122/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/123/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/124/
Test PASSed.

@jpdna (Member) commented Mar 20, 2018

In testing the bcftools cannoli command, when NOT using Singularity this works:

cannoli-submit bcftools test1.vcf test_bcfnorm_1_no_singularity.norm.vcf -reference human_g1k_v37.fasta

but when turning on the Singularity option it fails with:

../cannoli/bin/cannoli-submit bcftools test1.vcf test_bcfnorm_1_USE_singularity.norm.vcf -reference human_g1k_v37.fasta -use_singularity
Using SPARK_SUBMIT=/usr/local/Spark/2.1.1/bin/spark-submit
2018-03-20 12:49:46 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-03-20 12:49:58 ERROR Executor:91 - Exception in task 0.0 in stage 1.0 (TID 86)
htsjdk.tribble.TribbleException$InvalidHeader: Your input file has a malformed header: We never saw the required CHROM header line (starting with one #) for the input VCF file
	at htsjdk.variant.vcf.VCFCodec.readActualHeader(VCFCodec.java:115)
	at org.bdgenomics.adam.rdd.variant.VCFOutFormatter.read(VCFOutFormatter.scala:83)
	at org.bdgenomics.adam.rdd.OutFormatterRunner.<init>(OutFormatter.scala:32)
	at org.bdgenomics.adam.rdd.GenomicRDD$$anonfun$13.apply(GenomicRDD.scala:598)
	at org.bdgenomics.adam.rdd.GenomicRDD$$anonfun$13.apply(GenomicRDD.scala:560)

@heuermh are you able to run cannoli-submit bcftools with the -use_singularity in your environment successfully?

@heuermh (Member, Author) commented Mar 20, 2018

@heuermh are you able to run cannoli-submit bcftools with the -use_singularity in your environment successfully?

I don't have access to Singularity, so I've generated the command only from documentation. You may want to check if using an absolute path for human_g1k_v37.fasta helps.

Note that I see the same thing when using Docker:

$ ./bin/cannoli-submit bcftools \
  -use_docker \
  -reference hla.fa \
  small.vcf \
  bcftools.vcf

...
2018-03-20 12:41:57 INFO  BcftoolsFn:93 - Piping VariantContextRDD with 1 reference
sequences and 3 samples to bcftools with command: [docker, run, -i, -v,
/Users/heuermh/working/cannoli:/Users/heuermh/working/cannoli, --rm,
quay.io/biocontainers/bcftools:1.6--0, bcftools, norm, --fasta-ref, hla.fa] files: []
...
[E::fai_build3] Failed to open the FASTA file hla.fa
Failed to load the fai index: hla.fa
...
htsjdk.tribble.TribbleException$InvalidHeader: Your input file has a malformed header: We never saw the required CHROM header line (starting with one #) for the input VCF file
	at htsjdk.variant.vcf.VCFCodec.readActualHeader(VCFCodec.java:119)
	at org.bdgenomics.adam.rdd.variant.VCFOutFormatter.read(VCFOutFormatter.scala:83)
	at org.bdgenomics.adam.rdd.OutFormatterRunner.<init>(OutFormatter.scala:32)
	at org.bdgenomics.adam.rdd.GenomicRDD$$anonfun$13.apply(GenomicRDD.scala:596)

This works fine:

$ ./bin/cannoli-submit bcftools \
  -use_docker \
  -reference `pwd`/hla.fa \
  small.vcf \
  bcftools.vcf

...
2018-03-20 12:46:53 INFO  BcftoolsFn:93 - Piping VariantContextRDD with 1 reference
sequences and 3 samples to bcftools with command: [docker, run, -i, -v,
/Users/heuermh/working/cannoli:/Users/heuermh/working/cannoli, --rm,
quay.io/biocontainers/bcftools:1.6--0, bcftools, norm, --fasta-ref,
/Users/heuermh/working/cannoli/hla.fa] files: []
...
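
A sketch of the kind of normalization this suggests: absolutize a bare local -reference path before embedding it in the container command (illustrative helper, not necessarily what the follow-up commit below does):

import java.nio.file.Paths

object ReferencePathSketch {
  // Leave URIs (hdfs://..., file:/...) alone; make bare local paths
  // absolute so the tool finds them under the bind-mounted directory.
  def absolutizeReference(ref: String): String =
    if (ref.contains(":/")) ref
    else Paths.get(ref).toAbsolutePath.normalize.toString
}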

@heuermh (Member, Author) commented Mar 20, 2018

@jpdna Please see if the most recent commit helps, though I only made changes for bcftools.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/125/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/126/
Test PASSed.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/127/
Test PASSed.

@heuermh (Member, Author) commented Mar 22, 2018

@jpdna I'd like to merge this today because the build is failing on git head. Can you file new issues if something is off with the Singularity support?

@heuermh force-pushed the container-builder branch from b8c3c0b to a982855 on March 22, 2018 15:52
@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/cannoli-prb/128/
Test PASSed.

@jpdna (Member) commented Mar 22, 2018

@jpdna I'd like to merge this today because the build is failing on git head. Can you file new issues if something is off with the Singularity support?

Sounds good. I'll run through all the tools again in Singularity and create issues if need be.
