Scala to ObjectStore connection stopped working
Amit Londhe • almost 8 years ago
Hello,
The Scala notebook that worked fine until the day before yesterday has now stopped working. :(
I am now getting the exception below; I hope someone from IBM is monitoring the posts here.
Please note that the Object Store is still accessible from standalone Java code, which leads me to think there is a connectivity issue between the Spark container and the Object Store.
Name: java.lang.NullPointerException
Message: null
StackTrace: org.apache.hadoop.fs.swift.http.SwiftRestClient.authenticate(SwiftRestClient.java:1139)
org.apache.hadoop.fs.swift.http.SwiftRestClient.authIfNeeded(SwiftRestClient.java:1517)
org.apache.hadoop.fs.swift.http.SwiftRestClient.preRemoteCommand(SwiftRestClient.java:1533)
org.apache.hadoop.fs.swift.http.SwiftRestClient.headRequest(SwiftRestClient.java:1077)
org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystemStore.stat(SwiftNativeFileSystemStore.java:258)
org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystemStore.getObjectMetadata(SwiftNativeFileSystemStore.java:213)
org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystemStore.getObjectMetadata(SwiftNativeFileSystemStore.java:182)
org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem.getFileStatus(SwiftNativeFileSystem.java:174)
org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
org.apache.hadoop.fs.Globber.glob(Globber.java:252)
org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:207)
org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
scala.Option.getOrElse(Option.scala:120)
org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
scala.Option.getOrElse(Option.scala:120)
org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
scala.Option.getOrElse(Option.scala:120)
org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1255)
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
org.apache.spark.rdd.RDD.take(RDD.scala:1250)
org.apache.spark.rdd.RDD$$anonfun$first$1.apply(RDD.scala:1290)
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
org.apache.spark.rdd.RDD.first(RDD.scala:1289)
$line83.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:28)
$line83.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
$line83.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
$line83.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
$line83.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
$line83.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
$line83.$read$$iwC$$iwC$$iwC.<init>(<console>:43)
$line83.$read$$iwC$$iwC.<init>(<console>:45)
$line83.$read$$iwC.<init>(<console>:47)
$line83.$read.<init>(<console>:49)
$line83.$read$.<init>(<console>:53)
$line83.$read$.<clinit>()
java.lang.J9VMInternals.initializeImpl(Native Method)
java.lang.J9VMInternals.initialize(J9VMInternals.java:235)
$line83.$eval$.<init>(<console>:7)
$line83.$eval$.<clinit>()
java.lang.J9VMInternals.initializeImpl(Native Method)
java.lang.J9VMInternals.initialize(J9VMInternals.java:235)
$line83.$eval.$print()
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
java.lang.reflect.Method.invoke(Method.java:620)
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:296)
com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:291)
com.ibm.spark.global.StreamState$.withStreams(StreamState.scala:80)
com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:290)
com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:290)
com.ibm.spark.utils.TaskManager$$anonfun$add$2$$anon$1.run(TaskManager.scala:123)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1157)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:627)
java.lang.Thread.run(Thread.java:801)
Regards,
Amit
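For context, here is a minimal sketch of the kind of notebook code that exercises this path, using the fs.swift.service.* properties of the Hadoop swift driver. The service name "spark", the credential values, and the container/object names are placeholders, not the values from the failing notebook:

// Hypothetical configuration; in practice these values come from the
// Object Storage service credentials in Bluemix.
val prefix = "fs.swift.service.spark"   // "spark" is an arbitrary service name
val hconf  = sc.hadoopConfiguration     // sc is the notebook's pre-defined SparkContext
hconf.set(prefix + ".auth.url", "https://identity.open.softlayer.com/")
hconf.set(prefix + ".tenant",   "<projectId>")
hconf.set(prefix + ".username", "<userId>")
hconf.set(prefix + ".password", "<password>")
hconf.set(prefix + ".region",   "dallas")
hconf.setBoolean(prefix + ".public", false)

// Reading any object forces SwiftRestClient.authenticate(), which is where
// the NullPointerException above is thrown, before a single byte is read.
val data = sc.textFile("swift://<container>.spark/<object>.csv")
data.first()   // matches the RDD.first() frame at the bottom of the trace

Nothing in code like this is wrong in itself; the failure happens inside the driver's authentication handshake, which fits Amit's suspicion that the problem sits between the Spark container and the Object Store rather than in the notebook.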
7 comments
Serena Pietruszka Manager • almost 8 years ago
Hi Amit,
Thanks for your question. We've opened a ticket with the Bluemix project office and hope to have feedback soon.
Best,
Serena
Amit Londhe • almost 8 years ago
Thanks.
Amit Londhe • almost 8 years ago
Hello there,
Could you please let me know if you have heard any update on this?
I tried again today and am now getting a different exception:
Name: org.apache.hadoop.fs.swift.exceptions.SwiftInvalidResponseException
Message: Method POST on https://identity.open.softlayer.com/v2.0/tokens failed, status code: 500, status line: HTTP/1.1 500 Internal Server Error
StackTrace: org.apache.hadoop.fs.swift.http.SwiftRestClient.buildException(SwiftRestClient.java:1726)
org.apache.hadoop.fs.swift.http.SwiftRestClient.perform(SwiftRestClient.java:1627)
org.apache.hadoop.fs.swift.http.SwiftRestClient.authenticate(SwiftRestClient.java:1149)
org.apache.hadoop.fs.swift.http.SwiftRestClient.authIfNeeded(SwiftRestClient.java:1517)
org.apache.hadoop.fs.swift.http.SwiftRestClient.preRemoteCommand(SwiftRestClient.java:1533)...
Regards,
Amit
Serena Pietruszka Manager • almost 8 years ago
I'm sorry, Amit. No update yet, but I'm working to escalate this further.
Serena
Amit Londhe • almost 8 years ago
Thank you Serena, appreciate your support.
Amit Londhe • almost 8 years ago
Hello,
Never mind, I got it working now.
I changed the URL from https://identity.open.softlayer.com/ to https://identity.open.softlayer.com/v3, and it now seems to authenticate my credentials.
IBM might have retired the v2 authentication mechanism.
Regards,
Amit
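For anyone hitting the same error: assuming a configuration like the sketch after the original post, Amit's fix amounts to a one-line change pointing the auth URL at the v3 endpoint (the service name "spark" is still a placeholder):

val hconf = sc.hadoopConfiguration
// Was: https://identity.open.softlayer.com/  (Keystone v2, now returning 500)
hconf.set("fs.swift.service.spark.auth.url",
          "https://identity.open.softlayer.com/v3")

// The same read that failed before should now authenticate.
sc.textFile("swift://<container>.spark/<object>.csv").first()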
Serena Pietruszka Manager • almost 8 years ago
Hi Amit,
I'm so glad to hear that! Way to go!
Serena