This is a very strange problem.
I have a simple class that base64-decodes a token and returns the part before the first colon:
import scala.util.{Success, Try}
import org.apache.commons.codec.binary.Base64

class IdDecoder {
  def decode(token: String): Option[String] = {
    if (token.isEmpty)
      None
    else
      // Note: both getBytes and new String(...) use the JVM's default
      // charset here, since none is given explicitly.
      Try(new String(Base64.decodeBase64(token.getBytes)).split(":")(0)) match {
        case Success(id) => Some(id)
        case _           => None
      }
  }
}
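Since no charset is passed to getBytes or to new String(...), both fall back to the JVM's default charset, which can differ between environments. As a quick sanity check (a diagnostic sketch I'd run in each setting, not part of the original code), the default can be printed:

import java.nio.charset.Charset

object CharsetCheck {
  def main(args: Array[String]): Unit = {
    // Both getBytes and new String(bytes) use this when no charset is passed.
    println("default charset: " + Charset.defaultCharset())
    println("file.encoding:   " + sys.props.getOrElse("file.encoding", "<unset>"))
  }
}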
And an object with a method that decodes a sample token:
object StrangeToken {
  def main(args: Array[String]): Unit = {
    decode()
  }

  def decode(): String = {
    val token = "InternalServerError"
    val Some(id) = (new IdDecoder).decode(token)
    println("### StrangeToken's id len:" + id.length)
    id.toCharArray.foreach(c => println(c.toInt))
    id
  }
}
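To take the String conversion out of the picture entirely, a helper like the following (a diagnostic sketch with a hypothetical name) would compare the raw decoded bytes, which involve no charset and so should be identical in every environment:

import org.apache.commons.codec.binary.Base64

object RawBytes {
  def main(args: Array[String]): Unit = {
    // "InternalServerError" is pure ASCII, so getBytes is safe here; the
    // decoded bytes themselves are charset-independent.
    val bytes = Base64.decodeBase64("InternalServerError".getBytes)
    println("byte count: " + bytes.length)
    println(bytes.map(b => b & 0xff).mkString(" "))
  }
}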
Running as plain code, the id's length is 15
When I run it in the sbt console, in IDEA, or in production, the result is:
### StrangeToken's id len:15
34
123
94
65533
118
65533
73
65533
65533
122
65533
43
0
0
0
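For what it's worth, 65533 is '\uFFFD', the Unicode replacement character that String decoding substitutes for byte sequences that are invalid in the charset being used. A minimal illustration:

// 0xC6 alone is a truncated multi-byte sequence in UTF-8,
// so decoding it yields U+FFFD.
val c = new String(Array(0xC6.toByte), java.nio.charset.StandardCharsets.UTF_8).head
println(c.toInt) // prints 65533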
Running as a specs2 test, the id's length is 14
But when I run it under specs2, as:
"id decoder" should {
"get decoded string whose length is 15" in {
val id = StrangeToken.decode()
id.length must be equalTo 15
}
}
This test fails, and the output is:
### StrangeToken's id len:14
34
123
94
198
118
8226
73
205
212
122
177
43
198
228
I'm not sure why the result differs under specs2.
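Until I understand the cause, one thing I'm considering (an sbt 0.13-style sketch; these settings are my assumption, to be adjusted to the actual build) is forking the test JVM and pinning its encoding:

// build.sbt (sketch): javaOptions only apply to forked JVMs,
// so forking the tests lets us fix their default encoding.
fork in Test := true
javaOptions in Test += "-Dfile.encoding=UTF-8"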