fix/minor: Error encoding hexadecimal

std::string is defined as an array of char, and char can be signed, so a
byte such as 0x91 is negative as a char. When that char is widened to
int, the sign is preserved (sign extension), so 0x91 becomes 0xffffff91
and the hexadecimal display is broken:

   [155493246391.747672] [/absolute?what=badarg2] [9]  T (0) t:hexEncode: "ffffff91ffffffecffffffe6334bffffffebffffff87ffffff9affffff824a06ffffffc33b4cffff (14 characters omitted)"

This patch fixes this behavior by casting through unsigned char instead
of using reinterpret_cast:

   [155493251286.221115] [/absolute?what=badarg2] [9]  T (0) t:hexEncode: "91ece6334beb879a824a06c33b4cb4240e4c6f56"
Thierry Fournier 2019-04-10 23:42:08 +02:00 committed by Felipe Zimmerle
parent 033942c925
commit 4a3e9734ef

@@ -41,7 +41,7 @@ std::string HexEncode::evaluate(std::string value,
     std::stringstream result;
     for (std::size_t i=0; i < value.length(); i++) {
-        int ii = reinterpret_cast<char>(value[i]);
+        unsigned int ii = (unsigned char)(value[i]);
         result << std::setw(2) << std::setfill('0') << std::hex << ii;
     }