I’ve been doing tests with the jwt.io example token, and I don’t get why the JWT Debugger verifies this token: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c
and also this one: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5f
with the same secret key (“your-256-bit-secret”). I must be missing something. Can anyone explain it to me?
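For reference, this is a minimal sketch of how I understand the check, using only Python’s standard library (I’m assuming HS256 means HMAC-SHA256 over base64url(header) + "." + base64url(payload), which is what jwt.io appears to do):

    import base64
    import hashlib
    import hmac

    # Signing input = base64url(header) + "." + base64url(payload), i.e. everything before the last dot
    signing_input = (
        "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9"
        ".eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ"
    )

    # HS256 = HMAC-SHA256 with the shared secret
    digest = hmac.new(b"your-256-bit-secret", signing_input.encode("ascii"), hashlib.sha256).digest()

    # JWT signatures are unpadded base64url
    print(base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii"))
    # -> SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c

That computed signature ends in ...w5c, yet the debugger also accepts the token whose signature ends in ...w5f.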
This is just a guess, but the two signatures differ ONLY in their last base64url character, and those two characters differ only in their last two bits:
c = 28 = 011100
f = 31 = 011111
An HMAC-SHA256 signature is 32 bytes = 256 bits, but the 43 base64url characters that encode it carry 43 × 6 = 258 bits, so I would guess those last two bits are not really part of the signature.
If we put 8 (111100) as the last character, the signature is invalid, because that changes bits that do count.
If we put d (011101) or e (011110), the signature is also verified.
So, I’d guess those last two bits are just unused padding bits that the decoder ignores.
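To back up the guess, here’s a quick check with Python’s standard library (b64url_decode is just my own helper, and I’m assuming the debugger’s decoder behaves like base64.urlsafe_b64decode, which silently discards the leftover bits):

    import base64

    sig_c = "SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c"
    sig_f = "SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5f"

    def b64url_decode(s):
        # Restore the '=' padding that JWT strips, then decode
        return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

    # 43 base64url chars carry 258 bits, but HMAC-SHA256 only produces 256,
    # so the last 2 bits of the final character never reach the decoded bytes.
    print(b64url_decode(sig_c) == b64url_decode(sig_f))  # True: identical 32-byte signatures
    print(len(b64url_decode(sig_c)))                      # 32

Since both strings decode to the same 32 signature bytes, any verifier that compares decoded bytes (rather than the encoded strings) will accept both tokens.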