Hi all,
We have a decentralized system, mostly RDF-based: different programs that load RDF (we wrapped them in Docker) and a small community of people providing RDF data under different domains/servers.
It’s like a decentralized Hugging Face, i.e. different pieces of software loading different distributed datasets.
This is how we are thinking of implementing it:
- Central Keycloak: each user gets attributes that determine what they are allowed to access.
- Each user gets a JWT with this additional claim (a full example payload is sketched below the list):
"access": [
{
"artifact": "https://example.org/dataset1",
"until": "2025-04-01"
},
{
"artifact": "https://otherdomain.org/dataset2",
"until": "2024-10-01"
}
],
- JWTs are valid for a couple of months; each “until” date is scoped to an individual server/domain.
- Users hand their JWT to their various programs/Docker containers, which consume different sources/datasets, getting access to different servers via the same token.
- Datasets are published by different providers on different servers/domains; each server uses an nginx module that makes a verification request to the central Keycloak to check whether the token is valid (see the sketch below the list).
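
For illustration, a full decoded token payload under this scheme would look roughly like this (issuer URL, subject, and timestamps are made up; "iss", "sub", "iat", and "exp" are standard JWT claims — "exp" is what caps the overall lifetime at a couple of months, while the per-artifact "until" dates can end individual grants earlier):

{
  "iss": "https://keycloak.example.org/realms/myrealm",
  "sub": "4f1c2d3e-user-id",
  "iat": 1735689600,
  "exp": 1743465600,
  "access": [
    {
      "artifact": "https://example.org/dataset1",
      "until": "2025-04-01"
    },
    {
      "artifact": "https://otherdomain.org/dataset2",
      "until": "2024-10-01"
    }
  ]
}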
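
And here is a minimal sketch (Python/Flask, all names illustrative, not our actual code) of the verification endpoint each data server could expose for nginx's auth_request module. It assumes an nginx setup along the lines of auth_request plus an internal location that proxies to this service, forwards the client's Authorization header, and sets an X-Original-URI header from $request_uri; the realm name and client credentials are placeholders:

from datetime import date
import requests
from flask import Flask, request, abort

app = Flask(__name__)

# Keycloak's standard token introspection endpoint (RFC 7662);
# realm name is a placeholder.
INTROSPECT_URL = ("https://keycloak.example.org/realms/myrealm"
                  "/protocol/openid-connect/token/introspect")
CLIENT_ID = "dataset-server"   # hypothetical confidential client in Keycloak
CLIENT_SECRET = "..."          # its secret

@app.route("/verify")
def verify():
    auth = request.headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        abort(401)
    token = auth[len("Bearer "):]

    # Ask the central Keycloak whether the token is still valid (not
    # expired, not revoked). Keycloak returns the token's claims when active.
    resp = requests.post(INTROSPECT_URL, data={"token": token},
                         auth=(CLIENT_ID, CLIENT_SECRET), timeout=5)
    info = resp.json()
    if not info.get("active"):
        abort(401)

    # Enforce the custom "access" claim locally: this server only honours
    # grants for artifacts it hosts, and only until the per-grant "until" date.
    artifact = request.headers.get("X-Original-URI", "")
    for grant in info.get("access", []):
        if (grant["artifact"] == artifact
                and date.fromisoformat(grant["until"]) >= date.today()):
            return "", 204  # any 2xx tells nginx to let the request through
    abort(403)

if __name__ == "__main__":
    app.run(port=9000)

One design note: since JWTs are signed, the servers could alternatively verify the signature locally against Keycloak's published keys (JWKS) and skip the per-request round trip; the introspection call buys central revocation at the cost of every request depending on Keycloak being reachable.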
My questions: Is this a JWT use case, i.e. was JWT designed to handle this? Is using the same JWT across distributed servers bad practice?