Right now serde supports `no_std`, in that it has an `std` feature and an `alloc` feature, and it supports having neither of them turned on. But if the `std` and `alloc` features are both used, it usually causes build failures. This is because there is an interface change at the following lines:
My use case is that I am working on servers that contain SGX enclaves. The server (untrusted) part can use `std` because it has the OS available, but the enclave (trusted) part cannot, because there is no trusted OS inside the enclave.

In principle, it should be very convenient to use serde to pass messages between the server and the enclave, because it avoids adding more build steps for serialization and ensures a single source of truth for the serialization schema. And there's no reason that serde should not work in the enclave.
(There is the issue of `std::collections::HashMap` not being available outside of `std`, but this is not a problem.)

The real problem is that `std::error::Error` does not exist in `core`, because the devs have declined to move it there.

An alternative solution is to use the `failure` crate instead of the `std::error` module, which provides the same functionality with some improvements, and without requiring `std`.

It would also be great if there were a way to use serde so that the interface does not break between `std` and `alloc` settings. Why use `std::error` at all? Can we just use `Debug + Display` in all cases?

The current situation basically means that anyone who wants to use serde in a cross-platform crate that works in both `std` and `no_std` contexts must expose `std` and `alloc` features mirroring serde's, and if these features get confused at any point, the build fails in strange ways.

It would be a lot simpler to write cross-platform code if code that doesn't need `std` could simply omit `std`, and then be used as a dependency of code that does need `serde/std` (for example, to serialize a hash map), without causing a build failure within serde. This would prevent the serde dependency from infecting the entire project with `std` and `alloc` feature flags.

Is there any interest from the developers in reconciling the differences between these two lines?
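To make the feature-mirroring burden concrete, here is a sketch of what a downstream cross-platform crate's manifest ends up looking like today (the crate name is hypothetical; the `serde/std` and `serde/alloc` feature names are serde's real ones):

```toml
[package]
name = "enclave-messages"  # hypothetical crate

[dependencies]
serde = { version = "1.0", default-features = false, features = ["derive"] }

[features]
default = ["std"]
# Every crate in the dependency graph has to re-export these flags and keep
# them consistent, or the build breaks in confusing ways.
std = ["serde/std"]
alloc = ["serde/alloc"]
```

Every intermediate crate between the application and serde has to repeat this block, which is the "infection" described above.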
I am strongly considering patching serde in my project to always use the `Debug + Display` version even when `std` is on, because I think all my dependencies will still work fine -- the `error::Error` trait is mostly useless.