This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
The current design of DeepNumpy's random module follows native NumPy in its interpretation of the `size` parameter.
More specifically, `size` gives the final output shape of the sampling operation; parameter tensors that are narrower or smaller than `size` are automatically broadcast to the output's shape.
However, this mechanism makes i.i.d. sampling a little tricky, for example:
```python
loc = loc_net(x)
scale = scale_net(x)
N = 10
# Draw N i.i.d. samples from the network-parameterized Gaussian
np.random.normal(loc, scale, (N,) + loc.shape)
```
The problem arises in symbolic mode, where the shapes of `loc` and `scale` cannot be obtained in the frontend.
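In imperative mode the snippet above works, because `loc.shape` is concrete at call time. A minimal stand-alone NumPy sketch of the same broadcast (the `(3, 2)` placeholder shapes standing in for the network outputs are hypothetical):

```python
import numpy as np

# Stand-ins for loc_net(x) and scale_net(x); assume each emits a (3, 2) tensor.
loc = np.zeros((3, 2))
scale = np.ones((3, 2))
N = 10

# size = (N,) + loc.shape prepends a sample axis; loc and scale are
# broadcast against the trailing dimensions of the requested output shape.
samples = np.random.normal(loc, scale, (N,) + loc.shape)
print(samples.shape)  # (10, 3, 2)
```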
Solution
The following `InferShape` function could resolve this issue (modified from: https://github.com/apache/incubator-mxnet/blob/master/src/operator/numpy/random/dist_common.h#L143). Note that the `FCompute` function can stay the same.
The modified sampling method is now able to produce the following result:
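The C++ `InferShape` code itself is not reproduced here; as a rough illustration only, the broadcast-style shape rule it needs can be sketched in Python (the helper name `infer_output_shape` is hypothetical, and `np.broadcast_shapes` requires NumPy >= 1.20):

```python
import numpy as np

def infer_output_shape(param_shapes, size=None):
    # Hypothetical sketch of the inference rule: first broadcast all
    # parameter shapes together using NumPy-style rules.
    bcast = np.broadcast_shapes(*param_shapes) if param_shapes else ()
    if size is None:
        # No `size` given: the output takes the broadcast parameter shape.
        return bcast
    # With `size` given, its trailing dimensions must be broadcast-compatible
    # with the parameter shapes; extra leading dimensions (e.g. the N-sample
    # i.i.d. axis) are allowed.
    return np.broadcast_shapes(tuple(size), bcast)

# The i.i.d. case from the description: (N,) prepended to the param shape.
print(infer_output_shape([(3, 2), (3, 2)], size=(10, 3, 2)))  # (10, 3, 2)
```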