When vectors were broadcast with a repeat of 1, one of the values was accidentally zeroed. This left very strange artifacts in neural networks.
🛠 This has now been fixed. The fix references GoMachine's new implementation.
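For context, broadcasting stretches a smaller tensor across the axes of a larger one, and a repeat of 1 is the degenerate case where an axis needs no actual stretching. The sketch below shows the kind of broadcast graph the bug would have affected. It is purely illustrative: the `BroadcastAdd` pattern arguments are an assumed usage, not code taken from this release.

```go
package main

import (
	"fmt"
	"log"

	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := G.NewGraph()

	// a is a (2, 3) matrix; b is a (1, 3) row vector that gets
	// broadcast across axis 0 of a before the addition.
	a := G.NewMatrix(g, tensor.Float64, G.WithShape(2, 3), G.WithName("a"), G.WithInit(G.RangedFrom(0)))
	b := G.NewMatrix(g, tensor.Float64, G.WithShape(1, 3), G.WithName("b"), G.WithInit(G.Ones()))

	// Broadcast b along axis 0. Before the fix, broadcasts with a
	// repeat of 1 could zero out one of the broadcast values.
	sum, err := G.BroadcastAdd(a, b, nil, []byte{0})
	if err != nil {
		log.Fatal(err)
	}

	m := G.NewTapeMachine(g)
	defer m.Close()
	if err := m.RunAll(); err != nil {
		log.Fatal(err)
	}
	fmt.Println(sum.Value()) // each row of a plus b, with no zeroed entries
}
```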
📄 The `Upsample2D` operator has been added by @cpllbstr. It is similar to the operator in PyTorch: https://pytorch.org/docs/master/generated/torch.nn.Upsample.html
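A minimal usage sketch is below. The `Upsample2D(x, scale)` signature and the nearest-neighbour, scale-the-spatial-dimensions semantics are assumptions modelled on the PyTorch operator it mirrors; consult the package docs for the exact API.

```go
package main

import (
	"fmt"
	"log"

	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := G.NewGraph()

	// A 4D input in NCHW layout, as torch.nn.Upsample expects:
	// (batch=1, channels=1, height=2, width=2).
	x := G.NewTensor(g, tensor.Float64, 4,
		G.WithShape(1, 1, 2, 2), G.WithName("x"), G.WithInit(G.RangedFrom(1)))

	// Assumed signature: a scale of 2 doubles the spatial dimensions.
	up, err := G.Upsample2D(x, 2)
	if err != nil {
		log.Fatal(err)
	}

	m := G.NewTapeMachine(g)
	defer m.Close()
	if err := m.RunAll(); err != nil {
		log.Fatal(err)
	}
	fmt.Println(up.Value()) // expected shape: (1, 1, 4, 4)
}
```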
In the previous version, the `repeatOp` was a compound operation. It effectively had this function signature: `func repeat(a, nTimes *Node, axes ...int)`. So you could do something like `repeat(a, 300, 1, 2, 3)`, in which `a` gets repeated 300 times across axes 1, 2 and 3.
⚡️ This has been deoptimized such that it's effectively `func repeat(a, repeat *Node, axis int)`. The reason for this deoptimization is that, upon further analysis of what the function actually does, it simply calls `tensor.Repeat` many times, causing many new tensors to be allocated. But the whole point of symbolic operations is that we may preallocate ahead of time.
🐎 This deoptimization allows for the use of `tensor.RepeatReuse`, which lets a repeat operation reuse preallocated values, leading to fewer allocations and improved performance.
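To make the allocation argument concrete, here is a sketch at the `tensor` level. `tensor.Repeat` allocates a fresh result on every call, while `tensor.RepeatReuse` writes into a caller-supplied tensor; the exact argument order of `RepeatReuse` shown here is assumed.

```go
package main

import (
	"fmt"
	"log"

	"gorgonia.org/tensor"
)

func main() {
	a := tensor.New(tensor.WithShape(2, 3), tensor.WithBacking([]float64{1, 2, 3, 4, 5, 6}))

	// Repeat allocates a new (4, 3) tensor on every call.
	r, err := tensor.Repeat(a, 0, 2)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(r.Shape()) // (4, 3)

	// RepeatReuse fills a preallocated tensor instead, so a hot loop
	// (e.g. a VM re-running the same graph) avoids fresh allocations.
	reuse := tensor.New(tensor.WithShape(4, 3), tensor.Of(tensor.Float64))
	r2, err := tensor.RepeatReuse(a, reuse, 0, 2)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(r2.Shape()) // (4, 3), backed by reuse's memory
}
```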
🛠 Two bugfixes in this release:
- An off-by-one bug that affected the axes of softmax.
- `TrimSpace` now being used in the iris example.
🛠 Return values for scalar values have been fixed. Previously, when an expression such as `-(x+y)` was given and `x` and `y` were scalar values, the neg op would fail to correctly pass the derivative into its constituents. This was due to a misuse of `UnsafeDo`, which has now been rectified.
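The fixed behaviour can be verified with a small program that builds `-(x+y)` from scalar nodes, asks for the symbolic gradients, and runs the graph; both derivatives should come out as -1. This follows the usual `Grad` plus `NewTapeMachine` pattern, with the concrete values being illustrative only.

```go
package main

import (
	"fmt"
	"log"

	G "gorgonia.org/gorgonia"
)

func main() {
	g := G.NewGraph()

	// z = -(x + y), where x and y are both scalars.
	x := G.NewScalar(g, G.Float64, G.WithName("x"))
	y := G.NewScalar(g, G.Float64, G.WithName("y"))

	sum, err := G.Add(x, y)
	if err != nil {
		log.Fatal(err)
	}
	z, err := G.Neg(sum)
	if err != nil {
		log.Fatal(err)
	}

	// Symbolic differentiation: dz/dx and dz/dy should both be -1.
	grads, err := G.Grad(z, x, y)
	if err != nil {
		log.Fatal(err)
	}

	G.Let(x, 2.0)
	G.Let(y, 3.0)

	m := G.NewTapeMachine(g)
	defer m.Close()
	if err := m.RunAll(); err != nil {
		log.Fatal(err)
	}
	fmt.Println(grads[0].Value(), grads[1].Value()) // -1 -1
}
```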
v0.9.6 (January 04, 2020)