Changelog History
-
v0.9.15 Changes
September 28, 2020
When vectors were broadcast with a repeat of 1, one of the values was accidentally zeroed. This left very strange artifacts in neural networks. This has now been fixed.
-
v0.9.14 Changes
September 10, 2020
With the release of gorgonia.org/[email protected], the tensor package now supports complex numbers as well.
-
v0.9.13 Changes
August 06, 2020
This release references GoMachine's new implementation.
-
v0.9.12 Changes
June 18, 2020
The Upsample2D operator has been added by @cpllbstr. It is similar to the operator in PyTorch: https://pytorch.org/docs/master/generated/torch.nn.Upsample.html
-
v0.9.11 Changes
June 15, 2020
Thanks to the great work by @wzzhu, shape inference is now a bit more robust. It goes back to the original Gorgonia understanding of shapes, in which reductions do not aggressively squeeze the dimensions.
-
v0.9.10 Changes
April 10, 2020
In the previous version, the repeatOp was a compound operation. It effectively had this function signature: func repeat(a, nTimes *Node, axes ...int). So you could do something like repeat(a, 300, 1, 2, 3), in which a gets repeated 300 times across axes 1, 2 and 3.
This has been deoptimized so that it is effectively func repeat(a, repeat *Node, axis int). The reason for this deoptimization is that, upon further analysis of what the function actually does, it simply calls tensor.Repeat many times. This causes many new tensors to be allocated, whereas the whole point of symbolic operations is that we may preallocate ahead of time.
This deoptimization allows the repeatOp to call tensor.RepeatReuse, which lets a repeat operation reuse preallocated values, leading to fewer allocations and improved performance.
-
v0.9.9 Changes
March 25, 2020
Dropout had a long-standing bug that was fixed by @MarkKremer.
-
v0.9.8 Changes
February 10, 2020
Bugfixes in this release:
- An off-by-one bug in which the axes of softmax were affected.
- TrimSpace being used in the iris example.
- The return value of scalar values has been fixed.
-
v0.9.7 Changes
January 19, 2020
Previously, when an expression such as -(x+y) was given and x and y were scalar values, the neg op would fail to correctly pass the derivative into its constituents. This was due to a misuse of UnsafeDo. This has now been rectified.
-
v0.9.6
January 04, 2020