
Changelog History

  • v0.9.15 Changes

    September 28, 2020

    When vectors were broadcast with a repeat of 1, one of the values was accidentally zeroed, leaving very strange artifacts in neural networks.

    🛠 This has now been fixed

  • v0.9.14 Changes

    September 10, 2020

    🚀 With the corresponding release of the `tensor` package, tensors now support complex numbers as well.

  • v0.9.13 Changes

    August 06, 2020

    This references GoMachine's new implementation.

  • v0.9.12 Changes

    June 18, 2020

    📄 The Upsample2D operator has been added by @cpllbstr. It is similar to the operator in PyTorch.

  • v0.9.11 Changes

    June 15, 2020

    Due to the great work by @wzzhu, shape inference is now a bit more robust. It goes back to the original Gorgonia understanding of shapes - where reductions do not aggressively squeeze the dimensions.

  • v0.9.10 Changes

    April 10, 2020

    In the previous version, the repeatOp was a compound operation. It effectively had this function signature: func repeat(a, nTimes *Node, axes ...int), so you could write something like repeat(a, 300, 1, 2, 3), in which a gets repeated 300 times across axes 1, 2 and 3.

    ⚡️ This has been deoptimized so that it is effectively func repeat(a, repeat *Node, axis int). The reason for this deoptimization is that, upon further analysis of what the function actually does, it simply calls tensor.Repeat many times, causing many new tensors to be allocated. But the whole point of symbolic operations is that we may preallocate ahead of time.

    🐎 This deoptimization allows the repeatOp to call tensor.RepeatReuse, which lets a repeat operation reuse preallocated values, leading to fewer allocations and improved performance.

  • v0.9.9 Changes

    March 25, 2020

    ⬇️ Dropout had a long-standing bug that was fixed by @MarkKremer.

  • v0.9.8 Changes

    February 10, 2020

    🛠 Three bugfixes in this release:

    • An off-by-one bug affecting the axes of softmax.
    • TrimSpace is now used in the iris example.
    • 🛠 The return values of scalar values have been fixed.
  • v0.9.7 Changes

    January 19, 2020

    Previously, when an expression such as -(x+y) was given and x and y were scalar values, the neg op would fail to correctly pass the derivative to its constituents. This was due to a misuse of UnsafeDo. It has now been rectified.

  • v0.9.6

    January 04, 2020