deeplearning4j: SameDiff: Cache not invalidated by in-place INDArray modifications, later outputs incorrect

This started today; it was fine yesterday.

The first execution is fine, but subsequent executions return cached values from the original execution instead of recomputing from the modified input.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;
    import org.nd4j.autodiff.samediff.SDVariable;
    import org.nd4j.autodiff.samediff.SameDiff;
    import org.nd4j.linalg.api.ndarray.INDArray;
    import org.nd4j.linalg.factory.Nd4j;

    @Test
    public void testOutMultiple() {

        SameDiff sd = SameDiff.create();

        // Graph: out = in + 10, with in = [1, 2, 3, 4, 5]
        SDVariable var = sd.var("in", Nd4j.linspace(1, 5, 5));
        SDVariable out = var.add(10);

        INDArray result = sd.execAndEndResult();

        // First execution: correct, out = [11, 12, 13, 14, 15]
        assertEquals(Nd4j.linspace(1, 5, 5).addi(10), result);

        // Modify the input array in place (addi = in-place add)
        INDArray origArr = var.getArr();
        origArr.addi(100.0);

        result = sd.execAndEndResult();

        // Second execution should see the modified input: [111, ..., 115]
        assertEquals(Nd4j.linspace(1, 5, 5).addi(110), result); // Fails here
    }
    java.lang.AssertionError:
    Expected :[[  111.0000,  112.0000,  113.0000,  114.0000,  115.0000]]
    Actual   :[[   11.0000,   12.0000,   13.0000,   14.0000,   15.0000]]

The most likely cause is the changes in https://github.com/deeplearning4j/deeplearning4j/pull/5410. Of course, I could just revert that, but it should be possible to execute without duplicating the whole SameDiff instance.
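For context, the failure mode is the classic stale-cache problem: a result cached on the first execution keeps getting returned even though the input array was mutated in place. Below is a minimal, self-contained sketch of one way to invalidate without duplicating anything: stamp each mutable input with a version counter and re-execute whenever the cached version no longer matches. All names and types here (Input, Graph, version) are hypothetical stand-ins to illustrate the idea, not SameDiff internals.

    import java.util.Arrays;

    // Hypothetical illustration of the stale-cache problem and a
    // version-stamp fix. None of these types are SameDiff internals.
    public class StaleCacheSketch {

        // Stand-in for an input array that can be mutated in place.
        static class Input {
            double[] data;
            long version = 0;                 // bumped on every in-place write

            Input(double[] data) { this.data = data; }

            void addi(double x) {             // in-place add, like INDArray.addi
                for (int i = 0; i < data.length; i++) data[i] += x;
                version++;                    // invalidates any cached result
            }
        }

        // Stand-in for a graph that caches its output per execution.
        static class Graph {
            final Input in;
            double[] cached;
            long cachedAtVersion = -1;

            Graph(Input in) { this.in = in; }

            double[] exec() {
                // Re-execute only if the input changed since the cached run.
                if (cached == null || cachedAtVersion != in.version) {
                    cached = Arrays.stream(in.data).map(d -> d + 10).toArray();
                    cachedAtVersion = in.version;
                }
                return cached;
            }
        }

        public static void main(String[] args) {
            Input in = new Input(new double[]{1, 2, 3, 4, 5});
            Graph g = new Graph(in);
            System.out.println(Arrays.toString(g.exec())); // [11.0, ..., 15.0]
            in.addi(100.0);                                // in-place modification
            System.out.println(Arrays.toString(g.exec())); // [111.0, ..., 115.0]
        }
    }

Whatever mechanism ends up being used, the key point is the same: the cache validity check has to capture in-place mutations of the input arrays, not just array identity.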

Aha! Link: https://skymindai.aha.io/features/ND4J-172

About this issue

  • State: closed
  • Created 6 years ago
  • Comments: 15 (15 by maintainers)

Most upvoted comments

On it.