djl: java.nio.ReadOnlyBufferException when calling TensorFlow model.

Question

I am trying to call a TensorFlow model and keep getting a ReadOnlyBufferException. Originally this was an .hdf5 model, which I converted to a .pb file to use with DJL as per the instructions here.

The model input (in Python) is a float64 NumPy array of shape (N, 40, 40, 1). The model loads fine using DJL, and I’ve created a translator that takes a double[][] array of the same shape, but when calling NDArray array = manager.create(specgramFlat, shape); I get a ReadOnlyBufferException.

A minimal reproducible example is below, and you can download the zipped model from https://1drv.ms/u/s!AkNvdu-1_rHOgahqrZwrhu6V8v3TFA?e=0BR4a3.

Am I going about loading this model the right way? Any help on this would be much appreciated. Thanks!

import java.io.File;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.LinkedList;
import java.util.Random;

import ai.djl.Model;
import ai.djl.engine.Engine;
import ai.djl.inference.Predictor;
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDList;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.Shape;
import ai.djl.translate.Batchifier;
import ai.djl.translate.Translator;
import ai.djl.translate.TranslatorContext;


/**
 * A minimal reproducible example of a java.nio.ReadOnlyBufferException when trying to call a TensorFlow classifier.
 * 
 * @author Jamie Macaulay 
 */
public class ReadBufferExceptionTest {


	public static void main(String[] args) {

		String modelPath = "saved_model.pb";

		try {

			//load the TensorFlow model. 
			File file = new File(modelPath); 

			Path modelDir = Paths.get(file.getAbsoluteFile().getParent()); //the directory containing the model file (getAbsoluteFile ensures a relative path also resolves to an absolute directory)

			System.out.println(Engine.getAllEngines()); 

			Model model = Model.newInstance(modelPath, "TensorFlow"); 

			model.load(modelDir, "saved_model.pb");

			System.out.println("Input: " + model.describeInput().values()); 
			System.out.println("Output: " + model.describeOutput().values()); 

			//create the predictor
			Translator<double[][], float[]>  translator = new Translator<double[][], float[]>() {   
				

				@Override
				public NDList processInput(TranslatorContext ctx, double[][] data) {
					//System.out.println("Hello: 1 " ); 
					NDManager manager = ctx.getNDManager();

					Shape shape = new Shape(1L, data.length, data[0].length, 1L); 

					System.out.println("NDArray shape: " + shape); 

					double[] specgramFlat = flattenDoubleArray(data); 

					NDArray array = manager.create(specgramFlat, shape); 
					//		NDArray array = manager.create(data); 

					System.out.println("NDArray size: " + array.size()); 

					return new NDList (array);
				}

				@Override
				public float[]  processOutput(TranslatorContext ctx, NDList list) {
					System.out.println("Hello: 2 " + list); 

					NDArray temp_arr = list.get(0);

					Number[] number = temp_arr.toArray(); 

					float[] results = new float[number.length]; 
					for (int i=0; i<number.length; i++) {
						results[i] = number[i].floatValue(); 
					}

					return results; 
				}

				@Override
				public Batchifier getBatchifier() {
					// The Batchifier describes how to combine a batch together
					// Stacking, the most common batchifier, takes N [X1, X2, ...] arrays to a single [N, X1, X2, ...] array
					return Batchifier.STACK;
				}
			};
			Predictor<double[][], float[]> predictor = model.newPredictor(translator);
			
			
			//make some fake data for input
			double[][] data = makeDummySpectrogramd(40, 40); 

			Shape shape = new Shape(1L, 40, 40, 1L); 

			System.out.println("NDArray shape: " + shape); 
			
			//			NDArray array = manager.create(specgramFlat, shape); 
			model.getNDManager().create(data); 

			float[] output = predictor.predict(data); 

		}
		catch (Exception e) {
			e.printStackTrace();
		}
	}

	
	
	/**
	 * Make a dummy spectrogram for testing. Filled with random values.  
	 * @param len - the length of the spectrogram in bins. 
	 * @param len2 - the height of the spectrogram in bins. 
	 * @return a dummy spectrogram with random values. 
	 */
	public static double[][] makeDummySpectrogramd(int len, int len2){

		//		int len = 256; 
		//		int len2 = 128; 

		double[][] specDummy = new double[len][len2]; 

		Random rand = new Random(); 
		for (int i=0; i<len; i++){
			for (int j=0; j<len2; j++) {
				specDummy[i][j] = 2F*(rand.nextFloat()-0.5F);

				if (specDummy[i][j]>1) {
					specDummy[i][j]=1F;
				}
				if (specDummy[i][j]<0) {
					specDummy[i][j]=0F;
				}
			}
		}
		return specDummy; 
	}
	

	/** 
	 * Convert an arbitrary-dimensional rectangular double array to flat vector.<br>
	 * Can pass double[], double[][], double[][][], etc.
	 */
	public static double[] flattenDoubleArray(Object doubleArray) {
		if (doubleArray instanceof double[])
			return (double[]) doubleArray;

		LinkedList<Object> stack = new LinkedList<>();
		stack.push(doubleArray);

		int[] shape = arrayShape(doubleArray);
		int length = prod(shape);
		double[] flat = new double[length];
		int count = 0;

		while (!stack.isEmpty()) {
			Object current = stack.pop();
			if (current instanceof double[]) {
				double[] arr = (double[]) current;
				for (int i = 0; i < arr.length; i++)
					flat[count++] = arr[i];
			} else if (current instanceof Object[]) {
				Object[] o = (Object[]) current;
				for (int i = o.length - 1; i >= 0; i--)
					stack.push(o[i]);
			} else
				throw new IllegalArgumentException("Base array is not double[]");
		}

		if (count != flat.length)
			throw new IllegalArgumentException("Fewer elements than expected. Array is ragged?");
		return flat;
	}
	
	/** Calculate the shape of an arbitrary multi-dimensional array. Assumes:<br>
	 * (a) array is rectangular (not ragged) and first elements (i.e., array[0][0][0]...) are non-null <br>
	 * (b) First elements have > 0 length. So array[0].length > 0, array[0][0].length > 0, etc.<br>
	 * Can pass any Java array type: double[], Object[][][], float[][], etc.<br>
	 * Length of returned array is number of dimensions; returned[i] is size of ith dimension.
	 */
	public static int[] arrayShape(Object array) {
		int nDimensions = 0;
		Class<?> c = array.getClass().getComponentType();
		while (c != null) {
			nDimensions++;
			c = c.getComponentType();
		}


		int[] shape = new int[nDimensions];
		Object current = array;
		for (int i = 0; i < shape.length - 1; i++) {
			shape[i] = ((Object[]) current).length;
			current = ((Object[]) current)[0];
		}

		if (current instanceof Object[]) {
			shape[shape.length - 1] = ((Object[]) current).length;
		} else if (current instanceof double[]) {
			shape[shape.length - 1] = ((double[]) current).length;
		} else if (current instanceof float[]) {
			shape[shape.length - 1] = ((float[]) current).length;
		} else if (current instanceof long[]) {
			shape[shape.length - 1] = ((long[]) current).length;
		} else if (current instanceof int[]) {
			shape[shape.length - 1] = ((int[]) current).length;
		} else if (current instanceof byte[]) {
			shape[shape.length - 1] = ((byte[]) current).length;
		} else if (current instanceof char[]) {
			shape[shape.length - 1] = ((char[]) current).length;
		} else if (current instanceof boolean[]) {
			shape[shape.length - 1] = ((boolean[]) current).length;
		} else if (current instanceof short[]) {
			shape[shape.length - 1] = ((short[]) current).length;
		} else
			throw new IllegalStateException("Unknown array type"); //Should never happen
		return shape;
	}

	

	/**
	 * Product of an int array
	 * @param mult the elements to multiply together
	 * @return the product of this array
	 */
	public static int prod(int... mult) {
		if (mult.length < 1)
			return 0;
		int ret = 1;
		for (int i = 0; i < mult.length; i++)
			ret *= mult[i];
		return ret;
	}


	
}

About this issue

  • State: closed
  • Created 3 years ago
  • Comments: 19 (10 by maintainers)

Most upvoted comments

I guess JavaCPP is using some reflection that is blocked by the module system. We can try to dig into it more.

DJL is currently compiled targeting Java 8; it has not been modularized for Java 9+. At runtime your JVM treats it as an automatic module and uses the jar file’s name as the module name. That’s why you see api as the module.

We have not decided to move to Java 9+ yet. If you want a stable module name for DJL, you need to rename the jar file to something like: ai.djl.jar
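
For reference, a quick way to check which module name DJL’s classes actually get at runtime (a minimal sketch assuming Java 9+ and DJL 0.9.0; on the classpath the code runs in the unnamed module and getName() returns null):

// Prints the module that DJL's classes were loaded into. On the module path this is the
// automatic module name derived from the jar file name (e.g. "api" for api-0.9.0.jar),
// and it changes if the jar is renamed, e.g. to ai.djl.jar.
System.out.println(ai.djl.ndarray.NDManager.class.getModule().getName());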

Hi @roywei and all. I have found the issue: if you include a module-info file, this produces the ReadOnlyBufferException; if not, the error disappears. I have no idea why, but as far as I can tell “api” is generated from the DJL library and is a seriously unstable name to use in a module-info file.

Thanks for all your help - I would not have thought to try this without everyone’s suggestions and tests.

module jdl4pam {
	exports org.jamdev.jdl4pam.dlpam;
	exports org.jamdev.jdl4pam;
	exports org.jamdev.jdl4pam.pytorch2Java;
	exports org.jamdev.jdl4pam.utils;
	exports org.jamdev.jdl4pam.transforms.jsonfile;
	exports org.jamdev.jdl4pam.SoundSpot;
	exports org.jamdev.jdl4pam.transforms;
	exports org.jamdev.jdl4pam.genericmodel;

	requires api;
	requires java.desktop;
	requires jpamutils;
	requires org.json;
	requires us.hebi.matlab.mat.mfl.core;
}

Thanks @roywei and @lanking520 for bearing with me on this issue.

So the issue seems to be the Java version, and in the versions that do not work (newer than 8) the problem is that the engine remains PyTorch. That seems weird to me because the model loads just fine, but Engine.getInstance().getEngineName() returns PyTorch instead of TensorFlow.

I am using macOS 11.1 (20C69), Java 14.0.2 from AdoptOpenJDK, and running the code in Eclipse 2020-12 (4.18.0). Updated code is here.

I’m guessing that this will be solved if we figure out why the engine remains PyTorch, but I’m unsure why that might be.
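
For what it’s worth, here is a small sketch (assuming the DJL 0.9.0 API) of the distinction between the default engine and the engine a model is bound to: Engine.getInstance() reports the default engine, which here is PyTorch (as the log below shows, presumably because PyTorch takes priority when both engines are on the classpath), while Model.newInstance(name, engineName) binds the model to the named engine regardless of the default.

// Default engine when both PyTorch and TensorFlow are present.
System.out.println("Default engine: " + Engine.getInstance().getEngineName());
// Look up the TensorFlow engine explicitly.
System.out.println("TF engine: " + Engine.getEngine("TensorFlow").getEngineName());
// This model is bound to the TensorFlow engine, whatever the default engine is.
Model model = Model.newInstance("saved_model.pb", "TensorFlow");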

Thanks again for your help with this. It’s much appreciated.

2021-02-05 09:41:06.673130: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN)to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
[PyTorch, TensorFlow]
Engine name before model load: PyTorch
2021-02-05 09:41:06.714469: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /Users/au671271/Desktop/model_lenet_dropout_input_conv_all
2021-02-05 09:41:06.717497: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2021-02-05 09:41:06.717516: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:250] Reading SavedModel debug info (if present) from: /Users/au671271/Desktop/model_lenet_dropout_input_conv_all
2021-02-05 09:41:06.733993: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:215] Restoring SavedModel bundle.
2021-02-05 09:41:06.833548: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:199] Running initialization op on SavedModel bundle at path: /Users/au671271/Desktop/model_lenet_dropout_input_conv_all
2021-02-05 09:41:06.849418: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:319] SavedModel load for tags { serve }; Status: success: OK. Took 134949 microseconds.
Engine name after model load: PyTorch
Input: [(-1, 40, 40, 1)]
Output: [(-1, 2)]
Engine name before prediciton: PyTorch
NDArray shape: (1, 40, 40, 1)
java.nio.ReadOnlyBufferException
	at org.tensorflow.ndarray.impl.buffer.Validator.copyToArgs(Validator.java:67)
	at org.tensorflow.ndarray.impl.buffer.nio.ByteNioDataBuffer.copyTo(ByteNioDataBuffer.java:65)
	at org.tensorflow.Tensor.of(Tensor.java:186)
	at ai.djl.tensorflow.engine.TfNDManager.create(TfNDManager.java:218)
	at ai.djl.tensorflow.engine.TfNDManager.create(TfNDManager.java:46)
	at api@0.9.0/ai.djl.ndarray.NDManager.create(NDManager.java:409)
	at api@0.9.0/ai.djl.ndarray.NDManager.create(NDManager.java:348)
	at jdl4pam/org.jamdev.jdl4pam.genericmodel.ReadBufferExceptionTest.main(ReadBufferExceptionTest.java:114)

Hi @lanking520 and @roywei. This is indeed a Java issue. Java 8 works, and Java 14 and 15 do not. The project I’m working on really requires Java 11+, so it would be great if there were a workaround somehow. Any ideas?

Java 11 works. We haven’t covered Java 14 much in our tests; we will take a look today.