GPflow: Predict call execution time degrades linearly with additional calls
- OS Platform and Distribution: Linux Ubuntu 17.04
- TensorFlow installed from (source or binary): binary
- TensorFlow version: tensorflow-gpu==1.4.0
- GPflow version: gpflow-1.0.0 (from GitHub master source)
- cuDNN installed: 6
- GPU: GeForce GTX 1080 Ti
- Python Version: Python 3.5.3
Issue: Repeated calls to .predict (either .predict_f or .predict_y) take progressively longer when called successively. The first call might take around 10 ms, but later calls can take several seconds each. This is a problem when many predict calls are needed. I did not observe this behaviour in GPflow versions prior to 1.0.0. It occurs for at least the regression models (GPR, SGPR, SVGP), and possibly others. The script below reproduces it:
import numpy as np
import matplotlib.pyplot as plt
import gpflow
import time

# Toy 1D regression data
X = np.linspace(0, 10, 100).reshape(-1, 1)
y = np.sin(X) + np.random.normal(0, 0.25, size=100).reshape(-1, 1)

# Fit a GPR model
k = gpflow.kernels.RBF(input_dim=1)
m = gpflow.models.gpr.GPR(X, y, kern=k)
m.compile()
gpflow.train.ScipyOptimizer().minimize(m)
print('Done optimize.')

# Time 250 successive predict calls
predict_duration_log = []
for i in range(250):
    start = time.time()
    m.predict_y(X)
    end = time.time()
    predict_duration_log.append(end - start)

plt.plot(predict_duration_log)
plt.show()
About this issue
- State: closed
- Created 7 years ago
- Comments: 18 (16 by maintainers)
Commits related to this issue
- [#568] Improve initialization status checking performance. — committed to GPflow/GPflow by awav 7 years ago
- [#568, #561] Test for Dataset iterators is not possible. — committed to GPflow/GPflow by awav 7 years ago
- Address issues #568, #561 (#575) * Add a bunch of test for Parameter and DataHolder. Minibatch seed can be changed after cleaning or in defer_build. * Add dataholder tests. * Add failure creati... — committed to GPflow/GPflow by awav 7 years ago
Here are the speed results after the improvements I made:
W/O checking if variables were initialized (sec):
WITH checking if variables were initialized (sec):
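For context, numbers like these could be reproduced by checking out the two versions and timing the same predict call with a small harness such as the one below; the helper name and the call count are illustrative, not part of GPflow.

# Hypothetical timing helper: call predict_y n times and report the
# mean and minimum wall-clock time per call, in seconds.
def time_predict(model, X, n=100):
    durations = []
    for _ in range(n):
        start = time.time()
        model.predict_y(X)
        durations.append(time.time() - start)
    return np.mean(durations), np.min(durations)

mean_s, min_s = time_predict(m, X)
print('mean: %.4f s, min: %.4f s' % (mean_s, min_s))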
Let’s vote:
@markvdw, it is not a degradation. I'm sorry for the confusion - I edited the image above - my previous experiments were wrong. Here are the new graphs:
So there is only a small overhead of 2-3 ms, and it is constant rather than growing with each call.
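To confirm that the overhead is constant rather than growing, one can fit a straight line to the per-call durations collected by the reproduction script above; a slope near zero means the cost per call does not increase. This is a sketch using only NumPy and the predict_duration_log list from earlier.

# Fit duration ~ slope * iteration + intercept; a slope close to zero
# means the per-call cost is constant, while a clearly positive slope
# would indicate degradation over successive calls.
iters = np.arange(len(predict_duration_log))
slope, intercept = np.polyfit(iters, predict_duration_log, 1)
print('slope: %.3g s/call, intercept: %.3g s' % (slope, intercept))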