dlib: Error when using repeat layer with dimpl::subnet_wrapper

Hi, I wanted to use the repeat layer (which stacks a layer block a given number of times) together with loss_mmod, but I ran into the following issue, which can be reproduced like this:

Just change this line: https://github.com/davisking/dlib/blob/0ffe9c4c40a77506c8b421aca574795343d4024c/examples/dnn_mmod_train_find_cars_ex.cpp#L38

to look like this:

    using net_type = loss_mmod<con<1,9,9,1,1,repeat<3,rcon5,downsampler<input_rgb_image_pyramid<pyramid_down<6>>>>>>;

And the example program does not compile anymore:

/home/adria/Projects/dlib/dlib/../dlib/dnn/core.h:2710:29: error: static assertion failed: Call to layer() attempted to access non-existing layer in neural network.
 2710 |             static_assert(i < T::num_layers, "Call to layer() attempted to access non-existing layer in neural network.");       
      |                           ~~^~~~~~~~~~~~~~~

So, it seems there is a problem when dimpl::subnet_wrapper contains a repeat layer. Is this fixable?

About this issue

  • State: closed
  • Created 3 years ago
  • Comments: 21 (21 by maintainers)

Most upvoted comments

In C++14 it’s easy: you can just return decltype(auto) and it does the right thing here. But in C++11 the thing to do is to add typedefs to the classes that declare the type we want and make input_layer() return that type. Even in C++14 it’s worth doing this, because it’s usually handy to have those types easily on hand anyway.
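
As a rough, self-contained illustration of the two options (the names example_input and example_layer are invented for this sketch and are not dlib types):

    // C++11 vs C++14 ways of writing the return type of input_layer().
    struct example_input
    {
        typedef example_input input_layer_type;
        example_input& input_layer() { return *this; }
    };

    template <typename SUBNET>
    struct example_layer
    {
        SUBNET subnet;

        // C++11: declare the type we want with a typedef and return exactly that type.
        typedef typename SUBNET::input_layer_type input_layer_type;
        input_layer_type& input_layer() { return subnet.input_layer(); }

        // C++14 alternative: decltype(auto) would deduce the same reference type, e.g.
        //   decltype(auto) input_layer() { return subnet.input_layer(); }
    };

    int main()
    {
        example_layer<example_layer<example_input>> net;
        example_input& in = net.input_layer();  // the typedef also makes this type easy to name
        (void)in;
        return 0;
    }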

I just pushed https://github.com/davisking/dlib/commit/1de47514bdf104d5b5a7deea0a436fce559f02e3 which adds .input_layer() to all networks and makes it so repeat works here as well.

You know what, just add a .input_layer() method to add_layer. Make it call into the sublayer it contains. That should be easy and not involve any template magic or anything of that kind. subnet_wrapper needs a .input_layer() too, but it should just be a simple non-template method as well. I would add it right now, but I have to go take my kid to school or we are going to be late :\
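
For what it’s worth, here is a minimal sketch of that delegation idea. The class names (fake_input, fake_add_layer, fake_subnet_wrapper) are stand-ins for dlib’s input layers, add_layer, and dimpl::subnet_wrapper, so the real signatures will differ:

    struct fake_input
    {
        fake_input& input_layer() { return *this; }  // the input layer is itself
    };

    template <typename SUBNET>
    struct fake_add_layer
    {
        SUBNET subnet;

        // Forward the call to the sublayer this layer contains; the recursion
        // bottoms out at the input layer.  (Uses C++14 decltype(auto).)
        decltype(auto) input_layer() { return subnet.input_layer(); }
    };

    template <typename NET>
    struct fake_subnet_wrapper
    {
        NET& net;

        // A simple non-template member that just forwards to the wrapped network.
        decltype(auto) input_layer() { return net.input_layer(); }
    };

    int main()
    {
        fake_add_layer<fake_add_layer<fake_input>> net;
        fake_subnet_wrapper<decltype(net)> wrapped{net};
        fake_input& in = wrapped.input_layer();  // also works through the wrapper
        (void)in;
        return 0;
    }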

Yeah, I’m honestly not sure either. I spent a bit of time the other day trying to get it to work but couldn’t get it going. I’ll look again a bit this weekend.

So, I tried again to figure out how to work around this issue, and it seems that the subnet_wrapper case should be handled here, but it’s not: https://github.com/davisking/dlib/blob/8d4df7c0b3fa7c4c1e4175951161b01ccf4541b5/dlib/dnn/core.h#L2857-L2868

I don’t understand that part of the library very well, so it’s hard to fix it, but I will keep trying 😃

Yeah. layer<idx>(net) doesn’t work when net is a subnet_wrapper. impl::layer_helper needs some appropriate overload to catch that case. Should be fixable. Or just overload input_layer() for this specific type. I just spent some time looking and I’m not sure what the most straightforward fix is, though. I also got intercepted by my kids and can’t work on it further right now 😐
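
In case it helps, here is a toy, self-contained mock of that first idea: a recursive index-based helper plus a partial specialization that unwraps the wrapper type before recursing. The names (toy_input, toy_layer, toy_wrapper, toy_layer_helper) are invented and dlib’s real impl::layer_helper is more involved than this, so treat it as a sketch only:

    #include <cstddef>

    struct toy_input {};

    template <typename SUBNET>
    struct toy_layer { SUBNET subnet; };

    template <typename NET>
    struct toy_wrapper { NET& net; };  // plays the role of dimpl::subnet_wrapper

    // Walks i subnets down, in the spirit of layer<i>(net).
    template <std::size_t i, typename NET>
    struct toy_layer_helper
    {
        using next = toy_layer_helper<i - 1, decltype(NET::subnet)>;
        static auto& get(NET& n) { return next::get(n.subnet); }
    };

    template <typename NET>
    struct toy_layer_helper<0, NET>
    {
        static NET& get(NET& n) { return n; }
    };

    // The missing piece: a partial specialization that unwraps the wrapper
    // first, so indexing also works when handed a wrapped network.
    template <std::size_t i, typename NET>
    struct toy_layer_helper<i, toy_wrapper<NET>>
    {
        static auto& get(toy_wrapper<NET>& w) { return toy_layer_helper<i, NET>::get(w.net); }
    };

    int main()
    {
        toy_layer<toy_layer<toy_input>> net{};
        toy_wrapper<decltype(net)> wrapped{net};
        auto& a = toy_layer_helper<1, decltype(net)>::get(net);         // direct
        auto& b = toy_layer_helper<1, decltype(wrapped)>::get(wrapped); // through the wrapper
        (void)a; (void)b;
        return 0;
    }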

If you want to give it a whirl though be my guest 😃