Assertion failure in MessagePassing.cpp

Semantic Image Segmentation with Conditional Random Fields
victor-robles
Posts: 2
Joined: Thu Dec 12, 2019, 23:08

Assertion failure in MessagePassing.cpp

Postby victor-robles » Thu Jan 09, 2020, 22:00

I have extended the Demo Train example to support training on multiple images by modifying the node-training section to feed every pixel's feature vector to the trainer explicitly, like so:

Code:

for (int y = 0; y < height; y++)
      {
      for (int x = 0; x < width; x++)
         {
         // Extract the nFeatures-long feature vector at pixel (x, y); Mat::at
         // takes (row, col), i.e. (y, x). Node training covers every pixel,
         // so both loops start at 0 (only edge training needs to start at 1).
         nodeTrainer->addFeatureVec(Mat(fv.at<Matx<uint8_t, nFeatures, 1>>(y, x)), static_cast<byte>(gt.at<uint8_t>(y, x)));
         }
      }

I have also generalized it to accept images other than uint8 if need be, simply by changing the data type in nodeTrainer->addFeatureVec() and, for edge training, in edgeTrainer->addFeatureVecs(). For now I am training on 29 feature images and testing on one; each feature image is 256x256x55 uint8 (55 features from multiple filter banks, quantized to 0-255 to comply with DGM's uint8 feature-vector convention).
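For reference, here is a minimal sketch of how that generalization could look as a helper; the function name addImageToNodeTrainer and the trainImagePairs list are hypothetical, and it assumes nFeatures is a compile-time constant as in Demo Train:

Code:

   // Hypothetical helper: feeds every pixel of one feature/ground-truth image
   // pair to the node trainer; T is the feature image's pixel type.
   template <typename T>
   void addImageToNodeTrainer(CTrainNode &trainer, const Mat &fv, const Mat &gt)
   {
      for (int y = 0; y < fv.rows; y++)
         for (int x = 0; x < fv.cols; x++)
            trainer.addFeatureVec(Mat(fv.at<Matx<T, nFeatures, 1>>(y, x)),
                                  static_cast<byte>(gt.at<uint8_t>(y, x)));
   }

   // Training on all 29 image pairs (a std::vector<std::pair<Mat, Mat>>):
   for (const auto &p : trainImagePairs)
      addImageToNodeTrainer<uint8_t>(*nodeTrainer, p.first, p.second);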

For node training I use either the Gaussian Mixture Model or the OpenCV Gaussian Mixture Model trainer, but I keep getting the following error:
Code:

   Decoding... Assertion failed: !std::isnan(node->Pot.at<float>(s, 0)) in "C:\DGM-1.7.0\modules\DGM\MessagePassing.cpp", line 74
   Assertion failed: !std::isnan(node->Pot.at<float>(s, 0)) in "C:\DGM-1.7.0\modules\DGM\MessagePassing.cpp", line 74
   The lower precision boundary for the potential of the node 49152 is reached.
   SUM_pot = -nan(ind)


Do you know what could be causing this issue?

Creator
Posts: 157
Joined: Tue Dec 16, 2008, 20:52
Location: Hannover, Germany

Re: Assertion failure in MessagePassing.cpp

Postby Creator » Tue Jan 14, 2020, 21:50

Dear Victor-Robles,

Yes, the most likely cause of this error is the following: the Gaussian Mixture Model, being a generative model, reconstructs the distribution of your training samples. Your test data probably contains a sample that lies very far from all training samples (very low potentials), so the Gaussian Mixture Model produces an ambiguous potential for it, which in turn breaks the inference (division by a very small value).
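To see the mechanism: message passing normalizes a node's potential by its sum over the states, and when every state's potential has underflowed to zero, that division yields NaN, which is exactly what the assertion catches. A standalone illustration (plain C++, not DGM code):

Code:

   #include <cmath>
   #include <cstdio>

   int main()
   {
      // Potentials of a 2-state node for a sample far from every Gaussian:
      // exp(-d*d/2) underflows to 0.0f once the distance d is large enough.
      float pot[2] = { std::exp(-500.0f), std::exp(-600.0f) };  // both 0.0f
      float sum = pot[0] + pot[1];                              // 0.0f
      float normalized = pot[0] / sum;                          // 0/0 -> NaN
      std::printf("SUM_pot = %g, normalized = %g, isnan = %d\n",
                  sum, normalized, (int)std::isnan(normalized));
      return 0;
   }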

To fix this problem, please try using fewer features or another node potential trainer. To understand which features may be dropped, try visualizing them with the Bayes node trainer and the feature visualization from Demo Visualization (http://research.project-10.de/dgmdoc/a01852.html): good features build clearly separated Gaussians in the "Histogram" window.
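If your code follows Demo Train, switching the node trainer should be a one-line change; the factory call below follows my reading of the demo, and the numeric model ID for the Bayes trainer is an assumption you should verify against your DGM 1.7.0 sources:

Code:

   // Swap the GMM node trainer for the Bayes trainer to inspect the features.
   // The model ID 0 (Bayes) is assumed from Demo Train's model list -- check
   // the demo's usage message or the DGM headers for the exact value.
   auto nodeTrainer = CTrainNode::create(0 /* Bayes */, nStates, nFeatures);
   // Feed the feature vectors and call nodeTrainer->train() as before, then
   // compare the per-feature histograms as in Demo Visualization.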

