Solve equation systems with Eigen to improve convergence #225
Comments
@whitequark, somehow systems with many redundant constraints fail to converge. I've figured out that 3x PT_ON_CIRCLE can't converge (just try to create it on an empty sketch). After disallowing redundant constraints, sketches with too many redundants are treated as non-converged for no reason. This is probably a bug, but I am not sure whether the rank test can work reliably with that many redundants.
@Evil-Spirit Yes, I did notice that many redundant constraints tend to degrade convergence. Please figure out why; this makes such a mode much less useful.
@whitequark, I have researched a lot and written some debug functions (visualizing the solve path and the equations). Finally I linked Eigen, and this solved the problem. I think the problem is that the Gaussian-elimination linear solver is not as precise as it should be.
Here is a beta:
@whitequark, this is the path of the points during the Newton steps, as shown here: https://youtu.be/72Dnqvn_uTk
OK. Please prepare a commit that just adds Eigen, and separately these debugging capabilities. |
@Evil-Spirit I see no problem with Eigen. Just want to know if there is anything it's bad at.
@whitequark, what about removing the System::MAX_UNKNOWNS limit in Eigen mode?
@Evil-Spirit Yeah, just rip out all pre-Eigen solver code, no #ifdefs needed, as well as the MAX_UNKNOWNS limit. No reason to keep these.
@whitequark ok |
@whitequark
Without substitutions:
The most robust way to rank-test is usually an SVD. Take care in how you set the tolerance on the rank test. That's a count of singular values above some threshold; so perhaps set that threshold empirically by looking at the singular values for a variety of sketches with known number of redundant DOF, and putting it halfway (geometric mean) between the biggest "zero" and the smallest "nonzero". Make sure to consider ill-conditioned sketches, like with long skinny triangles and stuff. If that SVD costs significant time, then we could look at a faster rank test. Then what's the condition number of the resulting matrix, the ratio of the maximum singular value to the minimum non-zero singular value? If it's not too big, then it shouldn't matter much how you solve; so you can probably just do Cholesky or something like before, faster than QR.
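The rank-test recipe described above can be sketched numerically. This is an illustration in Python/NumPy rather than SolveSpace or Eigen code, and the function names are hypothetical:

```python
import numpy as np

def rank_with_threshold(A, tol):
    """Numerical rank: the count of singular values above tol."""
    s = np.linalg.svd(A, compute_uv=False)
    return int(np.sum(s > tol))

def geometric_mean_threshold(smallest_nonzero, largest_zero):
    """Place the cutoff halfway (in log space) between the two clusters."""
    return float(np.sqrt(smallest_nonzero * largest_zero))

# Toy Jacobian with one exactly redundant constraint (row 3 repeats row 1):
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])
tol = geometric_mean_threshold(1.0, 1e-10)   # ~1e-5
print(rank_with_threshold(A, tol))           # 2: one redundant equation
```

In the real solver the two cluster values would be measured empirically over a range of sketches with a known number of redundant DOF, as suggested above.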
@whitequark, @jwesthues
And this is ultrafast. @jwesthues what do you think about https://eigen.tuxfamily.org/dox/classEigen_1_1SimplicialLDLT.html?
@Evil-Spirit Cool. What's the condition number of the matrix? It should intuitively be not too big, in which case Cholesky(-ish) should work robustly, though I don't think I ever actually measured...
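For reference, the kind of step a Cholesky-type factorization such as SimplicialLDLT would accelerate is the normal-equations solve of the (usually under-determined) constraint Jacobian. A dense NumPy stand-in, not the Eigen API:

```python
import numpy as np

# Wide Jacobian A (more unknowns than equations), right-hand side b.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([2.0, 2.0])

AAT = A @ A.T                  # symmetric positive definite when A has full row rank
L = np.linalg.cholesky(AAT)    # AAT = L @ L.T
y = np.linalg.solve(L.T, np.linalg.solve(L, b))
x = A.T @ y                    # minimum-norm step satisfying A x = b
print(np.allclose(A @ x, b))   # True
```

Eigen's SimplicialLDLT does the same kind of symmetric factorization, but sparse and pivot-free (LDL^T), which is why it is fast when the A*A^T matrix is well-conditioned.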
@Evil-Spirit Excellent work! |
@jwesthues, I will try to calculate the condition number of the matrix and output it.
SVD rank testing is better but unlikely to be faster. I'm not sure about Eigen, but sparse SVD solvers are well-studied, http://www.netlib.org/svdpack/ or whatever. Iterative methods for the SVD may have a special benefit here if we cache the result, since while dragging, the A matrix doesn't usually change that much from solve to solve. So the SVD shouldn't change that much either, so we might usually be almost converged to start with. |
@jwesthues, I have calculated it using the approach mentioned here: http://stackoverflow.com/questions/33575478/how-can-you-find-the-condition-number-in-eigen
The result is 1e+7...1e+14 for the A*A^T matrix during the Newton steps (for a long solution, solved in 39 steps).
Its base-10 logarithm is roughly the number of digits of precision that we lose while solving; but we need to define the condition number as the ratio of the maximum singular value to the minimum non-zero singular value. I suspect that your condition number is artificially big because your sketch has unconstrained DOF, and each unconstrained DOF introduces a singular value that's ideally zero. Ignore all the singular values less than a threshold (e.g., 10^-4 to start), and try to put the threshold roughly halfway (geometric mean) between the smallest singular value that should be non-zero, and the largest singular value that should be zero, over a range of difficult sketches like long skinny triangles. That's probably the most robust way to rank test.
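The "effective" condition number described above, which ignores the near-zero singular values contributed by unconstrained DOF, can be sketched as follows (Python/NumPy illustration; the threshold value is the 10^-4 starting point suggested above):

```python
import numpy as np

def effective_condition_number(A, zero_tol=1e-4):
    """Ratio of the largest to the smallest *non-zero* singular value."""
    s = np.linalg.svd(A, compute_uv=False)   # returned in descending order
    nonzero = s[s > zero_tol]
    return float(nonzero[0] / nonzero[-1])

# The zero column stands in for an unconstrained DOF; it contributes a
# singular value of ~0 that would make the naive condition number blow up.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0]])
print(effective_condition_number(A))   # 2.0
```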
@jwesthues, an example list of singular values is:
Okay, got it. So you've got 32*4*2 rectangle vertex coordinates, minus 31*2 unknowns for the point-coincident at the corner, minus 32*2 for the lengths, minus 32*2 for the parallels, minus 32 for the perpendiculars. That's our 34 DOF. You write 32*2 + 32*2 + 32 = 160 nontrivial equations, and the A matrix would ideally have full rank. So this isn't a useful test case for what I was talking about, but it's still interesting. We have 159 singular values, for a reason that I don't understand; was a line lost to cut and paste? The reported condition number is slightly smaller than 20435/3.1564e-005, so I'm thinking you lost the first line. In any case, the singular values fall into three groups, 32 around 10^-4, 64 around 10^0, and 63 (that I think should be 64) around 10^4. I think that corresponds to your three types of constraints. That indicates that we're writing those equations in an unfortunate way--like, if you figure out what kind of equation is giving you the ~10^-4, and multiply all those equations by 10^4, then you'll improve the matrix's condition number. That's easy enough to do experimentally, and then look at the structure of the equations to see if it makes sense. What are the singular values in some sketches that are actually inconsistent/redundant?
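The rescaling experiment proposed above (multiply the weakly-scaled equation group by 10^4 and watch the condition number improve) can be tried on a toy Jacobian; Python/NumPy sketch, not SolveSpace code:

```python
import numpy as np

def cond(M):
    """Naive condition number: largest over smallest singular value."""
    s = np.linalg.svd(M, compute_uv=False)
    return float(s[0] / s[-1])

# Toy Jacobian: two well-scaled equation rows and one row ~1e-4 smaller,
# standing in for the badly-scaled constraint group in the singular values.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1e-4]])
print(cond(A))                  # ~1e4: badly conditioned

D = np.diag([1.0, 1.0, 1e4])    # rescale the weak equation group
print(cond(D @ A))              # ~1.0: the clusters collapse together
```

Row scaling changes which solutions the least-squares step prefers, so in the real solver the scaling should also make physical sense (e.g. putting length and angle equations in comparable units), as the follow-up comments discuss.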
@jwesthues, thank you; you are swimming in deep water where I can only partly follow you. Now I understand a little more about how to answer the question of which equations are good and which are bad. You can read this article I have written.
I should check the PARALLEL equation, because it is not scaled by the vectors' magnitudes, and scaling it could improve the situation. But there is still the question of how to bring length and angular dimensions to the same scale. I have mentioned equations in projective space; that could also improve the situation.
@jwesthues |
@Evil-Spirit Cool. One option is just to multiply or divide by a constant scale factor derived from the bounding box of the overall sketch. That's ugly, but it does the right thing as long as the initial guess is drawn at a halfway reasonable (within a few orders of magnitude) scale. What singular values do you get when constraints really are redundant/inconsistent?
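The bounding-box scale factor idea might look like the following. This is a Python sketch with hypothetical helper names, not SolveSpace's actual equation writer: dividing a length residual by a characteristic sketch size makes it dimensionless and O(1), comparable to an angle equation.

```python
import numpy as np

def sketch_scale(points):
    """Characteristic length from the sketch's bounding box (hypothetical helper)."""
    pts = np.asarray(points)
    return float(np.max(pts.max(axis=0) - pts.min(axis=0)))

def length_residual(p, q, d, scale):
    """Distance-constraint residual, divided by scale so it is O(1)."""
    return float((np.linalg.norm(np.asarray(p) - np.asarray(q)) - d) / scale)

pts = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)]
s = sketch_scale(pts)                            # 100.0
print(length_residual(pts[0], pts[1], 90.0, s))  # 0.1
```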
@Evil-Spirit any chance you could rebase these changes on current master? You and JW put a lot of time into it and I'd like to get it merged. The solver stuff is tricky, and I'm reluctant to just make the changes fit in a way that looks right, compiles and seems to work. If you're busy I understand. |
@phkahler Sorry, I have no time :(
@phkahler made it |
Impossible not to put this here |
Merged in #1159 |
See https://drive.google.com/file/d/0B15RHGNY8nBpYnNwcG82WDdseDQ/view. The newly added arc segment has a pt-coincident constraint that causes the failure.