
Direct Hybrid Factor Specification #1805

Merged: 36 commits, Sep 20, 2024
Changes from 2 commits

Commits (36)
75d4724
remove extra imports in DiscreteBayesNet.cpp
varunagrawal Aug 22, 2024
dce5641
minor edits
varunagrawal Aug 22, 2024
9e77eba
rename X1 to X0 and X2 to X1
varunagrawal Aug 22, 2024
3fc1019
provide logNormalizers directly to the augment method
varunagrawal Aug 22, 2024
cfef6d3
update GaussianMixture::likelihood to compute the logNormalizers
varunagrawal Aug 22, 2024
30bf261
Tests which verify direct factor specification works well
varunagrawal Aug 22, 2024
bfaff50
remove extra prints
varunagrawal Aug 22, 2024
665d755
add docstring and GTSAM_EXPORT for ComputeLogNormalizer
varunagrawal Aug 22, 2024
03e61f4
Merge branch 'working-hybrid' into direct-hybrid-fg
varunagrawal Aug 23, 2024
07a0088
compute logNormalizers and pass to GaussianMixtureFactor
varunagrawal Aug 23, 2024
62b32fa
Merge branch 'working-hybrid' into direct-hybrid-fg
varunagrawal Aug 25, 2024
fbffd79
Merge branch 'develop' into direct-hybrid-fg
varunagrawal Sep 5, 2024
13193a1
better comments
varunagrawal Sep 5, 2024
0bab8ec
Merge branch 'develop' into direct-hybrid-fg
varunagrawal Sep 5, 2024
51a2fd5
improved comments
varunagrawal Sep 6, 2024
615c04a
some more refactor and remove redundant test
varunagrawal Sep 6, 2024
9dc29e0
fix test
varunagrawal Sep 6, 2024
05a4b7a
Merge branch 'develop' into direct-hybrid-fg
varunagrawal Sep 6, 2024
24ec30e
replace emplace_back with emplace_shared
varunagrawal Sep 6, 2024
506cda8
Merge branch 'hybrid-error-scalars' into direct-hybrid-fg
varunagrawal Sep 15, 2024
336b494
fixes
varunagrawal Sep 15, 2024
de68aec
fix tests
varunagrawal Sep 15, 2024
b895e64
Merge branch 'develop' into direct-hybrid-fg
varunagrawal Sep 18, 2024
987ecd4
undo accidental rename
varunagrawal Sep 18, 2024
80d9a5a
remove duplicate test and focus only on direct specification
varunagrawal Sep 19, 2024
717eb7e
relinearization test
varunagrawal Sep 19, 2024
f875b86
print nonlinear part of HybridValues
varunagrawal Sep 19, 2024
2937533
Merge branch 'develop' into direct-hybrid-fg
varunagrawal Sep 19, 2024
9b6facd
add documentation for additive scalar in the error and remove the 0.5…
varunagrawal Sep 19, 2024
244661a
rename ComputeLogNormalizer to ComputeLogNormalizerConstant
varunagrawal Sep 19, 2024
4f88829
fix docstring for HybridGaussianFactor
varunagrawal Sep 19, 2024
d60a253
logNormalizationConstant is now a method for Gaussian noise model
varunagrawal Sep 19, 2024
cea0dd5
update tests
varunagrawal Sep 19, 2024
1ab82f3
hide sqrt(2*value) so the user doesn't have to premultiply by 2
varunagrawal Sep 20, 2024
364b4b4
logDetR method which leverages noise model for efficiency. Build logD…
varunagrawal Sep 20, 2024
67a8b8f
comprehensive unit testing
varunagrawal Sep 20, 2024
9 changes: 9 additions & 0 deletions gtsam/hybrid/HybridGaussianFactor.h
@@ -45,6 +45,15 @@ using GaussianFactorValuePair = std::pair<GaussianFactor::shared_ptr, double>;
* where the set of discrete variables indexes to
* the continuous Gaussian distribution.
*
* In factor graphs the error function typically returns 0.5*|h(x)-z|^2, i.e.,
[Review comment — Member]: Here it would be |A*x-b|

[Reply — Collaborator (Author)]: Done

* the negative log-likelihood for a Gaussian noise model.
* In hybrid factor graphs we allow *adding* an arbitrary scalar dependent on
* the discrete assignment.
* For example, adding a 70/30 mode probability is supported by providing the
* scalars $-log(.7)$ and $-log(.3)$.
* Note that adding a common constant will not make any difference in the
* optimization, so $-log(70)$ and $-log(30)$ work just as well.
*
* @ingroup hybrid
*/
class GTSAM_EXPORT HybridGaussianFactor : public HybridFactor {
15 changes: 13 additions & 2 deletions gtsam/hybrid/HybridNonlinearFactor.h
@@ -45,6 +45,17 @@ using NonlinearFactorValuePair = std::pair<NonlinearFactor::shared_ptr, double>;
* This class stores all factors as HybridFactors, which can then be typecast
* to one of (NonlinearFactor, GaussianFactor) and checked to perform
* the correct operation.
*
* In factor graphs the error function typically returns 0.5*|h(x)-z|^2, i.e.,
* the negative log-likelihood for a Gaussian noise model.
* In hybrid factor graphs we allow *adding* an arbitrary scalar dependent on
* the discrete assignment.
* For example, adding a 70/30 mode probability is supported by providing the
* scalars $-log(.7)$ and $-log(.3)$.
* Note that adding a common constant will not make any difference in the
* optimization, so $-log(70)$ and $-log(30)$ work just as well.
*
* @ingroup hybrid
*/
class HybridNonlinearFactor : public HybridFactor {
public:
@@ -134,7 +145,7 @@ class HybridNonlinearFactor : public HybridFactor {
      auto errorFunc =
          [continuousValues](const std::pair<sharedFactor, double>& f) {
            auto [factor, val] = f;
-           return factor->error(continuousValues) + (0.5 * val);
+           return factor->error(continuousValues) + val;
          };
      DecisionTree<Key, double> result(factors_, errorFunc);
      return result;
@@ -153,7 +164,7 @@
    auto [factor, val] = factors_(discreteValues);
    // Compute the error for the selected factor
    const double factorError = factor->error(continuousValues);
-   return factorError + (0.5 * val);
+   return factorError + val;

/**