Merge pull request #106 from gtbook/frank_feb9
Chapter 5 edits
dellaert authored Feb 10, 2025
2 parents 391c15d + 1ec3550 commit 6dbaa09
Showing 32 changed files with 582 additions and 440 deletions.
2 changes: 1 addition & 1 deletion S10_introduction.ipynb
@@ -80,7 +80,7 @@
"id": "01db0ccf",
"metadata": {},
"source": [
"```{index} differential drive robot, DDR\n",
"```{index} differential-drive robot, DDR\n",
"```\n",
"\n",
"The chapters of the book proceed through a sequence of increasingly complex robotic systems.\n",
2 changes: 1 addition & 1 deletion S11_models.ipynb
@@ -119,7 +119,7 @@
"```{index} configuration, configuration space\n",
"```\n",
"\n",
"The most basic information about a robot’s state is merely a description of the robot’s location (and orientation) in its environment, which we will define as the robot’s *configuration*. The set of all possible configurations will be called the *configuration space*. This information could be a qualitative, high-level description (e.g., the room in which the vacuum cleaning robot of Chapter 3 is located), coordinates for the robot’s position in a grid or continuous position coordinates in the plane (as for the logistics robot of Chapter 4), continuous coordinates for a position and orientation in the plane (as for the differential drive robot, or DDR, of Chapter 5 and the autonomous car of Chapter 6), or continuous coordinates for three-dimensional position and orientation (as for the drone in Chapter 7)."
"The most basic information about a robot’s state is merely a description of the robot’s location (and orientation) in its environment, which we will define as the robot’s *configuration*. The set of all possible configurations will be called the *configuration space*. This information could be a qualitative, high-level description (e.g., the room in which the vacuum cleaning robot of Chapter 3 is located), coordinates for the robot’s position in a grid or continuous position coordinates in the plane (as for the logistics robot of Chapter 4), continuous coordinates for a position and orientation in the plane (as for the differential-drive robot, or DDR, of Chapter 5 and the autonomous car of Chapter 6), or continuous coordinates for three-dimensional position and orientation (as for the drone in Chapter 7)."
]
},
{
2 changes: 1 addition & 1 deletion S13_math.ipynb
@@ -179,7 +179,7 @@
"```{index} Jacobian matrix\n",
"```\n",
"\n",
"Perhaps most surprisingly, the relationship between velocities can always be encoded as a linear mapping from one vector space to another. Consider a differential drive robot with two wheels that rotate independently. As these wheels rotate, the robot will move in the world, changing its position and orientation. The *instantaneous* relationship between the angular velocities of the two wheels and the linear and angular velocities of the robot is linear! The matrix that encodes this relationship is called a *Jacobian* matrix (which may include time-varying entries that are nonlinear functions of configuration variables). You may remember Jacobian matrices from an advanced calculus class. If so, you may recall that the Jacobian of a function relates the derivatives of the function’s input to the derivatives of its output. Even for highly nonlinear functions, the instantaneous relationship between these derivatives is linear, and expressed by the Jacobian matrix. We will see Jacobian matrices for omnidirectional robots in Chapter 4, DDRs in Chapter 5, and for drone dynamics in Chapter 7."
"Perhaps most surprisingly, the relationship between velocities can always be encoded as a linear mapping from one vector space to another. Consider a differential-drive robot with two wheels that rotate independently. As these wheels rotate, the robot will move in the world, changing its position and orientation. The *instantaneous* relationship between the angular velocities of the two wheels and the linear and angular velocities of the robot is linear! The matrix that encodes this relationship is called a *Jacobian* matrix (which may include time-varying entries that are nonlinear functions of configuration variables). You may remember Jacobian matrices from an advanced calculus class. If so, you may recall that the Jacobian of a function relates the derivatives of the function’s input to the derivatives of its output. Even for highly nonlinear functions, the instantaneous relationship between these derivatives is linear, and expressed by the Jacobian matrix. We will see Jacobian matrices for omnidirectional robots in Chapter 4, DDRs in Chapter 5, and for drone dynamics in Chapter 7."
]
},
{
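For concreteness, here is a minimal numpy sketch of the differential-drive Jacobian described above; the wheel radius `r` and baseline `L` are assumed illustrative values, not taken from the book.

```python
import numpy as np

r, L = 0.05, 0.30  # assumed wheel radius and wheel separation, in meters

# Jacobian mapping wheel rates [phidot_left, phidot_right] (rad/s)
# to body velocities [v, omega] (m/s, rad/s); the entries are constant here,
# but in general they may be nonlinear functions of the configuration.
J = np.array([[r / 2.0, r / 2.0],
              [-r / L,   r / L]])

wheel_rates = np.array([2.0, 3.0])
v, omega = J @ wheel_rates
print(v, omega)  # 0.125 m/s forward, ~0.167 rad/s turning
```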
2 changes: 1 addition & 1 deletion S21_sorter_state.ipynb
@@ -914,7 +914,7 @@
"id": "IKt2DaIm1Brr",
"metadata": {},
"source": [
"Above we created an instance of the `gtsam.DiscreteDistribution` class. As with any GTSAM class, you can type\n",
"Above we created an instance of the `DiscreteDistribution` class. As with any GTSAM class, you can type\n",
"```python\n",
"help(gtsam.DiscreteDistribution)\n",
"```\n",
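As a quick illustration of the class mentioned in this hunk, the sketch below constructs a `DiscreteDistribution` over a 5-valued variable; the key, cardinality, and spec string are assumptions for illustration, not values from the notebook.

```python
import gtsam

Category = (0, 5)  # a DiscreteKey: (key, cardinality)
prior = gtsam.DiscreteDistribution(Category, "1/1/1/1/1")  # uniform, unnormalized spec
print(prior)
help(gtsam.DiscreteDistribution)  # as suggested above, prints the full API
```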
2 changes: 1 addition & 1 deletion S23_sorter_sensing.ipynb
@@ -712,7 +712,7 @@
"id": "tHvu2cyxbopL",
"metadata": {},
"source": [
"Above we created an instance of the `gtsam.DiscreteConditional` class. As with any GTSAM class, you can type\n",
"Above we created an instance of the `DiscreteConditional` class. As with any GTSAM class, you can type\n",
"\n",
"```python\n",
"help(gtsam.DiscreteConditional)\n",
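For context, a hedged sketch of constructing a `DiscreteConditional`, with one spec group per value of the conditioning variable; the keys and probabilities below are illustrative assumptions.

```python
import gtsam

Category = (0, 5)      # DiscreteKey: (key, cardinality)
Conductivity = (1, 2)  # a binary sensor variable

# P(Conductivity | Category): one "false/true" group per category value.
sensor_model = gtsam.DiscreteConditional(Conductivity, [Category],
                                         "99/1 99/1 90/10 15/85 5/95")
print(sensor_model)
```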
6 changes: 3 additions & 3 deletions S24_sorter_perception.ipynb
@@ -1536,7 +1536,7 @@
"source": [
"### Factors\n",
"\n",
"Above we created an instance of the `gtsam.DecisionTreeFactor` class. As with any GTSAM class, you can type\n",
"Above we created an instance of the `DecisionTreeFactor` class. As with any GTSAM class, you can type\n",
"\n",
"```python\n",
"help(gtsam.DecisionTreeFactor)\n",
@@ -1575,7 +1575,7 @@
"id": "3mAh1FdQ0xrA",
"metadata": {},
"source": [
"The factors we created above are of type `gtsam.DecisionTreeFactor`, which are stored as decision trees:"
"The factors we created above are of type `DecisionTreeFactor`, which are stored as decision trees:"
]
},
{
@@ -1677,7 +1677,7 @@
}
],
"source": [
"#| caption: Decision tree in a `gtsam.DecisionTreeFactor`.\n",
"#| caption: Decision tree in a `DecisionTreeFactor`.\n",
"#| label: fig:decision_tree_factor\n",
"show(conductivity_false_factor)"
]
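A small sketch of creating a `DecisionTreeFactor` directly, assuming the single-key constructor that takes a table of values as a string; the numbers are illustrative, not the notebook's.

```python
import gtsam

Category = (0, 5)  # DiscreteKey: (key, cardinality)

# An unnormalized factor on Category, e.g. the likelihood of observing
# "conductivity = false" under each of the five categories.
conductivity_false_factor = gtsam.DecisionTreeFactor(Category, "0.99 0.99 0.9 0.15 0.05")
print(conductivity_false_factor)
```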
2 changes: 1 addition & 1 deletion S26_sorter_learning.ipynb
@@ -573,7 +573,7 @@
}
},
"source": [
"A `gtsam.DiscreteConditional` determines the counts, grouped by the conditioning variable. In our case, `Category` can take on 5 separate values, and hence we have five groups. For example, for a binary sensor:\n"
"A `DiscreteConditional` determines the counts, grouped by the conditioning variable. In our case, `Category` can take on 5 separate values, and hence we have five groups. For example, for a binary sensor:\n"
]
},
{
2 changes: 1 addition & 1 deletion S31_vacuum_state.ipynb
@@ -369,7 +369,7 @@
"id": "2OOSTBL1a4sV",
"metadata": {},
"source": [
"When we print the results, we see that we now get a dictionary of `DiscreteKeys`, i.e., integer tuples of the form *(`Key`, cardinality)*. However, the \"keys\" now seem to be very large integers. This is because for series of variables we use the `gtsam.Symbol` type, composed of a single character and an integer index:"
"When we print the results, we see that we now get a dictionary of `DiscreteKeys`, i.e., integer tuples of the form *(`Key`, cardinality)*. However, the \"keys\" now seem to be very large integers. This is because for series of variables we use the `Symbol` type, composed of a single character and an integer index:"
]
},
{
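To see why those dictionary keys print as very large integers, here is a minimal sketch using an assumed character/index pair:

```python
import gtsam

x5 = gtsam.Symbol('x', 5)  # a single character plus an integer index
print(x5.key())            # the large integer that actually serves as the Key
print(x5.index())          # 5, recovered from that integer
```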
8 changes: 4 additions & 4 deletions S32_vacuum_actions.ipynb
@@ -1419,17 +1419,17 @@
"\n",
"> The GTSAM concepts used in this section, explained.\n",
"\n",
"As in Chapter 2, we once again used a `gtsam.DiscreteConditional`, this time to specify a motion model for the controlled Markov chain above, as shown in Figure [4](#vacuum-motion-model)."
"As in Chapter 2, we once again used a `DiscreteConditional`, this time to specify a motion model for the controlled Markov chain above, as shown in Figure [4](#vacuum-motion-model)."
]
},
{
"cell_type": "markdown",
"id": "1GGXFMgV1VUb",
"metadata": {},
"source": [
"To specify the motion model, we used the `gtsam.DiscreteBayesNet` class, and in particular these methods:\n",
"To specify the motion model, we used the `DiscreteBayesNet` class, and in particular these methods:\n",
"\n",
"- `add(self:, key: Tuple[int, int], parents: List[Tuple[int, int]], spec: str) -> None`: adds a conditional with the same arguments as the `gtsam.DiscreteConditional` constructor.\n",
"- `add(self:, key: Tuple[int, int], parents: List[Tuple[int, int]], spec: str) -> None`: adds a conditional with the same arguments as the `DiscreteConditional` constructor.\n",
"- `at(self, i: int) -> gtsam.DiscreteConditional`: retrieves the $i^{th}$ conditional added."
]
},
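A minimal sketch of the two methods listed above; the keys and probabilities are assumptions, not the chapter's motion model.

```python
import gtsam

X0, X1 = (0, 2), (1, 2)  # two binary state variables as DiscreteKeys

bayes_net = gtsam.DiscreteBayesNet()
bayes_net.add(X0, [], "1/1")        # prior P(X0)
bayes_net.add(X1, [X0], "9/1 2/8")  # transition model P(X1 | X0)

print(bayes_net.at(1))              # retrieves the second conditional added, P(X1 | X0)
```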
@@ -1522,7 +1522,7 @@
"id": "22gJu5XkegOn",
"metadata": {},
"source": [
"Finally, a word about the graphs above. You might wonder, why these graphs come out so beautifully positioned, e.g., to indicate time from left to right. This was accomplished with the `hints` argument, which positions variables series at an appropriate height. Similarly, the `boxes` argument (which takes `gtsam.Keys`, not tuples) indicates which variables should considered as given.\n",
"Finally, a word about the graphs above. You might wonder, why these graphs come out so beautifully positioned, e.g., to indicate time from left to right. This was accomplished with the `hints` argument, which positions variables series at an appropriate height. Similarly, the `boxes` argument (which takes `Keys`, not tuples) indicates which variables should considered as given.\n",
"\n",
"These arguments are handled in the `gtbook` library {cite:p}`gtbook`, and are passed on in the appropriate format to the underlying GTSAM `dot` methods, which generate graphviz-style graphs{cite:p}`graphviz`."
]
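A hedged usage sketch of those two arguments; the import path, keys, and numbers are assumptions for illustration.

```python
import gtsam
from gtsam.symbol_shorthand import A, X
from gtbook.display import show  # assumed location of the show() helper

bayes_net = gtsam.DiscreteBayesNet()
bayes_net.add((X(1), 2), [(X(0), 2), (A(0), 2)], "9/1 1/9 8/2 2/8")  # P(X1 | X0, A0)

show(bayes_net,
     hints={"A": 2, "X": 1},  # draw the action series above the state series
     boxes={A(0)})            # A0 is given, so it is boxed (passed as a Key, not a tuple)
```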
4 changes: 2 additions & 2 deletions S33_vacuum_sensing.ipynb
@@ -647,7 +647,7 @@
"id": "HlzAWlJNSilC",
"metadata": {},
"source": [
"We use the `gtsam.DiscreteBayesNet` method `sample`, with signature\n",
"We use the `DiscreteBayesNet` method `sample`, with signature\n",
"\n",
"```python\n",
" sample(self, given: gtsam::DiscreteValues) -> gtsam::DiscreteValues\n",
@@ -660,7 +660,7 @@
"metadata": {},
"source": [
"It implements ancestral sampling, but does assume that the Bayes net is reverse topologically sorted, i.e. last\n",
"conditional will be sampled first. In addition, it can optionally take an assignment for certain *given* variables, as a `gtsam.DiscreteValues` instance.\n",
"conditional will be sampled first. In addition, it can optionally take an assignment for certain *given* variables, as a `DiscreteValues` instance.\n",
"In that case, it is also assumed that the Bayes net does not contain any conditionals for the given values.\n",
"We used this functionality to pass the given action sequence above."
]
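A minimal sketch of both ways of calling `sample`, with assumed keys and probabilities; note the ordering and "no conditional for given variables" assumptions mentioned above.

```python
import gtsam

X0, X1 = (0, 2), (1, 2)

# Reverse topological order: the conditional added last is sampled first.
bayes_net = gtsam.DiscreteBayesNet()
bayes_net.add(X1, [X0], "9/1 2/8")  # P(X1 | X0), sampled second
bayes_net.add(X0, [], "1/1")        # P(X0), sampled first
print(bayes_net.sample())           # ancestral sampling of both variables

# When X0 is given, the net should not contain a conditional on X0:
motion_only = gtsam.DiscreteBayesNet()
motion_only.add(X1, [X0], "9/1 2/8")
given = gtsam.DiscreteValues()
given[0] = 1                        # fix X0 = 1
print(motion_only.sample(given))    # samples X1 from P(X1 | X0 = 1)
```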
2 changes: 1 addition & 1 deletion S41_logistics_state.ipynb
@@ -472,7 +472,7 @@
"\n",
"> The GTSAM concepts used in this section, explained.\n",
"\n",
"We really used only one concept from GTSAM above, which is `gtsam.Point2`. For maximal compatibility with numpy, in python this is just a function that creates a 2D, float numpy array. Inside GTSAM, it is represented as an Eigen vector, where Eigen is the C++ equivalent of numpy."
"We really used only one concept from GTSAM above, which is `Point2`. For maximal compatibility with numpy, in python this is just a function that creates a 2D, float numpy array. Inside GTSAM, it is represented as an Eigen vector, where Eigen is the C++ equivalent of numpy."
]
},
{
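A quick check of that statement (the coordinates are arbitrary):

```python
import numpy as np
import gtsam

p = gtsam.Point2(20, 30)
print(type(p), p.dtype, p.shape)  # numpy.ndarray, float64, (2,)
print(p + np.array([1.0, 2.0]))   # ordinary numpy arithmetic applies
```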
6 changes: 3 additions & 3 deletions S42_logistics_actions.ipynb
@@ -912,14 +912,14 @@
"id": "EzTs7UlNSH3r",
"metadata": {},
"source": [
"A `gtsam.GaussianDensity` class can be constructed via the following named constructor:\n",
"A `GaussianDensity` class can be constructed via the following named constructor:\n",
"\n",
"```python\n",
"FromMeanAndStddev(key: gtsam.Key, mean: np.array, sigma: float) -> gtsam.GaussianDensity\n",
"```\n",
"\n",
"{raw:tex}`\\noindent`\n",
"and two similar named constructors exists for `gtsam.GaussianConditional`:\n",
"and two similar named constructors exists for `GaussianConditional`:\n",
"\n",
"```python\n",
"- FromMeanAndStddev(key: gtsam.Key, A: np.array, parent: gtsam.Key, b: numpy.ndarray[numpy.float64[m, 1]], sigma: float) -> gtsam.GaussianConditional\n",
@@ -976,7 +976,7 @@
"id": "Xlcqd_5WJoBS",
"metadata": {},
"source": [
"In the above, all error functions take an instance of `gtsam.VectorValues`, which is simply a map from GTSAM keys to values as vectors. This is the equivalent of `gtsam.DiscreteValues` from the previous sections."
"In the above, all error functions take an instance of `VectorValues`, which is simply a map from GTSAM keys to values as vectors. This is the equivalent of `DiscreteValues` from the previous sections."
]
}
],
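Putting the pieces of this section together, a hedged sketch using the named constructors and the `VectorValues` container described above; the keys, means, and sigmas are illustrative assumptions.

```python
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X

# p(x0): a Gaussian density on X(0) with the given mean and standard deviation.
prior = gtsam.GaussianDensity.FromMeanAndStddev(X(0), np.array([20.0, 10.0]), 0.5)

# p(x1 | x0) = N(A x0 + b, sigma^2 I): a conditional with parent X(0).
A, b = np.eye(2), np.array([2.0, 0.0])
motion = gtsam.GaussianConditional.FromMeanAndStddev(X(1), A, X(0), b, 0.5)

# Error functions take a VectorValues, a map from Keys to vectors.
values = gtsam.VectorValues()
values.insert(X(0), np.array([20.0, 10.0]))
values.insert(X(1), np.array([22.0, 10.0]))
print(prior.error(values), motion.error(values))  # both zero at the means
```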
4 changes: 2 additions & 2 deletions S44_logistics_perception.ipynb
@@ -225,7 +225,7 @@
"id": "Du5R3YoGb21M",
"metadata": {},
"source": [
"Note above we used a `gtsam.VectorValues` to store the ground truth trajectory, which will come in handy again when we simulate the measurements. In Figure [2](#fig:logistics-ground-truth) we show this ground truth trajectory overlaid on the warehouse map we introduced before."
"Note above we used a `VectorValues` to store the ground truth trajectory, which will come in handy again when we simulate the measurements. In Figure [2](#fig:logistics-ground-truth) we show this ground truth trajectory overlaid on the warehouse map we introduced before."
]
},
{
@@ -585,7 +585,7 @@
"\n",
"```{index} particle filter\n",
"```\n",
"```{index} pair: MCL; Monte Carlo Localization\n",
"```{index} pair: MCL; Monte Carlo localization\n",
"```\n",
"The above finite element discretization of space is very costly, and most of the memory and computation is used to compute near-zero probabilities. \n",
"While there *are* ways to deal with this, switching to a sampling-based representation gets us more bang for the buck computation-wise. And, as we will see, it also leads to a very simple algorithm.\n",
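To make the sampling-based idea concrete, here is a hedged, minimal numpy sketch of one Monte Carlo localization step (predict, weight, resample); the map size, beacon, and sensor model are assumptions, not the book's.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 1000
particles = rng.uniform(low=[0.0, 0.0], high=[100.0, 50.0], size=(N, 2))  # 2D positions

# Predict: move every particle by the commanded displacement plus Gaussian noise.
displacement = np.array([1.0, 0.0])
particles += displacement + rng.normal(scale=0.1, size=particles.shape)

# Weight: assumed range sensor to a known beacon.
beacon, measured_range, sigma = np.array([50.0, 25.0]), 20.0, 1.0
predicted_range = np.linalg.norm(particles - beacon, axis=1)
weights = np.exp(-0.5 * ((predicted_range - measured_range) / sigma) ** 2)
weights /= weights.sum()

# Resample with replacement, in proportion to the weights.
particles = particles[rng.choice(N, size=N, p=weights)]
```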
20 changes: 9 additions & 11 deletions S50_diffdrive_intro.ipynb
@@ -8,9 +8,11 @@
"source": [
"# A Mobile Robot With Simple Kinematics\n",
"\n",
"```{index} differential drive\n",
"```\n",
"> A simple, differential-drive robot that is similar to a car, but can rotate in place.\n",
"\n",
"<img src=\"Figures5/S50-Two-wheeled_Toy_Robot-02.jpg\" alt=\"Splash image with steampunk differential drive robot\" width=\"60%\" align=center style=\"vertical-align:middle;margin:10px 0px\">"
"<img src=\"Figures5/S50-Two-wheeled_Toy_Robot-02.jpg\" alt=\"Splash image with steampunk differential-drive robot\" width=\"60%\" align=center style=\"vertical-align:middle;margin:10px 0px\">"
]
},
{
@@ -20,25 +22,21 @@
"source": [
"In this chapter we introduce a robot with differential drive, i.e, two wheels that can spin in either direction to move the robot forwards or backwards, or change its orientation by turning. In contrast to the previous chapter, we will now treat *orientation* as a first class citizen.\n",
"\n",
"```{index} camera\n",
"```\n",
"We will also introduce a new and powerful sensor, the *camera*,\n",
"which is used by computer vision systems.\n",
"The last decade has seen spectacular advances in computer vision powered by new breakthroughs in machine learning, specifically deep neural networks in computer vision. We first lay the groundwork by describing cameras and mathematical models for image formation, and then dive in with some elementary image processing, In the perception section we discuss multilayer perceptrons and convolutional neural networks to give a taste for the machine learning methods currently in vogue. We also discuss *learning* the parameters of these networks in the last section.\n",
"Computer vision has seen spectacular advances, powered by new breakthroughs in machine learning, specifically deep neural networks and transformer-based models. We first lay the groundwork by describing cameras and mathematical models for image formation, and then dive in with some elementary image processing. In the perception section we discuss multilayer perceptrons and convolutional neural networks to give a taste for the machine learning methods currently in vogue. We also discuss *learning* the parameters of these networks in the last section.\n",
"\n",
"In this chapter, we move beyond the grid-based representation of the robot state, introduce the notion\n",
"of the robot's configuration, a continuous space that captures all of the robot's degrees\n",
"In this chapter, we move beyond the grid-based representation of the robot state, introducing the notion\n",
"of the robot's configuration space, a continuous space that captures all of the robot's degrees\n",
"of freedom.\n",
"We introduce several planning methods that work in continuous configuration space, including\n",
"state-of-the-art sampling based methods. \n",
"state-of-the-art sampling-based methods. \n",
"\n",
"We conclude the chapter with an introduction to modern Deep Learning (DL) methods.\n",
"These methods can be applied to train the neural networks that we introduce in this chapter."
]
},
{
"cell_type": "markdown",
"id": "etZa-SPouupD",
"metadata": {},
"source": []
}
],
"metadata": {