From 3e06ac5cbe4ae8520d91acb877710ab7de945c10 Mon Sep 17 00:00:00 2001 From: Frank Dellaert Date: Sun, 9 Feb 2025 13:45:04 -0500 Subject: [PATCH 1/9] Section 5.0 --- S50_diffdrive_intro.ipynb | 20 +++++++++----------- 1 file changed, 9 insertions(+), 11 deletions(-) diff --git a/S50_diffdrive_intro.ipynb b/S50_diffdrive_intro.ipynb index 94506147..12ed045a 100644 --- a/S50_diffdrive_intro.ipynb +++ b/S50_diffdrive_intro.ipynb @@ -8,9 +8,11 @@ "source": [ "# A Mobile Robot With Simple Kinematics\n", "\n", + "```{index} differential drive\n", + "```\n", "> A simple, differential-drive robot that is similar to a car, but can rotate in place.\n", "\n", - "\"Splash" + "\"Splash" ] }, { @@ -20,25 +22,21 @@ "source": [ "In this chapter we introduce a robot with differential drive, i.e, two wheels that can spin in either direction to move the robot forwards or backwards, or change its orientation by turning. In contrast to the previous chapter, we will now treat *orientation* as a first class citizen.\n", "\n", + "```{index} camera\n", + "```\n", "We will also introduce a new and powerful sensor, the *camera*,\n", "which is used by computer vision systems.\n", - "The last decade has seen spectacular advances in computer vision powered by new breakthroughs in machine learning, specifically deep neural networks in computer vision. We first lay the groundwork by describing cameras and mathematical models for image formation, and then dive in with some elementary image processing, In the perception section we discuss multilayer perceptrons and convolutional neural networks to give a taste for the machine learning methods currently in vogue. We also discuss *learning* the parameters of these networks in the last section.\n", + "Computer vision has seen spectacular advances, powered by new breakthroughs in machine learning, specifically deep neural networks and transformer-based models. We first lay the groundwork by describing cameras and mathematical models for image formation, and then dive in with some elementary image processing. In the perception section we discuss multilayer perceptrons and convolutional neural networks to give a taste for the machine learning methods currently in vogue. We also discuss *learning* the parameters of these networks in the last section.\n", "\n", - "In this chapter, we move beyond the grid-based representation of the robot state, introduce the notion\n", - "of the robot's configuration, a continuous space that captures all of the robot's degrees\n", + "In this chapter, we move beyond the grid-based representation of the robot state, introducing the notion\n", + "of the robot's configuration space, a continuous space that captures all of the robot's degrees\n", "of freedom.\n", "We introduce several planning methods that work in continuous configuration space, including\n", - "state-of-the-art sampling based methods. \n", + "state-of-the-art sampling-based methods. \n", "\n", "We conclude the chapter with an introduction to modern Deep Learning (DL) methods.\n", "These methods can be applied to train the neural networks that we introduce in this chapter." 
] - }, - { - "cell_type": "markdown", - "id": "etZa-SPouupD", - "metadata": {}, - "source": [] } ], "metadata": { From 1589f7cbbec2a0d8a29503db13ee5f6401c4a0a0 Mon Sep 17 00:00:00 2001 From: Frank Dellaert Date: Sun, 9 Feb 2025 13:45:17 -0500 Subject: [PATCH 2/9] differential-drive robot --- S10_introduction.ipynb | 2 +- S11_models.ipynb | 2 +- S13_math.ipynb | 2 +- S51_diffdrive_state.ipynb | 8 ++++---- S52_diffdrive_actions.ipynb | 2 +- S53_diffdrive_sensing.ipynb | 2 +- S54_diffdrive_perception.ipynb | 2 +- S55_diffdrive_planning.ipynb | 2 +- S56_diffdrive_learning.ipynb | 2 +- S57_diffdrive_summary.ipynb | 8 ++++---- S61_driving_state.ipynb | 2 +- S62_driving_actions.ipynb | 4 ++-- S65_driving_planning.ipynb | 2 +- S67_driving_summary.ipynb | 2 +- S72_drone_actions.ipynb | 2 +- 15 files changed, 22 insertions(+), 22 deletions(-) diff --git a/S10_introduction.ipynb b/S10_introduction.ipynb index 3c8a5d74..164c1b4a 100644 --- a/S10_introduction.ipynb +++ b/S10_introduction.ipynb @@ -80,7 +80,7 @@ "id": "01db0ccf", "metadata": {}, "source": [ - "```{index} differential drive robot, DDR\n", + "```{index} differential-drive robot, DDR\n", "```\n", "\n", "The chapters of the book proceed through a sequence of increasingly complex robotic systems.\n", diff --git a/S11_models.ipynb b/S11_models.ipynb index d8a77c2c..e98ddba5 100644 --- a/S11_models.ipynb +++ b/S11_models.ipynb @@ -119,7 +119,7 @@ "```{index} configuration, configuration space\n", "```\n", "\n", - "The most basic information about a robot’s state is merely a description of the robot’s location (and orientation) in its environment, which we will define as the robot’s *configuration*. The set of all possible configurations will be called the *configuration space*. This information could be a qualitative, high-level description (e.g., the room in which the vacuum cleaning robot of Chapter 3 is located), coordinates for the robot’s position in a grid or continuous position coordinates in the plane (as for the logistics robot of Chapter 4), continuous coordinates for a position and orientation in the plane (as for the differential drive robot, or DDR, of Chapter 5 and the autonomous car of Chapter 6), or continuous coordinates for three-dimensional position and orientation (as for the drone in Chapter 7)." + "The most basic information about a robot’s state is merely a description of the robot’s location (and orientation) in its environment, which we will define as the robot’s *configuration*. The set of all possible configurations will be called the *configuration space*. This information could be a qualitative, high-level description (e.g., the room in which the vacuum cleaning robot of Chapter 3 is located), coordinates for the robot’s position in a grid or continuous position coordinates in the plane (as for the logistics robot of Chapter 4), continuous coordinates for a position and orientation in the plane (as for the differential-drive robot, or DDR, of Chapter 5 and the autonomous car of Chapter 6), or continuous coordinates for three-dimensional position and orientation (as for the drone in Chapter 7)." ] }, { diff --git a/S13_math.ipynb b/S13_math.ipynb index f1899a66..b317bf54 100644 --- a/S13_math.ipynb +++ b/S13_math.ipynb @@ -179,7 +179,7 @@ "```{index} Jacobian matrix\n", "```\n", "\n", - "Perhaps most surprisingly, the relationship between velocities can always be encoded as a linear mapping from one vector space to another. Consider a differential drive robot with two wheels that rotate independently. 
As these wheels rotate, the robot will move in the world, changing its position and orientation. The *instantaneous* relationship between the angular velocities of the two wheels and the linear and angular velocities of the robot is linear! The matrix that encodes this relationship is called a *Jacobian* matrix (which may include time-varying entries that are nonlinear functions of configuration variables). You may remember Jacobian matrices from an advanced calculus class. If so, you may recall that the Jacobian of a function relates the derivatives of the function’s input to the derivatives of its output. Even for highly nonlinear functions, the instantaneous relationship between these derivatives is linear, and expressed by the Jacobian matrix. We will see Jacobian matrices for omnidirectional robots in Chapter 4, DDRs in Chapter 5, and for drone dynamics in Chapter 7." + "Perhaps most surprisingly, the relationship between velocities can always be encoded as a linear mapping from one vector space to another. Consider a differential-drive robot with two wheels that rotate independently. As these wheels rotate, the robot will move in the world, changing its position and orientation. The *instantaneous* relationship between the angular velocities of the two wheels and the linear and angular velocities of the robot is linear! The matrix that encodes this relationship is called a *Jacobian* matrix (which may include time-varying entries that are nonlinear functions of configuration variables). You may remember Jacobian matrices from an advanced calculus class. If so, you may recall that the Jacobian of a function relates the derivatives of the function’s input to the derivatives of its output. Even for highly nonlinear functions, the instantaneous relationship between these derivatives is linear, and expressed by the Jacobian matrix. We will see Jacobian matrices for omnidirectional robots in Chapter 4, DDRs in Chapter 5, and for drone dynamics in Chapter 7." ] }, { diff --git a/S51_diffdrive_state.ipynb b/S51_diffdrive_state.ipynb index fec9b7f6..9d46a9df 100644 --- a/S51_diffdrive_state.ipynb +++ b/S51_diffdrive_state.ipynb @@ -23,7 +23,7 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": null, "id": "2d8QVmE4S-v1", "metadata": { "tags": [ @@ -40,12 +40,12 @@ } ], "source": [ - "%pip install -q -U gtbook\n" + "%pip install -q -U gtbook" ] }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "id": "azWgi15MntHa", "metadata": { "tags": [ @@ -57,7 +57,7 @@ "import math\n", "\n", "import gtsam\n", - "from gtbook.display import pretty\n" + "from gtbook.display import pretty" ] }, { diff --git a/S52_diffdrive_actions.ipynb b/S52_diffdrive_actions.ipynb index 8aca2b8e..1bcf1913 100644 --- a/S52_diffdrive_actions.ipynb +++ b/S52_diffdrive_actions.ipynb @@ -78,7 +78,7 @@ ] }, "source": [ - "\"Splash" + "\"Splash" ] }, { diff --git a/S53_diffdrive_sensing.ipynb b/S53_diffdrive_sensing.ipynb index 2e0158e2..3e8d006d 100644 --- a/S53_diffdrive_sensing.ipynb +++ b/S53_diffdrive_sensing.ipynb @@ -158,7 +158,7 @@ "```\n", "In essence, we get access to images as multi-dimensional arrays. Expensive CCD cameras have three sensors, one per color channel (red, green, and blue or **RGB**), and hence their raw output can be represented as three arrays of numbers that represent light levels in a specific frequency band, roughly corresponding to the same frequency bands that receptors in our eye are sensitive to. 
However, most cameras now have a *single* CMOS sensor with a color filter on top (called a Bayer pattern), and specialized algorithms that hallucinate three color channels. Actually, most cameras do a great deal more processing to improve the color and lighting; this sometimes gets in the way of algorithms that rely on measuring light exactly, but those are rather rare. In most cases, we are content to simply think of a (color) image as a $H \\times W \\times 3$ array of numbers, where $H$ is the height of the image, and $W$ the width.\n", "\n", - "As an example, below we show an image on the left, taken by the differential drive robot on the right:" + "As an example, below we show an image on the left, taken by the differential-drive robot on the right:" ] }, { diff --git a/S54_diffdrive_perception.ipynb b/S54_diffdrive_perception.ipynb index 02695d4d..2e2edb74 100644 --- a/S54_diffdrive_perception.ipynb +++ b/S54_diffdrive_perception.ipynb @@ -87,7 +87,7 @@ ] }, "source": [ - "\"Splash" + "\"Splash" ] }, { diff --git a/S55_diffdrive_planning.ipynb b/S55_diffdrive_planning.ipynb index e7d5d20e..518ef22b 100644 --- a/S55_diffdrive_planning.ipynb +++ b/S55_diffdrive_planning.ipynb @@ -90,7 +90,7 @@ ] }, "source": [ - "\"Splash" + "\"Splash" ] }, { diff --git a/S56_diffdrive_learning.ipynb b/S56_diffdrive_learning.ipynb index 922eebc2..bfb4bf2c 100644 --- a/S56_diffdrive_learning.ipynb +++ b/S56_diffdrive_learning.ipynb @@ -106,7 +106,7 @@ ] }, "source": [ - "\"Splash" + "\"Splash" ] }, { diff --git a/S57_diffdrive_summary.ipynb b/S57_diffdrive_summary.ipynb index 7e1a38de..0d4fead6 100644 --- a/S57_diffdrive_summary.ipynb +++ b/S57_diffdrive_summary.ipynb @@ -18,7 +18,7 @@ ] }, "source": [ - "\"Splash" + "\"Splash" ] }, { @@ -53,7 +53,7 @@ "probability models we have seen in previous chapters. 
\n", "\n", "We began the chapter with a formal definition of a *configuration* and of the *configuration space* for robotic systems.\n", - "For the simple differential drive robot of this chapter, we rigidly attach a coordinate frame to the robot,\n", + "For the simple differential-drive robot of this chapter, we rigidly attach a coordinate frame to the robot,\n", "with origin at the midpoint of the axle and $x$-axis parallel to the direction of motion.\n", "The pose of this frame (position of its origin, and orientation of its axes) defines a configuration\n", "of the robot, and the set of all configurations, ${\\cal Q} = \\mathbb{R}^2 \\times [0, 2\\pi),$ defines\n", @@ -61,7 +61,7 @@ "We then showed how it is possible to determine the location of any point on the robot using the\n", "robot's configuration.\n", "\n", - "The motion model for the differential drive robot relates the angular velocities of the two\n", + "The motion model for the differential-drive robot relates the angular velocities of the two\n", "actuated wheels to the linear and angular velocity of the body-attached coordinate frame.\n", "The *forward velocity kinematics* in the body-attached frame are given by\n", "\\begin{equation}\n", @@ -276,7 +276,7 @@ "id": "6pwDl7q8lvp_", "metadata": {}, "source": [ - "The kinematics of differential drive robots are described in detail in [Introduction to Autonomous Mobile Robots](https://mitpress.mit.edu/9780262015356/introduction-to-autonomous-mobile-robots/) by Siegwart, Nourbakhsh, Scaramuzza {cite:p}`Siegwart11book_robots`\n", + "The kinematics of differential-drive robots are described in detail in [Introduction to Autonomous Mobile Robots](https://mitpress.mit.edu/9780262015356/introduction-to-autonomous-mobile-robots/) by Siegwart, Nourbakhsh, Scaramuzza {cite:p}`Siegwart11book_robots`\n", "\n", "The first mathematically rigorous book on robot motion planning was written by Latombe\n", "in the early nineties {cite:p}`Latombe91book`.\n", diff --git a/S61_driving_state.ipynb b/S61_driving_state.ipynb index 375c9a07..ae6ac70b 100644 --- a/S61_driving_state.ipynb +++ b/S61_driving_state.ipynb @@ -87,7 +87,7 @@ "id": "OivsqQTs9jsg", "metadata": {}, "source": [ - "In the previous chapter, we saw that the configuration space for a differential drive robot\n", + "In the previous chapter, we saw that the configuration space for a differential-drive robot\n", "can be represented as ${\\cal Q} = \\mathbb{R}^2 \\times [0, 2\\pi),$\n", "and we used $q = (x,y,\\theta)$ to parameterize this configuration space.\n", "This choice of parameterization has the nice property of being minimal: \n", diff --git a/S62_driving_actions.ipynb b/S62_driving_actions.ipynb index 3ff9c6f3..8c741aa9 100644 --- a/S62_driving_actions.ipynb +++ b/S62_driving_actions.ipynb @@ -30,7 +30,7 @@ "```{index} action; Ackermann steering\n", "```\n", "\n", - "> Cars are more complex than differential drive robots: they cannot turn in place, and they typically have front-wheel steering." + "> Cars are more complex than differential-drive robots: they cannot turn in place, and they typically have front-wheel steering." 
] }, { @@ -51,7 +51,7 @@ "metadata": {}, "source": [ "In this section we introduce a kinematic model for cars.\n", - "Unlike the differential drive robots of the previous chapter, cars have four wheels, two of which are used for steering,\n", + "Unlike the differential-drive robots of the previous chapter, cars have four wheels, two of which are used for steering,\n", "and two of which are used to induce linear motion (either the two rear or two front wheels, except in the case of four-wheel drive vehicles).\n", "Therefore, we might choose to model the car as having two inputs: a steering angle and an acceleration.\n", "While this might be a more accurate physical model, we will use a slightly simplified model, in which the\n", diff --git a/S65_driving_planning.ipynb b/S65_driving_planning.ipynb index b662dde2..e74194d1 100644 --- a/S65_driving_planning.ipynb +++ b/S65_driving_planning.ipynb @@ -84,7 +84,7 @@ "In these problems, we used probability theory to quantify uncertainty,\n", "and developed policies to maximize the expected benefit (or to minimize the expected cost)\n", "of executing actions in a given state.\n", - "In contrast, for the differential drive robot (DDR), we considered the purely geometric\n", + "In contrast, for the differential-drive robot (DDR), we considered the purely geometric\n", "problem of planning collision-free paths. Even though this problem did not consider uncertainty,\n", "the computational complexity of the problem precludes exact, complete solutions for all\n", "but the simplest problems, leading to the introduction of sampling-based methods.\n", diff --git a/S67_driving_summary.ipynb b/S67_driving_summary.ipynb index cd5f7fca..3adf11bd 100644 --- a/S67_driving_summary.ipynb +++ b/S67_driving_summary.ipynb @@ -122,7 +122,7 @@ "Having developed homogeneous transformations to represent pose, \n", "we turned our attention to differential kinematics for a car-like robot.\n", "Previously we derived the relationship between wheel angular velocity and the resulting velocity (linear and angular) for\n", - "a differential drive robot.\n", + "a differential-drive robot.\n", "For car-like systems, we prefer to compute the linear and angular velocities of the robot\n", "with respect to the world coordinate frame as a function of the rate of change in the steering angle and of the robot's linear velocity (expressed in the body-attached frame).\n", "This is a more natural choice than using wheel speed, since the control input for a car-like robot is often\n", diff --git a/S72_drone_actions.ipynb b/S72_drone_actions.ipynb index 41154896..f7062281 100644 --- a/S72_drone_actions.ipynb +++ b/S72_drone_actions.ipynb @@ -752,7 +752,7 @@ "\\end{bmatrix}\n", "\\end{equation}\n", "where $l$ is the distance from the rotors to the center of mass, and $\\kappa$ is a torque constant.\n", - "From this equation you can see that a quadrotor is like a differential drive robot but with *two* differential axes. Controlling it is easy in principle: *tilt, then move*. A part of $F^b_z$ will be used to keep the quadrotor flying, the other part will be used to overcome drag and attain a constant velocity in the chosen direction.\n", + "From this equation you can see that a quadrotor is like a differential-drive robot but with *two* differential axes. Controlling it is easy in principle: *tilt, then move*. 
A part of $F^b_z$ will be used to keep the quadrotor flying, the other part will be used to overcome drag and attain a constant velocity in the chosen direction.\n", "\n", "The final piece of the puzzle is how exactly the rotors generate the individual forces $f_i$. We will not discuss that in detail here, but suffice to say that the forces $f_i$ are well understood functions of the rotor speed. Each motor also generates a small torque $\\kappa f_i$ around it's rotor axis, proportional to the force with proportionality constant $\\kappa$. The rotation direction for each rotor determines which sign to use to calculate the $z$-component of the body torque $\\tau^b$.\n", "\n", From 7fb23aa6944c8c5f19adb0fd83a18fd3d2856d0a Mon Sep 17 00:00:00 2001 From: Frank Dellaert Date: Sun, 9 Feb 2025 13:58:57 -0500 Subject: [PATCH 3/9] Section 5.1 --- S51_diffdrive_state.ipynb | 83 +++++++++++++++++++++------------------ 1 file changed, 45 insertions(+), 38 deletions(-) diff --git a/S51_diffdrive_state.ipynb b/S51_diffdrive_state.ipynb index 9d46a9df..367a0ea6 100644 --- a/S51_diffdrive_state.ipynb +++ b/S51_diffdrive_state.ipynb @@ -5,7 +5,7 @@ "id": "ghLEweEVYMmD", "metadata": {}, "source": [ - "# State Space for a Differential Drive Robot" + "# State Space for a differential-drive robot" ] }, { @@ -69,7 +69,7 @@ "```{index} state; configuration space\n", "```\n", "\n", - "> Unlike robots with omni-directional wheels, the orientation of a differential drive robot matters." + "> Unlike robots with omni-directional wheels, the orientation of a differential-drive robot matters." ] }, { @@ -81,7 +81,7 @@ ] }, "source": [ - "\"Splash" + "\"Splash" ] }, { @@ -89,19 +89,21 @@ "id": "mjHixibVjIOR", "metadata": {}, "source": [ + "```{index} pair: differential-drive robot; DDR\n", + "```\n", + "
\n", + "\"\"\n", + "
Two views of the Duckiebot platform. Note the two actuated wheels in the front of the robot, and the castor wheel in the back.
\n", + "
\n", + "\n", "So far we have seen two kinds of state space: discrete state spaces (the categories of trash, the rooms in a house)\n", "and continuous state spaces that were equivalent to $\\mathbb{R}^2$ (the position of a logistics robot in a warehouse).\n", "In this chapter, we consider a robot whose state space includes the orientation of the robot as well as its position.\n", - "In particular, we consider differential drive robots (DDRs), such as the Duckiebot shown below.\n", + "In particular, we consider differential-drive robots (DDRs), such as the Duckiebot shown in Figure [1](#fig:duckiebot).\n", "DDRs have two actuated wheels that share a common axis of rotation, and typically have a castor wheel in the back\n", "to stabilize the robot (without this castor wheel, the DDR would essentially be equivalent to a Segway two-wheeled scooter).\n", "Unlike robots with omni-directional wheels, DDRs cannot move in the direction parallel to the wheel axis -- they can only move\n", - "in the steering direction. Because of this, it is necessary to include the orientation of the robot in its state description.\n", - "\n", - "
\n", - "\"\"\n", - "
Two views of the Duckiebot platform. Note the two actuated wheels in the front of the robot, and the castor wheel in the back.
\n", - "
" + "in the steering direction. Because of this, it is necessary to include the orientation of the robot in its state description." ] }, { @@ -109,20 +111,22 @@ "id": "F4pYeCN49Kv7", "metadata": {}, "source": [ + "
\n", + "\"\"\n", + "
A Coordinate frame that is rigidly attached to a DDR.
\n", + "
\n", + "\n", + "```{index} body-attached frame\n", + "```\n", "Representing the state of the logistics robot was straightforward, we merely used the x- and y-coordinates of the robot's center of mass\n", "(in the case of a robot with circular shape, the center of this circle).\n", "Representing orientation is slightly more complex, and cannot be accomplished by merely encoding properties of a single point on the robot.\n", "Instead, we rigidly attach a coordinate frame to the robot, and define the robot state by the position of the origin of this\n", "frame and the orientation of the frame with repsect to the world frame.\n", - "We refer to the robot's frame as the body-attached frame, or merely the robot frame.\n", - "For DDR robots, it is typical to place the origin of the body-attached frame at the midpoint between the two wheels, and to align its\n", + "We refer to the robot's frame as the *body-attached frame*, or merely the robot frame.\n", + "For DDRs, it is typical to place the origin of the body-attached frame at the midpoint between the two wheels, and to align its\n", "x-axis with the forward steering direction. The y-axis is coincident with the axis of wheel rotation.\n", - "This frame is illustrated in the figure below.\n", - "\n", - "
\n", - "\"\"\n", - "
A Coordinate frame that is rigidly attached to a DDR.
\n", - "
\n" + "This frame is illustrated in Figure [2](#fig:DDR-coordinate-frame)." ] }, { @@ -146,7 +150,13 @@ "id": "qUYBzGvNnt2j", "metadata": {}, "source": [ - "As an example, consider the problem of determining the x-y position of the wheel centers for our DDR.\n", + "
\n", + "\"\"\n", + "
Determining the position of the wheel centers.
\n", + "
\n", + "\n", + "As an example, consider the problem of determining the x-y position of the wheel centers\n", + "for the DDR shown in Figure [3](#fig:DDR-wheel-centers).\n", "If the wheelbase (i.e., the distance between the two wheel centers) is denoted by $L$,\n", "and the robot is in configuration $q=(x,y.\\theta)$,\n", "then the x-y coordinates of the left and right wheel centers are given by\n", @@ -159,12 +169,7 @@ "\\left[ \\begin{array}{c} x_{\\mathrm{right}} \\\\ y_{\\mathrm{right}} \\end{array}\\right]\n", "=\n", "\\left[ \\begin{array}{c} x + \\frac{L}{2} \\sin \\theta \\\\ \\ \\\\ y - \\frac{L}{2} \\cos \\theta \\end{array}\\right]\n", - "\\end{equation}\n", - "\n", - "
\n", - "\"\"\n", - "
Determining the position of the wheel centers.
\n", - "
" + "\\end{equation}" ] }, { @@ -172,8 +177,13 @@ "id": "gmjDBptZ2m_L", "metadata": {}, "source": [ - "We can apply the same geometric analysis to any point on the robot. Consider a point $p$ that is\n", - "rigidly attached to the robot.\n", + "
\n", + "\"\"\n", + "
A Coordinate frame that is rigidly attached to a DDR.
\n", + "
\n", + "\n", + "We can apply the same geometric analysis to any point on the robot.\n", + "Consider a point $p$ that is rigidly attached to the robot, as shown in Figure [4](#fig:DDR-arbitrary-point).\n", "We can define the coordinates of $p$ with respect to the body-attached frame as\n", "$p^{\\mathrm{body}} = [p_x, p_y]^T$.\n", "If the robot is in configuration $q=(x,y.\\theta)$,\n", @@ -185,12 +195,7 @@ "x +p_x \\cos \\theta - p_y \\sin \\theta \\\\ \n", "y +p_x \\sin \\theta + p_y \\cos \\theta\n", " \\end{array}\\right]\n", - "\\end{equation}\n", - "\n", - "
\n", - "\"\"\n", - "
A Coordinate frame that is rigidly attached to a DDR.
\n", - "
\n" + "\\end{equation}" ] }, { @@ -226,7 +231,7 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": null, "id": "TQuWx3jaQch1", "metadata": {}, "outputs": [ @@ -241,7 +246,7 @@ ], "source": [ "pose = gtsam.Pose2(12.4, 42.5, math.radians(45))\n", - "print(f\"pose: {pose}with x={pose.x()}, y={pose.y()}, theta={pose.theta()}\")\n" + "print(f\"pose: {pose}with x={pose.x()}, y={pose.y()}, theta={pose.theta()}\")" ] }, { @@ -249,12 +254,13 @@ "id": "iYyqVKbscNBS", "metadata": {}, "source": [ + "{raw:tex}`\\noindent`\n", "Note that internally we represent poses using radians, hence the ugly looking number above. Often, it makes sense to specify *and* display angles in degrees, which makes specifying poses and debugging code easier. Hence, we also provide a \"pretty\" version that does the conversion for us:" ] }, { "cell_type": "code", - "execution_count": 8, + "execution_count": null, "id": "GTVL3B859N73", "metadata": {}, "outputs": [ @@ -273,7 +279,7 @@ } ], "source": [ - "pretty(pose)\n" + "pretty(pose)" ] }, { @@ -281,6 +287,7 @@ "id": "DNInTYxbC3vq", "metadata": {}, "source": [ + "{raw:tex}`\\noindent`\n", "Using a real number $\\theta$ to represent orientation, while convenient and familiar, is *not* ideal as numbers that are offset by 360 degrees represent the same orientation. In other words, there is *not* a one-to-one relationship between orientation and its representation as a float value. Hence, internally GTSAM stores the orientation as *two* numbers, the unit vector $(\\cos\\theta,\\sin\\theta)$, which *is* a unique representation." ] }, @@ -301,7 +308,7 @@ "id": "qOTrnD8Vqy1a", "metadata": {}, "source": [ - "A sampling-based representation is *much* easier: we just add a value $\\theta$ to each sample, or, even better, uses `gtsam.Pose2` samples." + "A sampling-based representation is *much* easier: we just add a value $\\theta$ to each sample, or, even better, uses `Pose2` samples." ] }, { From c74ca81af1e2fdd01f8dc9c0bceb6d2f4a803b62 Mon Sep 17 00:00:00 2001 From: Frank Dellaert Date: Sun, 9 Feb 2025 13:59:55 -0500 Subject: [PATCH 4/9] Remove gtsam namespace --- S21_sorter_state.ipynb | 2 +- S23_sorter_sensing.ipynb | 2 +- S24_sorter_perception.ipynb | 6 +++--- S26_sorter_learning.ipynb | 2 +- S31_vacuum_state.ipynb | 2 +- S32_vacuum_actions.ipynb | 8 ++++---- S33_vacuum_sensing.ipynb | 4 ++-- S41_logistics_state.ipynb | 2 +- S42_logistics_actions.ipynb | 6 +++--- S44_logistics_perception.ipynb | 2 +- S53_diffdrive_sensing.ipynb | 2 +- S61_driving_state.ipynb | 2 +- S63_driving_sensing.ipynb | 2 +- S64_driving_perception.ipynb | 6 +++--- S71_drone_state.ipynb | 2 +- S73_drone_sensing.ipynb | 2 +- S75_drone_planning.ipynb | 6 +++--- 17 files changed, 29 insertions(+), 29 deletions(-) diff --git a/S21_sorter_state.ipynb b/S21_sorter_state.ipynb index b7ad74dd..9cb5f56a 100644 --- a/S21_sorter_state.ipynb +++ b/S21_sorter_state.ipynb @@ -914,7 +914,7 @@ "id": "IKt2DaIm1Brr", "metadata": {}, "source": [ - "Above we created an instance of the `gtsam.DiscreteDistribution` class. As with any GTSAM class, you can type\n", + "Above we created an instance of the `DiscreteDistribution` class. 
As with any GTSAM class, you can type\n", "```python\n", "help(gtsam.DiscreteDistribution)\n", "```\n", diff --git a/S23_sorter_sensing.ipynb b/S23_sorter_sensing.ipynb index c0c59b6c..3ba680fd 100644 --- a/S23_sorter_sensing.ipynb +++ b/S23_sorter_sensing.ipynb @@ -712,7 +712,7 @@ "id": "tHvu2cyxbopL", "metadata": {}, "source": [ - "Above we created an instance of the `gtsam.DiscreteConditional` class. As with any GTSAM class, you can type\n", + "Above we created an instance of the `DiscreteConditional` class. As with any GTSAM class, you can type\n", "\n", "```python\n", "help(gtsam.DiscreteConditional)\n", diff --git a/S24_sorter_perception.ipynb b/S24_sorter_perception.ipynb index 4d154abc..2d9eea3e 100644 --- a/S24_sorter_perception.ipynb +++ b/S24_sorter_perception.ipynb @@ -1536,7 +1536,7 @@ "source": [ "### Factors\n", "\n", - "Above we created an instance of the `gtsam.DecisionTreeFactor` class. As with any GTSAM class, you can type\n", + "Above we created an instance of the `DecisionTreeFactor` class. As with any GTSAM class, you can type\n", "\n", "```python\n", "help(gtsam.DecisionTreeFactor)\n", @@ -1575,7 +1575,7 @@ "id": "3mAh1FdQ0xrA", "metadata": {}, "source": [ - "The factors we created above are of type `gtsam.DecisionTreeFactor`, which are stored as decision trees:" + "The factors we created above are of type `DecisionTreeFactor`, which are stored as decision trees:" ] }, { @@ -1677,7 +1677,7 @@ } ], "source": [ - "#| caption: Decision tree in a `gtsam.DecisionTreeFactor`.\n", + "#| caption: Decision tree in a `DecisionTreeFactor`.\n", "#| label: fig:decision_tree_factor\n", "show(conductivity_false_factor)" ] diff --git a/S26_sorter_learning.ipynb b/S26_sorter_learning.ipynb index ff52c4c9..6d814526 100644 --- a/S26_sorter_learning.ipynb +++ b/S26_sorter_learning.ipynb @@ -573,7 +573,7 @@ } }, "source": [ - "A `gtsam.DiscreteConditional` determines the counts, grouped by the conditioning variable. In our case, `Category` can take on 5 separate values, and hence we have five groups. For example, for a binary sensor:\n" + "A `DiscreteConditional` determines the counts, grouped by the conditioning variable. In our case, `Category` can take on 5 separate values, and hence we have five groups. For example, for a binary sensor:\n" ] }, { diff --git a/S31_vacuum_state.ipynb b/S31_vacuum_state.ipynb index 537c0256..7fe20977 100644 --- a/S31_vacuum_state.ipynb +++ b/S31_vacuum_state.ipynb @@ -369,7 +369,7 @@ "id": "2OOSTBL1a4sV", "metadata": {}, "source": [ - "When we print the results, we see that we now get a dictionary of `DiscreteKeys`, i.e., integer tuples of the form *(`Key`, cardinality)*. However, the \"keys\" now seem to be very large integers. This is because for series of variables we use the `gtsam.Symbol` type, composed of a single character and an integer index:" + "When we print the results, we see that we now get a dictionary of `DiscreteKeys`, i.e., integer tuples of the form *(`Key`, cardinality)*. However, the \"keys\" now seem to be very large integers. 
This is because for series of variables we use the `Symbol` type, composed of a single character and an integer index:" ] }, { diff --git a/S32_vacuum_actions.ipynb b/S32_vacuum_actions.ipynb index 28372e48..bb1d792f 100644 --- a/S32_vacuum_actions.ipynb +++ b/S32_vacuum_actions.ipynb @@ -1419,7 +1419,7 @@ "\n", "> The GTSAM concepts used in this section, explained.\n", "\n", - "As in Chapter 2, we once again used a `gtsam.DiscreteConditional`, this time to specify a motion model for the controlled Markov chain above, as shown in Figure [4](#vacuum-motion-model)." + "As in Chapter 2, we once again used a `DiscreteConditional`, this time to specify a motion model for the controlled Markov chain above, as shown in Figure [4](#vacuum-motion-model)." ] }, { @@ -1427,9 +1427,9 @@ "id": "1GGXFMgV1VUb", "metadata": {}, "source": [ - "To specify the motion model, we used the `gtsam.DiscreteBayesNet` class, and in particular these methods:\n", + "To specify the motion model, we used the `DiscreteBayesNet` class, and in particular these methods:\n", "\n", - "- `add(self:, key: Tuple[int, int], parents: List[Tuple[int, int]], spec: str) -> None`: adds a conditional with the same arguments as the `gtsam.DiscreteConditional` constructor.\n", + "- `add(self:, key: Tuple[int, int], parents: List[Tuple[int, int]], spec: str) -> None`: adds a conditional with the same arguments as the `DiscreteConditional` constructor.\n", "- `at(self, i: int) -> gtsam.DiscreteConditional`: retrieves the $i^{th}$ conditional added." ] }, @@ -1522,7 +1522,7 @@ "id": "22gJu5XkegOn", "metadata": {}, "source": [ - "Finally, a word about the graphs above. You might wonder, why these graphs come out so beautifully positioned, e.g., to indicate time from left to right. This was accomplished with the `hints` argument, which positions variables series at an appropriate height. Similarly, the `boxes` argument (which takes `gtsam.Keys`, not tuples) indicates which variables should considered as given.\n", + "Finally, a word about the graphs above. You might wonder, why these graphs come out so beautifully positioned, e.g., to indicate time from left to right. This was accomplished with the `hints` argument, which positions variables series at an appropriate height. Similarly, the `boxes` argument (which takes `Keys`, not tuples) indicates which variables should considered as given.\n", "\n", "These arguments are handled in the `gtbook` library {cite:p}`gtbook`, and are passed on in the appropriate format to the underlying GTSAM `dot` methods, which generate graphviz-style graphs{cite:p}`graphviz`." ] diff --git a/S33_vacuum_sensing.ipynb b/S33_vacuum_sensing.ipynb index 79b923ac..75edd92c 100644 --- a/S33_vacuum_sensing.ipynb +++ b/S33_vacuum_sensing.ipynb @@ -647,7 +647,7 @@ "id": "HlzAWlJNSilC", "metadata": {}, "source": [ - "We use the `gtsam.DiscreteBayesNet` method `sample`, with signature\n", + "We use the `DiscreteBayesNet` method `sample`, with signature\n", "\n", "```python\n", " sample(self, given: gtsam::DiscreteValues) -> gtsam::DiscreteValues\n", @@ -660,7 +660,7 @@ "metadata": {}, "source": [ "It implements ancestral sampling, but does assume that the Bayes net is reverse topologically sorted, i.e. last\n", - "conditional will be sampled first. In addition, it can optionally take an assignment for certain *given* variables, as a `gtsam.DiscreteValues` instance.\n", + "conditional will be sampled first. 
In addition, it can optionally take an assignment for certain *given* variables, as a `DiscreteValues` instance.\n", "In that case, it is also assumed that the Bayes net does not contain any conditionals for the given values.\n", "We used this functionality to pass the given action sequence above." ] diff --git a/S41_logistics_state.ipynb b/S41_logistics_state.ipynb index c0337ef9..384e86fd 100644 --- a/S41_logistics_state.ipynb +++ b/S41_logistics_state.ipynb @@ -472,7 +472,7 @@ "\n", "> The GTSAM concepts used in this section, explained.\n", "\n", - "We really used only one concept from GTSAM above, which is `gtsam.Point2`. For maximal compatibility with numpy, in python this is just a function that creates a 2D, float numpy array. Inside GTSAM, it is represented as an Eigen vector, where Eigen is the C++ equivalent of numpy." + "We really used only one concept from GTSAM above, which is `Point2`. For maximal compatibility with numpy, in python this is just a function that creates a 2D, float numpy array. Inside GTSAM, it is represented as an Eigen vector, where Eigen is the C++ equivalent of numpy." ] }, { diff --git a/S42_logistics_actions.ipynb b/S42_logistics_actions.ipynb index 331f8ec7..f1c2b827 100644 --- a/S42_logistics_actions.ipynb +++ b/S42_logistics_actions.ipynb @@ -912,14 +912,14 @@ "id": "EzTs7UlNSH3r", "metadata": {}, "source": [ - "A `gtsam.GaussianDensity` class can be constructed via the following named constructor:\n", + "A `GaussianDensity` class can be constructed via the following named constructor:\n", "\n", "```python\n", "FromMeanAndStddev(key: gtsam.Key, mean: np.array, sigma: float) -> gtsam.GaussianDensity\n", "```\n", "\n", "{raw:tex}`\\noindent`\n", - "and two similar named constructors exists for `gtsam.GaussianConditional`:\n", + "and two similar named constructors exists for `GaussianConditional`:\n", "\n", "```python\n", "- FromMeanAndStddev(key: gtsam.Key, A: np.array, parent: gtsam.Key, b: numpy.ndarray[numpy.float64[m, 1]], sigma: float) -> gtsam.GaussianConditional\n", @@ -976,7 +976,7 @@ "id": "Xlcqd_5WJoBS", "metadata": {}, "source": [ - "In the above, all error functions take an instance of `gtsam.VectorValues`, which is simply a map from GTSAM keys to values as vectors. This is the equivalent of `gtsam.DiscreteValues` from the previous sections." + "In the above, all error functions take an instance of `VectorValues`, which is simply a map from GTSAM keys to values as vectors. This is the equivalent of `DiscreteValues` from the previous sections." ] } ], diff --git a/S44_logistics_perception.ipynb b/S44_logistics_perception.ipynb index 231baaae..5a48fafb 100644 --- a/S44_logistics_perception.ipynb +++ b/S44_logistics_perception.ipynb @@ -225,7 +225,7 @@ "id": "Du5R3YoGb21M", "metadata": {}, "source": [ - "Note above we used a `gtsam.VectorValues` to store the ground truth trajectory, which will come in handy again when we simulate the measurements. In Figure [2](#fig:logistics-ground-truth) we show this ground truth trajectory overlaid on the warehouse map we introduced before." + "Note above we used a `VectorValues` to store the ground truth trajectory, which will come in handy again when we simulate the measurements. In Figure [2](#fig:logistics-ground-truth) we show this ground truth trajectory overlaid on the warehouse map we introduced before." 
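To make this concrete, here is a minimal sketch of sampling with a *given* assignment. The keys, cardinalities, and conditional specification below are invented for illustration, and we assume the `DiscreteValues` wrapper supports item assignment as in the GTSAM examples.

```python
import gtsam

# One given variable A and one sampled variable X, both binary.
A = (0, 2)  # (Key, cardinality)
X = (1, 2)

bayes_net = gtsam.DiscreteBayesNet()
bayes_net.add(X, [A], "9/1 2/8")  # P(X|A); note there is *no* conditional on A itself

given = gtsam.DiscreteValues()
given[A[0]] = 1                   # fix A = 1
sample = bayes_net.sample(given)  # ancestral sampling, with A treated as given
print(sample)
```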
] }, { diff --git a/S53_diffdrive_sensing.ipynb b/S53_diffdrive_sensing.ipynb index 3e8d006d..33c1bff6 100644 --- a/S53_diffdrive_sensing.ipynb +++ b/S53_diffdrive_sensing.ipynb @@ -472,7 +472,7 @@ "\n", "> Everything above and more.\n", "\n", - "In GTSAM you have access to several calibration models, with the simple one above corresponding to `gtsam.Cal3_S2`:" + "In GTSAM you have access to several calibration models, with the simple one above corresponding to `Cal3_S2`:" ] }, { diff --git a/S61_driving_state.ipynb b/S61_driving_state.ipynb index ae6ac70b..4cc97653 100644 --- a/S61_driving_state.ipynb +++ b/S61_driving_state.ipynb @@ -762,7 +762,7 @@ "id": "919LyWTuLA6u", "metadata": {}, "source": [ - "Using a 2D position and a 2D rotation we can create a 2D pose, represented by a `gtsam.Pose2`.As always, you can execute `help(gtsam.Pose2)` to get the full documentation of a class. Below is an excerpt with some useful methods. We have several constructors:\n", + "Using a 2D position and a 2D rotation we can create a 2D pose, represented by a `Pose2`.As always, you can execute `help(gtsam.Pose2)` to get the full documentation of a class. Below is an excerpt with some useful methods. We have several constructors:\n", "\n", "```python\n", "__init__(...)\n", diff --git a/S63_driving_sensing.ipynb b/S63_driving_sensing.ipynb index 27483cfb..cbcb02fd 100644 --- a/S63_driving_sensing.ipynb +++ b/S63_driving_sensing.ipynb @@ -319,7 +319,7 @@ "\\end{equation}\n", "with transformed plane parameters $\\hat{n}^b \\doteq (R^w_b)^T \\hat{n}^w$ and $d^b \\doteq d^w - \\hat{n}^w \\cdot t^w_b$.\n", "\n", - "We can use a `gtsam.Pose2` or `gtsam.Pose3` object to specify the body frame, respectively in 2D or 3D, and then use it to transform plane coordinates:" + "We can use a `Pose2` or `Pose3` object to specify the body frame, respectively in 2D or 3D, and then use it to transform plane coordinates:" ] }, { diff --git a/S64_driving_perception.ipynb b/S64_driving_perception.ipynb index 7a2b3f34..d08c6502 100644 --- a/S64_driving_perception.ipynb +++ b/S64_driving_perception.ipynb @@ -646,7 +646,7 @@ "id": "7WvObRdMv2-3", "metadata": {}, "source": [ - "Before we can optimize, we need to create an initial estimate. In GTSAM, this is done via the `gtsam.Values` type:" + "Before we can optimize, we need to create an initial estimate. In GTSAM, this is done via the `Values` type:" ] }, { @@ -871,7 +871,7 @@ "```\n", "We can also inspect the result graphically. Looking at the result as printed above only gets us so far, and more importantly, it only shows us the maximum a posteriori (MAP) solution, but not the uncertainty around it. Luckily, GTSAM can also compute the **posterior marginals**, which show the uncertainty on each recovered pose as a Gaussian density $P(T_i|Z)$, taking into account all the measurements $Z$.\n", "\n", - "In code, we do this via the `gtsam.Marginals` object, and we can plot marginals with a special function `plot_pose2`:" + "In code, we do this via the `Marginals` object, and we can plot marginals with a special function `plot_pose2`:" ] }, { @@ -4551,7 +4551,7 @@ "- There are about 20 landmarks, some of which are seen briefly, while others are seen for longer periods of time.\n", "- The graph is very sparsely connected, and hence optimization will still be quite fast.\n", "\n", - "Optimizing with `gtsam.LevenbergMarquardtOptimizer`, again..." + "Optimizing with `LevenbergMarquardtOptimizer`, again..." 
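The fragment below is a small sketch of how the named constructors and the `VectorValues` container fit together; the keys and numbers are arbitrary example values, and we assume the inherited `error` method is exposed in your GTSAM build.

```python
import gtsam

x1, x2 = 1, 2  # any integer keys will do for this illustration

# Store a (tiny) ground-truth trajectory of 2D positions:
values = gtsam.VectorValues()
values.insert(x1, gtsam.Point2(20, 10))
values.insert(x2, gtsam.Point2(21, 10))

# A Gaussian density on x1, built with the named constructor quoted above:
density = gtsam.GaussianDensity.FromMeanAndStddev(x1, gtsam.Point2(20, 10), 0.5)

# Error functions take a VectorValues; at the mean, the weighted squared error is zero.
print(density.error(values))
```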
] }, { diff --git a/S71_drone_state.ipynb b/S71_drone_state.ipynb index 9ba8e538..78b01dbd 100644 --- a/S71_drone_state.ipynb +++ b/S71_drone_state.ipynb @@ -558,7 +558,7 @@ "id": "AQ1qWXnQp6Qd", "metadata": {}, "source": [ - "Using a 3D position and a 3D rotation we can create a 3D pose, represented by a `gtsam.Pose3`. As always, you can execute `help(gtsam.Pose3)` to get the full documentation of a class. Below is an excerpt with some useful methods. We have several constructors:\n", + "Using a 3D position and a 3D rotation we can create a 3D pose, represented by a `Pose3`. As always, you can execute `help(gtsam.Pose3)` to get the full documentation of a class. Below is an excerpt with some useful methods. We have several constructors:\n", "\n", "```python\n", " __init__(...)\n", diff --git a/S73_drone_sensing.ipynb b/S73_drone_sensing.ipynb index 07aa29eb..63cd40b1 100644 --- a/S73_drone_sensing.ipynb +++ b/S73_drone_sensing.ipynb @@ -237,7 +237,7 @@ "id": "QT-8dEgxUKpG", "metadata": {}, "source": [ - "To specify the *orientation* $R^b_c$ for each of the cameras, we need to remember that (a) the $z$-axis points into the scene, and (b) the $y$-axis points down. The easiest way to specify this is by using the `gtsam.Rot3` constructor that takes three column vectors:" + "To specify the *orientation* $R^b_c$ for each of the cameras, we need to remember that (a) the $z$-axis points into the scene, and (b) the $y$-axis points down. The easiest way to specify this is by using the `Rot3` constructor that takes three column vectors:" ] }, { diff --git a/S75_drone_planning.ipynb b/S75_drone_planning.ipynb index 6aea7ac2..174f77fa 100644 --- a/S75_drone_planning.ipynb +++ b/S75_drone_planning.ipynb @@ -460,7 +460,7 @@ "source": [ "## Factor Graphs for Trajectory Optimization\n", "\n", - "Now that we can evaluate cost and its derivatives at any location, we can create factors. Since occupancy maps or cost maps are not built into GTSAM, we use its facility to create a custom factor from arbitrary python code. A `gtsam.CustomFactor` just has a constructor that can take an arbitrary python callback function, along with a `Key` (and a noise model). At evaluation, the callback function gets a handle to the factor and a `gtsam.Values` object, and it does three things:\n", + "Now that we can evaluate cost and its derivatives at any location, we can create factors. Since occupancy maps or cost maps are not built into GTSAM, we use its facility to create a custom factor from arbitrary python code. A `CustomFactor` just has a constructor that can take an arbitrary python callback function, along with a `Key` (and a noise model). At evaluation, the callback function gets a handle to the factor and a `Values` object, and it does three things:\n", "\n", "- check which variable is involved, by asking the factor;\n", "- with that key, extract the current estimate from the passed in `Values`;\n", @@ -548,7 +548,7 @@ "id": "af1kqxLmvZdt", "metadata": {}, "source": [ - "Plotting this path on the cost map shows that, in this instance, we are not lucky, however: the path definitely passes through some obstacles. We can immediately create all obstacle cost factors now, and evaluate the cost to confirm that this is not a zero-cost path. Since these factors are nonlinear, we store them in a `gtsam.NonlinearFactorGraph`:" + "Plotting this path on the cost map shows that, in this instance, we are not lucky, however: the path definitely passes through some obstacles. 
We can immediately create all obstacle cost factors now, and evaluate the cost to confirm that this is not a zero-cost path. Since these factors are nonlinear, we store them in a `NonlinearFactorGraph`:" ] }, { @@ -608,7 +608,7 @@ "id": "4RCPWR4x0mOm", "metadata": {}, "source": [ - "A complete path/trajectory optimization problem needs more than just obstacle factors, however. We also need to add start and goal factors, as well as smoothness factors. Smoothness factors are implemented using a `gtsam.BetweenFactor`, \n", + "A complete path/trajectory optimization problem needs more than just obstacle factors, however. We also need to add start and goal factors, as well as smoothness factors. Smoothness factors are implemented using a `BetweenFactor`, \n", "and these help to ensure that the distance between successive points on the path is not too great.\n", "\n", "Start and goal factors impose penalties when the starting point, $X_1$, is not near to the current position,\n", From e95197428e74e4c4231973aedd0dbe543c1e7057 Mon Sep 17 00:00:00 2001 From: Frank Dellaert Date: Sun, 9 Feb 2025 14:25:55 -0500 Subject: [PATCH 5/9] Sections 5.1-5.2 --- S44_logistics_perception.ipynb | 2 +- S51_diffdrive_state.ipynb | 2 +- S52_diffdrive_actions.ipynb | 128 ++++++++++++++++++++------------- 3 files changed, 80 insertions(+), 52 deletions(-) diff --git a/S44_logistics_perception.ipynb b/S44_logistics_perception.ipynb index 5a48fafb..cb5117c1 100644 --- a/S44_logistics_perception.ipynb +++ b/S44_logistics_perception.ipynb @@ -585,7 +585,7 @@ "\n", "```{index} particle filter\n", "```\n", - "```{index} pair: MCL; Monte Carlo Localization\n", + "```{index} pair: MCL; Monte Carlo localization\n", "```\n", "The above finite element discretization of space is very costly, and most of the memory and computation is used to compute near-zero probabilities. \n", "While there *are* ways to deal with this, switching to a sampling-based representation gets us more bang for the buck computation-wise. And, as we will see, it also leads to a very simple algorithm.\n", diff --git a/S51_diffdrive_state.ipynb b/S51_diffdrive_state.ipynb index 367a0ea6..dc6c8a3a 100644 --- a/S51_diffdrive_state.ipynb +++ b/S51_diffdrive_state.ipynb @@ -300,7 +300,7 @@ "\n", "> We also need to think about probability densities over poses. \n", "\n", - "It is conceptually easy to extend the *finite element* approximation to include orientation: just discretize $\\theta$ using some chosen resolution, e.g., one bin for every 5 degrees. However, one thing to keep in mind is that angles *wrap*. Hence, the topology of the \"map\" in the orientation dimension is like a torus." + "It is conceptually easy to extend the *finite element* approximation to include orientation: just discretize $\\theta$ using some chosen resolution, e.g., one bin for every 5 degrees. However, one thing to keep in mind is that angles *wrap*. Hence, the topology of the \"map\" in the orientation dimension is like a circle." 
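A short sketch shows one way to implement such a discretization with wrap-around; the 72 bins below correspond to the 5-degree resolution mentioned above, and the helper function is ours.

```python
import math

def orientation_bin(theta, num_bins=72):
    """Map an angle in radians to a bin index, so that theta and theta + 2*pi share a bin."""
    wrapped = theta % (2 * math.pi)  # wrap into [0, 2*pi)
    return int(wrapped / (2 * math.pi) * num_bins)

print(orientation_bin(math.radians(3)))        # 0: first 5-degree bin
print(orientation_bin(math.radians(3 + 360)))  # 0: wraps around to the same bin
print(orientation_bin(math.radians(-3)))       # 71: just below 360 degrees
```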
] }, { diff --git a/S52_diffdrive_actions.ipynb b/S52_diffdrive_actions.ipynb index 1bcf1913..dda19807 100644 --- a/S52_diffdrive_actions.ipynb +++ b/S52_diffdrive_actions.ipynb @@ -23,7 +23,7 @@ }, { "cell_type": "code", - "execution_count": 1, + "execution_count": null, "id": "b6FbWLqdiOWi", "metadata": { "tags": [ @@ -40,7 +40,7 @@ } ], "source": [ - "%pip install -q -U gtbook\n" + "%pip install -q -U gtbook" ] }, { @@ -93,15 +93,20 @@ "we were able to simply ignore the body-attached frame, and reason directly in the world frame without difficulty.\n", "Things are more complex for our DDR, due to the role of orientation.\n", "\n", + "
\n", + "\"\"\n", + "
The linear velocity is always in the steering direction.
\n", + "
\n", + "\n", "When describing the motion of our DDR, the orientation of the robot enters in two ways.\n", "First, because the robot wheels roll without slipping,\n", "the linear velocity of the robot is always instantaneously in the steering direction.\n", "Second, because the robot can rotate, we must take account of its angular velocity, in addition to the linear velocity.\n", - "This is illustrated in the figure below.\n", + "This is illustrated in Figure [1](#fig:DDR-velocity).\n", "Suppose the robot is following a path $\\gamma(s)$ (where $s$ parameterizes the path).\n", "The instantaneous linear velocity expressed with respect to the body frame is given by:\n", "\\begin{equation}\n", - "v^{\\mathrm{body,linear}}\n", + "v^{\\mathrm{body,linear}}=\n", "\\begin{bmatrix} v_x \\\\ 0 \\end{bmatrix}\n", "\\end{equation}\n", "Note that the velocity is tangent to the curve $\\gamma$ at $s$, and that in the body-attached frame the y-component of the velocity\n", @@ -109,26 +114,33 @@ "The steering direction is determined by the angle $\\theta$ as $[\\cos \\theta, \\sin \\theta]^T$,\n", "so that the linear velocity with respect to the world frame is given by\n", "\\begin{equation}\n", - "v^{\\mathrm{world,linear}}\n", + "v^{\\mathrm{world,linear}}=\n", "\\begin{bmatrix} v_x \\cos \\theta \\\\ v_x \\sin \\theta \\end{bmatrix}\n", "\\end{equation}\n", "Because our robot moves in the plane, the z-axis of the body-attached frame is always parallel to the z-axis of the world frame.\n", "This greatly simplifies the description of angular velocity, which in this case we may define as $\\omega = \\dot{\\theta}$,\n", - "the instantaneous rate of change of the robot's orientation.\n", - "\n", - "It is common to combine the angular and linear velocity into a single vector, \n", + "the instantaneous rate of change of the robot's orientation." + ] + }, + { + "cell_type": "markdown", + "id": "82964258", + "metadata": {}, + "source": [ + "It is common to combine the angular and linear velocity into a single vector, either in the body frame,\n", "\\begin{equation}\n", "v^{\\mathrm{body}}=\n", - "\\begin{bmatrix} v_x \\\\ 0 \\\\ \\dot{\\theta} \\end{bmatrix}\n", - "~~~~~~~\n", - "v^{\\mathrm{world}}=\n", - "\\begin{bmatrix} v_x \\cos \\theta \\\\ v_x \\sin \\theta \\\\ \\dot{\\theta} \\end{bmatrix}\n", + "\\begin{bmatrix}\n", + "v_x \\\\ 0 \\\\ \\dot{\\theta} \n", + "\\end{bmatrix},\n", "\\end{equation}\n", - "\n", - "
\n", - "\"\"\n", - "
The linear velocity is always in the steering direction.
\n", - "
" + "or in the world frame:\n", + "\\begin{equation}\n", + "v^{\\mathrm{world}}=\n", + "\\begin{bmatrix}\n", + "v_x \\cos \\theta \\\\ v_x \\sin \\theta \\\\ \\dot{\\theta}\n", + "\\end{bmatrix}.\n", + "\\end{equation}" ] }, { @@ -142,7 +154,18 @@ "We can derive the relationship between wheel rotation and robot velocity by considering first the motion of a single\n", "wheel, and then considering the effect of coupling the two wheels along a single axis of rotation.\n", "\n", - "The figure below shows a side-view of the right wheel.\n", + "
\n", + "\"\"\n", + "
Side view of the right wheel.
\n", + "
" + ] + }, + { + "cell_type": "markdown", + "id": "79643ba2", + "metadata": {}, + "source": [ + "Figure [2](#fig:DDR-one-wheel) shows a side-view of the right wheel.\n", "We denote by $\\phi_R$ the instantaneous orientation of the right wheel with respect to the world z-axis (Note\n", "that we measure the angle $\\phi_R$ by attaching a distinguished point to the wheel, so that we can uniquely identify\n", "its orientation. In the figure, a red star is used to denote this point.)\n", @@ -155,12 +178,7 @@ "The same reasoning can be applied to the left wheel to obtain\n", "\\begin{equation}\n", "v_\\mathrm{left} = r \\dot{\\phi}_L\n", - "\\end{equation}\n", - "\n", - "
\n", - "\"\"\n", - "
The linear velocity is always in the steering direction.
\n", - "
\n" + "\\end{equation}" ] }, { @@ -168,7 +186,12 @@ "id": "ZZa1f28eSLf3", "metadata": {}, "source": [ - "Suppose now that both wheels spin at the same speed, $\\dot{\\phi}_R = \\dot{\\phi}_L$.\n", + "
\n", + "\"\"\n", + "
When the wheels spin in the same direction with the same speed, the robot moves with pure translation.
\n", + "
\n", + "\n", + "Suppose now that both wheels spin at the same speed, $\\dot{\\phi}_R = \\dot{\\phi}_L$, as in Figure [3](#fig:DDR-pure-translation).\n", "In this case, the forward speed of the wheels will also be equal, $v_\\mathrm{left} = v_\\mathrm{right}$,\n", "and the robot will move in purely translational motion (i.e., $\\omega = 0$), with\n", "$v_x = v_\\mathrm{left} = v_\\mathrm{right}$, since all points on the robot move with exactly the\n", @@ -177,11 +200,7 @@ "\\begin{equation}\n", "\\dot{\\phi}_L = \\dot{\\phi}_R = \\frac{v_x}{r}\n", "\\end{equation}\n", - "\n", - "
\n", - "\"\"\n", - "
When the wheels spin in the same direction with the same speed, the robot moves with pure translation.
\n", - "
" + "\n" ] }, { @@ -189,30 +208,36 @@ "id": "0uZqmFo9Rnn6", "metadata": {}, "source": [ - "Suppose instead that the two wheels spin in opposite directions,\n", - "so that $\\dot{\\phi}_R = -\\dot{\\phi}_L$.\n", + "
\n", + "\"\"\n", + "
When the wheels spin in opposite directions with the same speed, the robot moves with pure rotation.
\n", + "
\n", + "\n", + "If instead, as in Figure [4](#fig:DDR-pure-rotation), the two wheels spin in opposite directions, i.e., we have $\\dot{\\phi}_R = -\\dot{\\phi}_L$.\n", "In this case, $v_\\mathrm{left} = -r\\dot{\\phi}_L $ and $v_\\mathrm{right} = r\\dot{\\phi}_R$.\n", "Because the two wheels are constrained by the physical mechanism to remain in a fixed geometric relationship\n", "to one another, these opposite but equal forward wheel speeds cause the robot to rotate,\n", "with both $v_\\mathrm{left}$ and $v_\\mathrm{right}$ tangent to a circle of diameter $L$ centered at the origin of the body-attached frame.\n", "Note that the linear velocity of the robot, $v_x$, is zero in this case,\n", - "since $v_\\mathrm{left}$ and $v_\\mathrm{right}$ \"cancel one another out\" with respect to the linear velocity of the robot.\n", + "since $v_\\mathrm{left}$ and $v_\\mathrm{right}$ \"cancel one another out\" with respect to the linear velocity of the robot." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ "Applying the equation of circular motion yields\n", "\\begin{equation}\n", - "\\frac{L}{2} \\omega = -v_\\mathrm{left} = -r\\dot{\\phi}_R\n", - "~~~~~~~\n", + "\\frac{L}{2} \\omega = -v_\\mathrm{left} = -r\\dot{\\phi}_L\n", + "\\,\\,\\,\\,\\,\\,\\,\\,\n", "\\frac{L}{2} \\omega = v_\\mathrm{right} = r\\dot{\\phi}_R\n", "\\end{equation}\n", "which leads to\n", "\\begin{equation}\n", "\\dot{\\phi}_L= -\\frac{L}{2} \\frac{\\omega}{r}\n", - "~~~~~~~\n", + "\\,\\,\\,\\,\\,\\,\\,\\,\n", "\\dot{\\phi}_R= \\frac{L}{2} \\frac{\\omega}{r}\n", - "\\end{equation}\n", - "
\n", - "\"\"\n", - "
When the wheels spin in the opposite direction with the same speed, the robot moves with pure rotation..
\n", - "
\n" + "\\end{equation}" ] }, { @@ -236,7 +261,7 @@ "Given a desired *output* specified by $v$ and $\\omega$,\n", "determine the required $input$ specified as $\\dot{\\phi}_R$ and $\\dot{\\phi}_L$.\n", "These equations can be used to determine the required wheel actuation to achieve\n", - "the desired linear and angular velocities of the robot.\n" + "the desired linear and angular velocities of the robot." ] }, { @@ -280,26 +305,26 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": null, "id": "qthL5AQCjdrp", "metadata": {}, "outputs": [], "source": [ "def ddr_ik(v_x, omega, L=0.5, r=0.1):\n", " \"\"\"DDR inverse kinematics: calculate wheels speeds from desired velocity.\"\"\"\n", - " return (v_x - (L/2)*omega)/r, (v_x + (L/2)*omega)/r\n" + " return (v_x - (L/2)*omega)/r, (v_x + (L/2)*omega)/r" ] }, { "cell_type": "code", - "execution_count": 4, + "execution_count": null, "id": "dkatPWe1bj2p", "metadata": {}, "outputs": [], "source": [ "def ddr_fk(phidot_L, phidot_R, L=0.5, r=0.1):\n", " \"\"\"DDR inverse kinematics: calculate wheels speeds from desired velocity.\"\"\"\n", - " return gtsam.Point3((phidot_R+phidot_L)*r/2, 0, (phidot_R-phidot_L)*r/L)\n" + " return gtsam.Point3((phidot_R+phidot_L)*r/2, 0, (phidot_R-phidot_L)*r/L)" ] }, { @@ -307,12 +332,13 @@ "id": "EqPp3G8KIuxp", "metadata": {}, "source": [ + "{raw:tex}`\\noindent`\n", "As an example, let us try to move forward with a velocity of 20 cm/s, while turning counterclockwise at 0.3 rad/s:" ] }, { "cell_type": "code", - "execution_count": 5, + "execution_count": null, "id": "eteMiyZh3YUI", "metadata": {}, "outputs": [ @@ -326,7 +352,7 @@ ], "source": [ "phidot_L, phidot_R = ddr_ik(v_x=0.2, omega=0.3)\n", - "print(phidot_L, phidot_R)\n" + "print(phidot_L, phidot_R)" ] }, { @@ -334,12 +360,13 @@ "id": "2GqqqCFTX_am", "metadata": {}, "source": [ + "{raw:tex}`\\noindent`\n", "As expected, the left wheel rotates less quickly, making us turn counter-clockwise. To sanity-check, let us put these same wheel speeds through the *forward* kinematics:" ] }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "id": "7UD-jHlAXUas", "metadata": {}, "outputs": [ @@ -352,7 +379,7 @@ } ], "source": [ - "print(ddr_fk(phidot_L, phidot_R))\n" + "print(ddr_fk(phidot_L, phidot_R))" ] }, { @@ -360,6 +387,7 @@ "id": "uc_kexkKN6MW", "metadata": {}, "source": [ + "{raw:tex}`\\noindent`\n", "The velocities are as desired, validating both the equations and their implementation. Feel free to experiment with other values using the code above!" 
] } From 65dd72ea63c8381b17b06023233ceb50092dba8a Mon Sep 17 00:00:00 2001 From: Frank Dellaert Date: Sun, 9 Feb 2025 14:59:50 -0500 Subject: [PATCH 6/9] Section 5.3 --- S53_diffdrive_sensing.ipynb | 156 +++++++++++++++++++++--------------- 1 file changed, 92 insertions(+), 64 deletions(-) diff --git a/S53_diffdrive_sensing.ipynb b/S53_diffdrive_sensing.ipynb index 33c1bff6..8fba064f 100644 --- a/S53_diffdrive_sensing.ipynb +++ b/S53_diffdrive_sensing.ipynb @@ -5,7 +5,7 @@ "id": "azt6MDJEGy3_", "metadata": {}, "source": [ - "# Robot Vision" + "# Cameras for Robot Vision" ] }, { @@ -23,7 +23,7 @@ }, { "cell_type": "code", - "execution_count": 14, + "execution_count": null, "id": "l9PoxhNACByz", "metadata": { "tags": [ @@ -40,12 +40,12 @@ } ], "source": [ - "%pip install -q -U gtbook\n" + "%pip install -q -U gtbook" ] }, { "cell_type": "code", - "execution_count": 15, + "execution_count": null, "id": "Xanc4Al-lQIn", "metadata": { "tags": [ @@ -67,7 +67,7 @@ "from gtbook import diffdrive\n", "from gtbook.html import ROW\n", "\n", - "FIG5 = \"https://raw.githubusercontent.com/gtbook/robotics/main/Figures5\"\n" + "FIG5 = \"https://raw.githubusercontent.com/gtbook/robotics/main/Figures5\"" ] }, { @@ -107,20 +107,28 @@ "id": "thYsnivDXyNJ", "metadata": {}, "source": [ - "```{index} Camera Obscura, photography, digital cameras, pixels\n", - "```\n", - "## Cameras\n", + "## Cameras throughout History\n", "\n", + "```{index} Camera Obscura, photography, pinhole\n", + "```\n", "> The basic ideas behind cameras have been around for centuries.\n", "\n", "Everyone knows what a camera is these days, and you probably have between 1 and 5 on your phone, depending on what model you have.\n", "\n", - "Historically, a **Camera Obscura**, literally \"dark room\", showed people that focused *upside-down* images can be formed on a surface, provided the light rays coming from outside the room were constricted to a small \"pinhole\". If you have never experienced this in real-life, it is a worthwhile experience to see this with your own eyes. One of the surprising but obvious properties of a camera obscura is that the images *move*: it really is *video obscura*.\n", + "Historically, a **Camera Obscura**, literally \"dark room\", showed people that focused *upside-down* images can be formed on a surface, provided the light rays coming from outside the room were constricted to a small **pinhole**. If you have never experienced this in real life, it is a worthwhile experience to see this with your own eyes. One of the surprising but obvious properties of a camera obscura is that the images *move*: the \"camera\" or room does not apply a still photo!\n", "\n", + "```{index} Daguerreotype\n", + "```\n", "The question then is how to capture these fleeting images. Da Vinci wrote extensively about using the camera obscura for drawing, and several 17th century painters may have used it in their painting process, the most famous of them being [Johannes Vermeer](https://en.wikipedia.org/wiki/Johannes_Vermeer).\n", "The invention of **photography** (painting with light!) is usually credited to [Niépce](https://en.wikipedia.org/wiki/Nic%C3%A9phore_Ni%C3%A9pce), who used a light-sensitive material to capture the light around 1825. 
However, it was his partner [Daguerre](https://en.wikipedia.org/wiki/Louis_Daguerre) who introduced photography to the world on a large scale via his *Daguerreotype* process, released into the public domain in 1839.\n", "\n", - "Since the 1990s, **digital cameras** have replaced cameras based on chemical emulsions, using CCDs (charged-coupled devices) or CMOS sensors as the underlying technology. Both sensor types capture photons in an array of picture elements or **pixels**. We will not discuss in detail how these devices work, but in essence both sensor types count how many photons fall onto each pixel's area over a given time period. Below we discuss the more practical matter of the format in which images come to us, and how they can be used for robot vision." + "```{index} digital cameras, pixel\n", + "```\n", + "```{index} pair: CCD; charged-coupled device\n", + "```\n", + "```{index} pair: CMOS; complementary metal-oxide-semiconductor\n", + "```\n", + "Since the 1990s, **digital cameras** have replaced cameras based on chemical emulsions, using CCDs (charged-coupled devices) or CMOS (complementary metal-oxide-semiconductor) sensors as the underlying technology. Both sensor types capture photons in an array of picture elements or **pixels**. We will not discuss in detail how these devices work, but in essence both sensor types count how many photons fall onto each pixel's area over a given time period. Below we discuss the more practical matter of the format in which images come to us, and how they can be used for robot vision." ] }, { @@ -128,8 +136,6 @@ "id": "Wht3neS4pInK", "metadata": {}, "source": [ - "```{index} lens, resolution, focal length, field of view\n", - "```\n", "## Cameras for Robot Vision\n", "\n", "> A camera is two sensors in one.\n", @@ -139,14 +145,20 @@ "This information can be analyzed by computer vision algorithms to recognize objects and analyze the scene in front of the robot. In this section we focus on the basics of image formation,\n", "however, and leave algorithms for Section 5.4.\n", "\n", - "A pinhole by itself is rather amazing, as it renders the entire scene in front entirely *in focus*. However, it has a large drawback, in that it only lets in a tiny amount of light. The solution is to use a **lens**, which *collects* light over a larger diameter and *focuses* the light onto the image sensor. The upshot is that we can collect a lot more light (photons) in the same amount of time. The *downside* is that only part of the scene can be in focus at a given time - a phenomenon that leads to the \"depth of field\" of a camera: the (possibly narrow) area between where objects are too close or too far to be in focus.\n", + "```{index} lens, pinhole, depth of field\n", + "```\n", + "A pinhole by itself is rather amazing, as it renders the entire scene in front entirely *in focus*. However, it has a large drawback, in that it only lets in a tiny amount of light. The solution is to use a **lens**, which *collects* light over a larger diameter and *focuses* the light onto the image sensor. The upshot is that we can collect a lot more light (photons) in the same amount of time. 
The *downside* is that only part of the scene can be in focus at a given time - a phenomenon that leads to the *depth of field* of a camera: the (possibly narrow) area between where objects are too close or too far to be in focus.\n", "\n", + "```{index} resolution, focal length\n", + "```\n", + "```{index} pair: FOV; field of view\n", + "```\n", "The most important properties associated with a digital camera are its \n", "**resolution**, typically specified as $W \\times H$ in pixels; \n", "its **focal length**, which, as we will see below, can be measured either in meters or pixels;\n", "and its **field of view** (FOV), typically specified in degrees (horizontal, vertical, or diagonal). \n", - "The resolution is a property of the *sensor*, whereas focal length and field of view depend on the lens. We will investigate the relationships between these quantities below, where we talk about the camera\n", - "imaging geometry." + "The resolution is a property of the *sensor*, whereas focal length and field of view depend on the lens.\n", + "We will investigate the relationships between these quantities below, where we describe the camera imaging geometry." ] }, { @@ -156,14 +168,20 @@ "source": [ "```{index} RGB\n", "```\n", - "In essence, we get access to images as multi-dimensional arrays. Expensive CCD cameras have three sensors, one per color channel (red, green, and blue or **RGB**), and hence their raw output can be represented as three arrays of numbers that represent light levels in a specific frequency band, roughly corresponding to the same frequency bands that receptors in our eye are sensitive to. However, most cameras now have a *single* CMOS sensor with a color filter on top (called a Bayer pattern), and specialized algorithms that hallucinate three color channels. Actually, most cameras do a great deal more processing to improve the color and lighting; this sometimes gets in the way of algorithms that rely on measuring light exactly, but those are rather rare. In most cases, we are content to simply think of a (color) image as a $H \\times W \\times 3$ array of numbers, where $H$ is the height of the image, and $W$ the width.\n", - "\n", - "As an example, below we show an image on the left, taken by the differential-drive robot on the right:" + "In essence, we get access to images as multi-dimensional arrays. Expensive CCD cameras have three sensors, one per color channel (red, green, and blue or **RGB**), and hence their raw output can be represented as three arrays of numbers that represent light levels in a specific frequency band, roughly corresponding to the same frequency bands that receptors in our eye are sensitive to. However, most cameras now have a *single* CMOS sensor with a color filter on top (called a Bayer pattern), and specialized algorithms that hallucinate three color channels. Actually, most cameras do a great deal more processing to improve the color and lighting; this sometimes gets in the way of algorithms that rely on measuring light exactly, but those are rather rare. In most cases, we are content to simply think of a (color) image as a $H \\times W \\times 3$ array of numbers, where $H$ is the height of the image, and $W$ the width." + ] + }, + { + "cell_type": "markdown", + "id": "b2bba525", + "metadata": {}, + "source": [ + "As an example on how to deal with images in code, in Figure [1](#fig:outdoor-lagr) we show an image on the left, taken by the differential-drive robot on the right." 
] }, { "cell_type": "code", - "execution_count": 16, + "execution_count": null, "id": "dWfNJ6RBxQ8U", "metadata": {}, "outputs": [ @@ -187,9 +205,11 @@ } ], "source": [ + "#| caption: An outdoor scene on the left, taken by the \"LAGR\" robot on the right.\n", + "#| label: fig:outdoor-lagr\n", "image_name = \"LL_color_1201754063.387872.jpeg\"\n", "ROW([f'\"Outdoor,',\n", - " f'\"LAGR'])\n" + " f'\"LAGR'])" ] }, { @@ -197,12 +217,14 @@ "id": "pTSfIaJPutVb", "metadata": {}, "source": [ - "A python library, the *Python Imaging Library* or PIL provides some basic capabilities to deal with digital images. We can load images using the `PIL.Image` class, examine its dimensions, and create a numpy array view (you can also use `display` in a notebook to show it):" + "```{index} pair: Python Imaging Library; PIL\n", + "```\n", + "A python library, the [Python Imaging Library](https://pillow.readthedocs.io/en/stable/) or PIL provides some basic capabilities to deal with digital images. We can load an image using the `PIL.Image` class, examine its dimensions, and create a numpy array view (you can also use `display` in a notebook to show it):" ] }, { "cell_type": "code", - "execution_count": 17, + "execution_count": null, "id": "Fn-WSMx83Co6", "metadata": {}, "outputs": [ @@ -221,7 +243,7 @@ "print(f\"resolution = {image.width}x{image.height}\")\n", "image_data = np.asarray(image)\n", "print(f\"image_data.shape = {image_data.shape}\")\n", - "print(image_data[383,511])\n" + "print(image_data[383,511])" ] }, { @@ -229,7 +251,7 @@ "id": "wpwvFOsHjp3_", "metadata": {}, "source": [ - "We see that the image width and height are $512$ and $384$, respectively. But when we access the array with numpy, the first (slowest changing) dimension is the *height*, followed by the width and then the color dimension. Hence, the numpy array has to be indexed using the $(\\text{row},\\text{column})$ convention, after which you get the RGB value in the array, as shown in the last line of code above.\n", + "We see that the image width and height are $512$ and $384$, respectively. But when we access the array with `numpy`, the first (slowest changing) dimension is the *height*, followed by the width and then the color dimension. Hence, the `numpy` array has to be indexed using the $(\\text{row},\\text{column})$ convention, after which you get the RGB value in the array, as shown in the last line of code above.\n", "\n", "It is customary to use variables $(i,j)$ or $(r,c)$ to index pixels, where the latter is slightly preferred as it emphasizes the *row* and *column* semantics of these *integer* coordinates." ] @@ -239,34 +261,35 @@ "id": "1VSEd0UFqClr", "metadata": {}, "source": [ - "```{index} pinhole camera model\n", - "```\n", "## Camera Imaging Geometry\n", "\n", "> Points in the 3D environment project to points in a 2D image.\n", "\n", + "```{index} pinhole camera model\n", + "```\n", "In order to use a camera to infer the properties of the robot's 3D environment,\n", "we need to fully under stand the geometry of image formation.\n", "We already did so at a superficial level, but the geometry involved needs more detail: exactly what light falls into what pixel?\n", "The simplest model for geometric image formation is the **pinhole camera model**. 
\n", - "Imagine a three-dimensional, orthogonal coordinate frame centered at center of the lens.\n", + "Imagine a three-dimensional, orthonormal coordinate frame with its origin at center of the lens.\n", "Computer vision folks use a very specific camera convention which will make the math easy:\n", "- the X-axis points to the *right*;\n", "- the Y-axis points *down*; and \n", "- the Z-axis points into the scene.\n", "\n", - "When we express 3D points in the scene according to this convention, in a coordinate frame that is attached the the cameras, we speak of specifying an object in *camera coordinates*. For example, a 2 meter tall person, standing 5 meters away, and 3 meters to the left, would have be in between these two 3D coordinates: " + "{raw:tex}`\\noindent`\n", + "When we express 3D points in the scene according to this convention, in a coordinate frame that is attached the camera, we speak of specifying an object in *camera coordinates*. For example, a 2 meter tall person, standing 5 meters away, and 3 meters to the left, would be in between these two 3D coordinates: " ] }, { "cell_type": "code", - "execution_count": 18, + "execution_count": null, "id": "whBvZ0qA36zj", "metadata": {}, "outputs": [], "source": [ "feet = gtsam.Point3(-3,1.7,5) # point at the feet of the person, 5 meters in front of camera, 3 meters to the left\n", - "head = gtsam.Point3(-3,-0.3,5) # point at the top of the head (note, Y = *minus* 2 meters)\n" + "head = gtsam.Point3(-3,-0.3,5) # point at the top of the head (note, Y = *minus* 2 meters)" ] }, { @@ -284,12 +307,12 @@ "\\end{equation}\n", "Here, $F$ denotes the focal length measured in meters,\n", "which is defined as the distance from the image plane to the pinhole, i.e., the center of the lens. \n", - "The following figure shows the geometry:" + "We show the geometry in Figure [2](#fig:pinhole_geometry). Note that the coordinate frame is rendered in color such that the X-axis is red, the Y-axis green, and the Z-axis blue, i.e., XYZ=RGB." ] }, { "cell_type": "code", - "execution_count": 19, + "execution_count": null, "id": "z7BD32_g1GN5", "metadata": {}, "outputs": [ @@ -306,7 +329,7 @@ "#| label: fig:pinhole_geometry\n", "F = 1 # meter\n", "from gtbook.diffdrive import axes, plane, ray, show_3d\n", - "show_3d(go.Figure(data = plane(-F) + [ray(feet, -F), ray(head, -F)] + axes()))\n" + "show_3d(go.Figure(data = plane(-F) + [ray(feet, -F), ray(head, -F)] + axes()))" ] }, { @@ -320,12 +343,12 @@ "Y_V = F \\frac{Y}{Z} ~~~~\n", "Z_V = F\n", "\\end{equation}\n", - "The virtual image geometry is shown below:" + "The virtual image geometry, with the virtual image *in front* of the camera, is shown in Figure [3](#fig:virtual_image)." ] }, { "cell_type": "code", - "execution_count": 20, + "execution_count": null, "id": "98ReH9s8q8BU", "metadata": {}, "outputs": [ @@ -340,7 +363,7 @@ "source": [ "#| caption: The virtual image is *in front* of the camera.\n", "#| label: fig:virtual_image\n", - "show_3d(go.Figure(data = plane(F) + [ray(feet, F), ray(head, F)] + axes()))\n" + "show_3d(go.Figure(data = plane(F) + [ray(feet, F), ray(head, F)] + axes()))" ] }, { @@ -348,16 +371,17 @@ "id": "AmG9i2PH6JE3", "metadata": {}, "source": [ - "```{index} intrinsic camera coordinates, principal point\n", + "```{index} intrinsic camera coordinates, principal point, optical axis\n", "```\n", "The above has the disadvantage that we still have to take into account the focal length $F$ when doing the projection. 
Dividing by the focal length yields the fundamental *pinhole projection equation*:\n", "\\begin{equation}\n", "x = \\frac{X}{Z} ~~~~ y = \\frac{Y}{Z}\n", "\\end{equation}\n", "The dimensionless $x$ and $y$ coordinates are called the **intrinsic camera coordinates**, and can be thought of as the image of the scene in a virtual image plane situated at a focal length of 1.0. \n", - "Note that the image origin at $(x,y)=(0,0)$ is the location where the *optical axis* (the blue Z-axis above) pierces the image plane. \n", + "Note that the image origin at $(x,y)=(0,0)$ is the location where the **optical axis** (the blue Z-axis above) pierces the image plane. \n", "This point is commonly refered to as the **principal point**.\n", - "The intrinsic coordinates are in essence measuring a direction in space, but parameterized by a location in the virtual image plane rather than two angles." + "The intrinsic coordinates are in essence measuring a direction in space,\n", + "parameterized by a location in the virtual image plane." ] }, { @@ -365,12 +389,12 @@ "id": "yOy6v--GgxhV", "metadata": {}, "source": [ - "```{index} sensor coordinates\n", - "```\n", "## Camera Calibration\n", "\n", "> From intrinsic to sensor coordinates.\n", "\n", + "```{index} sensor coordinates\n", + "```\n", "Intrinsic coordinates are dimensionless, but what *pixels* in an image do they correspond to?\n", "Also, when we project real-valued 3D coordinates in an image, we get *real-valued* intrinsic coordinates $(x,y)$. How does that relate to integer pixel coordinates? \n", "To translate from intrinsic coordinates to pixel coordinates, we introduce real-valued **sensor coordinates** $(u,v)$, with the following conventions (try to draw this out for a $4\\times3$ image!):\n", @@ -401,9 +425,9 @@ "\\end{equation}\n", "As an example, consider the [FireFly S](https://www.flir.com/products/firefly-s/?model=FFY-U3-04S2C-C) machine vision camera, which has the following specifications:\n", "- sensor: [Sony IMX297](https://www.phase1vision.com/userfiles/product_files/imx273_287_296_297_flyer.pdf) (CMOS)\n", - "- resolution: 728 x 544\n", + "- resolution: 720 x 540\n", "- pixel size: 6.9 $\\mu m$ (H) x 6.9 $\\mu m$ (V)\n", - "- sensor size: 6.3mm diagonally (sanity-check this!)" + "- sensor size: 6.3mm diagonally" ] }, { @@ -411,16 +435,19 @@ "id": "F-k1W_8AjMk9", "metadata": {}, "source": [ + "{raw:tex}`\\noindent`\n", "We typically expect the *image center*, corresponding to $(x,y)=(0.0,0.0)$, to be close to $(u_0,v_0)=(W/2,H/2)$. \n", "For the sensor above this would be $(u_0,v_0)=(364.0, 272.0)$. \n", "To compute $\\alpha$ and $\\beta$ we have to take into account the lens focal length $F$. Since $u$ and $v$ are expressed in pixels, and $x$ and $y$ are dimensionless, it is clear that $\\alpha$ and $\\beta$ must also be expressed in pixels. They can be computed as\n", + "\\begin{equation}\n", "\\begin{aligned}\n", "\\alpha = F k &= 8\\text{mm}/6.9\\mu\\text{m} \\approx 1160\\text{px}\\\\\n", "\\beta = F l &= 8\\text{mm}/6.9\\mu\\text{m} \\approx 1160\\text{px}\n", "\\end{aligned}\n", + "\\end{equation}\n", "where \n", "\\begin{equation}\n", - "k = \\text{px}/6.9\\mu\\text{m}~~~~~\\mathrm{and}~~~~l = \\text{px}/6.9\\mu\\text{m}\n", + "k = \\text{px}/6.9\\mu\\text{m}\\,\\,\\,\\,\\,\\mathrm{and}\\,\\,\\,\\,l = \\text{px}/6.9\\mu\\text{m}\n", "\\end{equation}\n", "are sensor-specific constants that indicated the number of pixels per unit of length." 
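
The arithmetic above is easy to reproduce in a few lines of plain Python. The sketch below is illustrative only: it assumes the 8 mm lens and 6.9 $\mu m$ pixel size quoted above, together with the 728 x 544 resolution and principal point used by the calibration code that follows, and projects the `feet` and `head` points from the previous section into sensor coordinates:

```python
# Illustrative sketch, assuming an 8 mm lens, 6.9 micrometer pixels,
# and the 728 x 544 sensor / principal point used by the calibration code below.
F, pixel_size = 8e-3, 6.9e-6          # meters, meters per pixel
f = F / pixel_size                    # focal length in pixels, roughly 1160 px
u0, v0 = 364.0, 272.0                 # assumed principal point (image center)

for name, (X, Y, Z) in [('feet', (-3, 1.7, 5)), ('head', (-3, -0.3, 5))]:
    x, y = X / Z, Y / Z               # intrinsic (dimensionless) coordinates
    u, v = u0 + f * x, v0 + f * y     # sensor coordinates in pixels
    print(f'{name}: u={u:.1f}px, v={v:.1f}px')
```

Both points land at a negative $u$, i.e., outside the image: with such a narrow lens, a person standing three meters to the left simply falls outside the roughly 35-degree horizontal field of view computed further below.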
] @@ -431,7 +458,7 @@ "metadata": {}, "source": [ "Whenever $k=l$, the sensor has *square pixels*, and we can just use one proportionality constant, $f=\\alpha=\\beta$.\n", - "In this case, $f$ again denotes the *focal length*, but this time, expressed in pixels. This is a slight abuse of terminology, as $f$ is a property of both the lens *and* the image sensor plane, but it is in widespread and we will adopt it here as well." + "In this case, $f$ again denotes the *focal length*, but this time, expressed in pixels. This is a slight abuse of terminology, as $f$ is a property of both the lens *and* the image sensor plane, but its use is in widespread and we will adopt it here as well." ] }, { @@ -439,12 +466,12 @@ "id": "ovtBgfyyQioE", "metadata": {}, "source": [ - "```{index} pinhole projection\n", - "```\n", "## Pinhole Projection Equations\n", "\n", "> From 3D to pixel coordinates.\n", "\n", + "```{index} pinhole projection\n", + "```\n", "Putting all of the above together,\n", "we finally obtain the fundamental **pinhole projection** equation, projecting a point $P$ in 3D camera coordinates $P=(X,Y,Z)$, to its 2D image projection $p=(u,v)$ in sensor coordinates:\n", "\\begin{equation}\n", @@ -477,12 +504,12 @@ }, { "cell_type": "code", - "execution_count": 21, + "execution_count": null, "id": "8zshTv8tpiYp", "metadata": {}, "outputs": [], "source": [ - "cal_8mm_FireFlyS = gtsam.Cal3_S2(fx=1160, fy=1160, s=0, u0=364, v0=272)\n" + "cal_8mm_FireFlyS = gtsam.Cal3_S2(fx=1160, fy=1160, s=0, u0=364, v0=272)" ] }, { @@ -490,13 +517,14 @@ "id": "qPKWCDNV4o0Y", "metadata": {}, "source": [ + "{raw:tex}`\\noindent`\n", "The arguments `fx` and `fy` above correspond to $\\alpha$ and $\\beta$, and for now you can ignore the extra `s` argument, denoting *skew* which is almost always zero for modern sensors.\n", "We can then convert from integer pixel coordinates to intrinsic coordinates:" ] }, { "cell_type": "code", - "execution_count": 22, + "execution_count": null, "id": "hwWvq16WV14s", "metadata": {}, "outputs": [ @@ -520,7 +548,7 @@ "\n", "calibration_demo(cal_8mm_FireFlyS, row=0, col=0)\n", "calibration_demo(cal_8mm_FireFlyS, row=272, col=364)\n", - "calibration_demo(cal_8mm_FireFlyS, row=543, col=727)\n" + "calibration_demo(cal_8mm_FireFlyS, row=543, col=727)" ] }, { @@ -534,7 +562,7 @@ }, { "cell_type": "code", - "execution_count": 23, + "execution_count": null, "id": "WblXPqZhKlrb", "metadata": {}, "outputs": [ @@ -548,7 +576,7 @@ ], "source": [ "u,v = cal_8mm_FireFlyS.uncalibrate([0,0])\n", - "print(f\"(x,y)=(0,0) -> (u,v)=({round(u,2)}px,{round(v,2)}px)\")\n" + "print(f\"(x,y)=(0,0) -> (u,v)=({round(u,2)}px,{round(v,2)}px)\")" ] }, { @@ -556,23 +584,23 @@ "id": "uA9J968r9whb", "metadata": {}, "source": [ - "```{index} pair: field of view; FOV\n", - "```\n", "## Camera Field of View\n", "\n", + "```{index} pair: field of view; FOV\n", + "```\n", "The last concept we need to define the camera imaging geometry\n", "is the camera's **field of view** or **FOV**.\n", "Because the *left-most* ray we can see has $u=0$, it corresponds to $x=-u_0/f\\approx-W/2f$.\n", "The horizontal FOV can then be calculated by\n", "\\begin{equation}\n", - "\\mathrm{HFOV} = 2 \\arctan(W/2f)~~\\mathrm{rad} = 360 \\arctan(W/2f) / \\pi~~\\mathrm{degrees}\n", + "\\mathrm{HFOV} = 2 \\arctan(W/2f)\\,\\,\\mathrm{rad} = 360 \\arctan(W/2f) / \\pi\\,\\,\\mathrm{degrees}\n", "\\end{equation}\n", "For the sensor-lens combination above we get a relatively narrow field of view of about 35 degrees:" ] }, { "cell_type": "code", - 
"execution_count": 24, + "execution_count": null, "id": "_fd099Uf0Occ", "metadata": {}, "outputs": [ @@ -587,7 +615,7 @@ "source": [ "f = 1160\n", "hfov = 360 * math.atan(728/(2*f)) / math.pi\n", - "print(f\"HFOV for f={f} is {hfov:.2f} degrees\")\n" + "print(f\"HFOV for f={f} is {hfov:.2f} degrees\")" ] }, { @@ -600,7 +628,7 @@ }, { "cell_type": "code", - "execution_count": 25, + "execution_count": null, "id": "xd-dKVfnbMPL", "metadata": {}, "outputs": [ @@ -615,7 +643,7 @@ "source": [ "f_wide = 4e-3/6.9e-6\n", "hfov_wide = 360 * math.atan(728/(2*f_wide)) / math.pi\n", - "print(f\"HFOV for f={f_wide:.1f} is {hfov_wide:.2f} degrees\")\n" + "print(f\"HFOV for f={f_wide:.1f} is {hfov_wide:.2f} degrees\")" ] }, { @@ -668,19 +696,19 @@ "id": "mwhW-KuCEhOX", "metadata": {}, "source": [ - "```{index} stereo baseline\n", - "```\n", "When using two cameras, we can triangulate a feature that is seen in both cameras to calculate its location in space. \n", "Given a projection $p=(u,v)$ of a point $P=(X,Y,Z)$ in a single camera we can only determine the *ray* on which the point $P$ must lie.\n", "However, if we see *two* projections of the same feature in two cameras, placed side by side, we can *triangulate* the location of $P$.\n", "In particular, let us name the cameras \"Left\" and \"Right\", abbreviated as \"L\" and \"R\", and let the two projections be $p_L=(u_L,v_L)$ and $p_R=(u_R,v_R)$. How could we recover the coordinates $(X,Y,Z)$ in, say, the *left* camera coordinate frame?\n", "\n", + "```{index} stereo baseline\n", + "```\n", "We can easily work out the answer *if* the cameras have the same calibration *and* the camera pair is in a \"stereo\" configuration. The latter means that the cameras have exactly the same orientation with respect to the world, and the right camera is displaced only horizontally with respect to the left camera. We call the displacement the **stereo baseline** $B$. In that case we have\n", "\\begin{equation}\n", "\\begin{aligned}\n", - "u_L &= u_0 + f \\frac{X}{Z}, ~~~~~ &v_L = v_0 + f \\frac{Y}{Z} \\\\\n", + "u_L &= u_0 + f \\frac{X}{Z}, \\,\\,\\,\\,\\, &v_L = v_0 + f \\frac{Y}{Z} \\\\\n", "\\\\\n", - "u_R &= u_0 + f \\frac{X-B}{Z}, ~~~~~ &v_R = v_0 + f \\frac{Y}{Z}\n", + "u_R &= u_0 + f \\frac{X-B}{Z}, \\,\\,\\,\\,\\, &v_R = v_0 + f \\frac{Y}{Z}\n", "\\end{aligned}\n", "\\end{equation}" ] From 4ec29053a41f77abda77d6694c852a35dd4c637f Mon Sep 17 00:00:00 2001 From: Frank Dellaert Date: Sun, 9 Feb 2025 17:56:09 -0500 Subject: [PATCH 7/9] Transformers --- S56_diffdrive_learning.ipynb | 28 +++++++++++++++++++++- S57_diffdrive_summary.ipynb | 9 ++++++-- references.bib | 45 ++++++++++++++++++++++++++++-------- 3 files changed, 70 insertions(+), 12 deletions(-) diff --git a/S56_diffdrive_learning.ipynb b/S56_diffdrive_learning.ipynb index bfb4bf2c..280cc56e 100644 --- a/S56_diffdrive_learning.ipynb +++ b/S56_diffdrive_learning.ipynb @@ -5913,10 +5913,36 @@ "id": "y8ZAvEIMT2qY", "metadata": {}, "source": [ - "## Validation and Testing\n", + "### Validation and Testing\n", "\n", "In practice, validation and test datasets are used to evaluate the performance of a model. The validation dataset is used to tune the hyperparameters of a model, while the test dataset is used to evaluate the performance of the model on unseen data. 
" ] + }, + { + "cell_type": "markdown", + "id": "04dbaf8e", + "metadata": {}, + "source": [ + "## Transformer Architectures\n", + "\n", + "```{index} transformers, transformer network, attention, token\n", + "```\n", + "```{index} pair: large language model; LLM\n", + "```\n", + "We would be remiss in not mentioning the increasing importance of **transformer** architectures in computer vision and robotics. A transformer network can be viewed as deep multi-layer neural network whose connections can be rewired during training, through a process called **attention**. This architecture has led to the breakthrough of **large language models** or **LLMs**, which take a large context of *tokens* and produce a *next-token probability distribution* , which is then sampled to output a response to an input prompt.\n", + "\n", + "```{index} pair: vision transformer; VIT\n", + "```\n", + "In computer vision, **vision transformers** or **VITs** use this same architecture, by *tokenizing* an image and provide it as part of the context. This allows LLM-style models to then answer questions about images, or perform traditional computer vision tasks such as object detection, image segmentation, and much more.\n", + "\n", + "```{index} pair: vision-language-action model; VLA\n", + "```\n", + "One step beyond VITs are **vision-language-action** models, specifically crafted for use in robots. They take not only visual input alongside language prompts, but also other signals such as joint angles (in case of articulated robotics), orientation sensors, etc... And, more importantly, they are trained to also output *actions* via specialized output heads, matched to the particular robot architecture that is targeted.\n", + "\n", + "A drawback of transformer-based methods is that they take a large aount of time and effort to train, and running the models on embedded computers is also a challenge. Hence, convolutional architectures remain competetive in robotics, especially when computational resources are constrained and/or there are constraints on communication that prevent calling a remote API. However, this is an intense area of study and the mix between fully connected, convolutional, and transformer architectures is sure to shift.\n", + "\n", + "While we do not discuss transformer architectures in detail here, some pointers into the literature are provided in section 5.7." + ] } ], "metadata": { diff --git a/S57_diffdrive_summary.ipynb b/S57_diffdrive_summary.ipynb index 0d4fead6..7bb3b4eb 100644 --- a/S57_diffdrive_summary.ipynb +++ b/S57_diffdrive_summary.ipynb @@ -66,7 +66,7 @@ "The *forward velocity kinematics* in the body-attached frame are given by\n", "\\begin{equation}\n", "v_x = \\frac{r}{2} (\\dot{\\phi}_R + \\dot{\\phi}_L)\n", - "~~~~~~~~\n", + "\\,\\,\\,\\,\\,\\,\\,\\,\n", "\\omega = \\frac{r}{L} (\\dot{\\phi}_R - \\dot{\\phi}_L)\n", "\\end{equation}\n", "which can be expressed with respect to the world coordinate frame, as a function\n", @@ -164,6 +164,8 @@ "id": "jII2jZl40v2A", "metadata": {}, "source": [ + "```{index} deep learning\n", + "```\n", "Our introduction of deep nets focused on their use as simple computational units. 
\n", "We provided the specific weights in the network that were required to implement specific, and known, operations.\n", "However, the real power of deep nets is that they can be used trained to implement operations\n", @@ -290,7 +292,10 @@ "Excellent introductions to the material on machine learning can be found in\n", "[Deep Learning](https://www.deeplearningbook.org/) by Goodfellow, Bengio, and Courville {cite:p}`Goodfellow16book_dl`\n", "and\n", - "[Dive into Deep Learning](https://d2l.ai/) by Zhang et al. {cite:p}`Zhang20book_d2l`." + "[Dive into Deep Learning](https://d2l.ai/) by Zhang et al. {cite:p}`Zhang23book_d2l`.\n", + "The seminal reference for transformer-bassed architectures is the famous \"Attention is all you need\" paper by\n", + "{cite:t}`Vaswani17neurips_attention`, and for vision-transformers the equivalent is the \"An Image is Worth 16x16 Words\" paper by {cite:t}`Dosovitskiy21iclr_VIT`.\n", + "A seminal reference for vision-language-action models is the RT-2 paper from Google {cite:p}`Brohan23_rt2_vla`." ] } ], diff --git a/references.bib b/references.bib index a454ba03..ccfed43c 100644 --- a/references.bib +++ b/references.bib @@ -294,15 +294,13 @@ @book{Watkins89thesis_Qlearning title = {Learning from delayed rewards}, year = {1989}} -@book{Zhang20book_d2l, - author = {Zhang, Aston and Lipton, Zack and Li, Mu and Smola, Alexander J.}, - isbn = {978-1009389433}, - publisher = {d2l.ai}, - title = {Dive into Deep Learning}, - url = {https://d2l.ai/}, - year = {2020} +@book{Zhang23book_d2l, + title={Dive into Deep Learning}, + author={Zhang, Aston and Lipton, Zachary C. and Li, Mu and Smola, Alexander J.}, + publisher={Cambridge University Press}, + url={https://d2l.ai}, + year={2023} } - @book{HaldBook98, author = {Hald, Anders}, publisher = {Wiley}, @@ -494,4 +492,33 @@ @misc{graphviz author = {Dellaert, Frank}, title = {{Graphviz}: open source graph visualization software}, url = {https://graphviz.org/}, -} \ No newline at end of file +} + +@inproceedings{Vaswani17neurips_attention, + title={Attention Is All You Need}, + author={Vaswani, Ashish and Shazeer, Noam and Parmar, Niki and Uszkoreit, Jakob and Jones, Llion and Gomez, Aidan N and Kaiser, Łukasz and Polosukhin, Illia}, + booktitle={Advances in Neural Information Processing Systems}, + pages={5998--6008}, + year={2017}, + url={https://en.wikipedia.org/wiki/Attention_Is_All_You_Need} +} +@article{Dosovitskiy21iclr_VIT, + title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale}, + author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil}, + booktitle={International Conference on Learning Representations}, + year={2021}, + url={https://openreview.net/forum?id=YicbFdNTTy} +} +@article{Kirillov23_sam, + title={Segment Anything}, + author={Kirillov, Alexander and Mintun, Eric and Ravi, Nikhila and Mao, Hanzi and Rolland, Chloe and Gustafson, Laura and Xiao, Tete and Whitehead, Spencer and Berg, Alexander C. 
and Lo, Wan-Yen and Doll{\'a}r, Piotr and Girshick, Ross}, + journal={arXiv preprint arXiv:2304.02643}, + year={2023} +} +@inproceedings{Brohan23_rt2_vla, + title={RT-2: Vision-Language-Action Models Transfer Web Knowledge to Robotic Control}, + author={Anthony Brohan and Noah Brown and Justice Carbajal and Yevgen Chebotar and Xi Chen and Krzysztof Choromanski and Tianli Ding and Danny Driess and Avinava Dubey and Chelsea Finn and Pete Florence and Chuyuan Fu and Montse Gonzalez Arenas and Keerthana Gopalakrishnan and Kehang Han and Karol Hausman and Alex Herzog and Jasmine Hsu and Brian Ichter and Alex Irpan and Nikhil Joshi and Ryan Julian and Dmitry Kalashnikov and Yuheng Kuang and Isabel Leal and Lisa Lee and Tsang-Wei Edward Lee and Sergey Levine and Yao Lu and Henryk Michalewski and Igor Mordatch and Karl Pertsch and Kanishka Rao and Krista Reymann and Michael Ryoo and Grecia Salazar and Pannag Sanketi and Pierre Sermanet and Jaspiar Singh and Anikait Singh and Radu Soricut and Huong Tran and Vincent Vanhoucke and Quan Vuong and Ayzaan Wahid and Stefan Welker and Paul Wohlhart and Jialin Wu and Fei Xia and Ted Xiao and Peng Xu and Sichun Xu and Tianhe Yu and Brianna Zitkovich}, + booktitle={arXiv preprint arXiv:2307.15818}, + year={2023}, + url={https://robotics-transformer2.github.io/} +} From 17faa90f877538d867f2eeac0703bb7b5c27ee61 Mon Sep 17 00:00:00 2001 From: Frank Dellaert Date: Sun, 9 Feb 2025 18:47:10 -0500 Subject: [PATCH 8/9] Finished Section 5.4 --- S54_diffdrive_perception.ipynb | 215 ++++++++++++++++----------------- S57_diffdrive_summary.ipynb | 8 +- references.bib | 30 ++++- 3 files changed, 138 insertions(+), 115 deletions(-) diff --git a/S54_diffdrive_perception.ipynb b/S54_diffdrive_perception.ipynb index 2e2edb74..eabfd851 100644 --- a/S54_diffdrive_perception.ipynb +++ b/S54_diffdrive_perception.ipynb @@ -23,7 +23,7 @@ }, { "cell_type": "code", - "execution_count": 1, + "execution_count": null, "id": "2p7UZn4uNHOH", "metadata": { "tags": [ @@ -40,12 +40,12 @@ } ], "source": [ - "%pip install -q -U gtbook\n" + "%pip install -q -U gtbook" ] }, { "cell_type": "code", - "execution_count": 19, + "execution_count": null, "id": "OLjuI_lMV_Tv", "metadata": { "tags": [ @@ -63,7 +63,7 @@ " \"cuda\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", "\n", "import matplotlib.pyplot as plt\n", - "%matplotlib inline\n" + "%matplotlib inline" ] }, { @@ -95,9 +95,9 @@ "id": "2tzcJ3Vpo8ut", "metadata": {}, "source": [ - "No book about robotics is complete without mentioning computer vision and introducing some of its main ideas, which we do in this section. However, computer vision is a large subject and it is not our intention to summarize the entire field and its many recent developments in this section. Rather, we give a broad overview of the ideas, and our treatment is necessarily light and superficial.\n", + "No book about robotics is complete without mentioning computer vision and introducing some of its main ideas, which we do in this section. However, computer vision is a large subject and it is not our intention to summarize the entire field and its many recent developments in this section. Rather, we give a broad overview of the ideas, and our treatment is necessarily somewhat superficial.\n", "\n", - "A very good resource for a deeper dive into the concepts introduced here and in Section 5.6 is the book [Dive into Deep Learning](https://d2l.ai/), which is structured as a completely executable set of jupyter notebooks. We encourage you to check it out." 
+ "A very good resource for a deeper dive into the concepts introduced here and in Section 5.6 is the book [Dive into Deep Learning](https://d2l.ai/) {cite:p}`Zhang23book_d2l`, which -like this book- was authored as a set of executable jupyter notebooks. We encourage you to check it out." ] }, { @@ -107,17 +107,17 @@ "source": [ "## Linear Filtering\n", "\n", + "```{index} linear image filter\n", + "```\n", "> Filtering can be applied to spatial, as well as temporal, data.\n", "\n", "In the previous chapter, we developed the Bayes filter, and applied it to temporal data streams.\n", - "Here, we extend the idea of filtering to spatial data, such as images, instead of time sequences.\n", - "\n", - "Recall the image from the previous section:" + "Here, we extend the idea of filtering to spatial data, such as images, instead of time sequences." ] }, { "cell_type": "code", - "execution_count": 3, + "execution_count": null, "id": "F5Mf4tOu4-Ds", "metadata": {}, "outputs": [ @@ -137,7 +137,7 @@ "#| label: fig:color_image_by_robot\n", "image_name = \"LL_color_1201754063.387872.jpeg\"\n", "lagr_image = diffdrive.read_image(image_name) # locally: PIL.Image.open(image_name)\n", - "plt.imshow(lagr_image);\n" + "plt.imshow(lagr_image);" ] }, { @@ -145,12 +145,14 @@ "id": "gmreOOwcyR8c", "metadata": {}, "source": [ - "First, to explain linear filtering operations, we will convert the image to grayscale:" + "Recall the image from the previous section, shown again in Figure [1](#fig:color_image_by_robot).\n", + "First, to explain linear filtering operations, we will convert the image to grayscale.\n", + "The PIL code to do so and the result is shown in Figure [2](#fig:gray_image_by_robot)." ] }, { "cell_type": "code", - "execution_count": 4, + "execution_count": null, "id": "lwsPtQ2E-8yf", "metadata": {}, "outputs": [ @@ -169,7 +171,7 @@ "#| caption: Gray scale version of the same image.\n", "#| label: fig:gray_image_by_robot\n", "grayscale_image = PIL.ImageOps.grayscale(lagr_image)\n", - "plt.imshow(grayscale_image, cmap=\"gray\");\n" + "plt.imshow(grayscale_image, cmap=\"gray\");" ] }, { @@ -177,12 +179,14 @@ "id": "muWgsq-yxbNq", "metadata": {}, "source": [ - "We will be using the `pytorch` library below, which operates on *tensors*, which are basically equivalent to multidimensional numpy arrays. It is easy to convert from numpy to pytorch tensors:" + "```{index} tensor\n", + "```\n", + "To further analyze this image, we use the `pytorch` library below, which operates on **tensors**. These are basically equivalent to multidimensional numpy arrays. It is easy to convert from numpy to pytorch tensors:" ] }, { "cell_type": "code", - "execution_count": 5, + "execution_count": null, "id": "t8sCwUujHFNV", "metadata": {}, "outputs": [ @@ -196,15 +200,7 @@ ], "source": [ "grayscale = torch.from_numpy(np.asarray(grayscale_image, dtype=float))\n", - "print(f\"type={type(grayscale)}, dtype={grayscale.dtype}, shape={grayscale.shape}\")\n" - ] - }, - { - "cell_type": "markdown", - "id": "XW7hJj9QoYGC", - "metadata": {}, - "source": [ - "Below we first motivate filtering using an edge detection example, explain it in 1D, and then generalize to arbitrary filters." + "print(f\"type={type(grayscale)}, dtype={grayscale.dtype}, shape={grayscale.shape}\")" ] }, { @@ -214,18 +210,20 @@ "source": [ "```{index} edge detection\n", "```\n", - "A frequent operation in computer vision is **edge detection**, which is to find transitions between dark and light areas, or vice versa. 
A simple edge detector can be implemented using a linear \"filtering\" operation. We first show the code below and then explain it in depth:" + "Below we first motivate filtering using an edge detection example, explain it in 1D, and then generalize to arbitrary filters.\n", + "\n", + "A frequent operation in computer vision is **edge detection**, which is to find transitions between dark and light areas in an image, or vice versa. A simple edge detector can be implemented using a *linear filtering* operation. We first show the code below and then explain it below that:" ] }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "id": "6mWYlY-pNIp9", "metadata": {}, "outputs": [], "source": [ "sobel_u = torch.tensor([[-1, 0, 1]], dtype=float)\n", - "I_u = diffdrive.conv2(grayscale, sobel_u)\n" + "I_u = diffdrive.conv2(grayscale, sobel_u)" ] }, { @@ -233,12 +231,13 @@ "id": "9ZE7X4X25AV6", "metadata": {}, "source": [ - "Above the first line creates a \"filter\" of size $1 \\times 3$, with values $\\begin{bmatrix}-1 & 0 & 1\\end{bmatrix}$, and then the second line calls a function `conv2` which implements the filtering. The results are shown below:" + "Above the first line creates a \"filter\" of size $1 \\times 3$, with values $\\begin{bmatrix}-1 & 0 & 1\\end{bmatrix}$, and then the second line calls a function `conv2` which implements the filtering. The results are shown in Figure [3](#fig:gray_image_and_edges_by_robot).\n", + "We show the input image and the computed \"edge image\" side by side. The edge image is color-coded: red is negative, green is positive, and yellow is zero. By comparing with the input, you can see that the edge image highlights strong *vertical edges* in the input image, where green corresponds to dark-light transitions, and red corresponds to light-dark transitions." ] }, { "cell_type": "code", - "execution_count": 7, + "execution_count": null, "id": "bTDxYKYbW90i", "metadata": {}, "outputs": [ @@ -258,15 +257,7 @@ "#| label: fig:gray_image_and_edges_by_robot\n", "fig, ax = plt.subplot_mosaic([['input', 'edges']], figsize=(14, 7))\n", "ax['input'].imshow(grayscale, cmap=\"gray\")\n", - "ax['edges'].imshow(I_u, cmap=\"RdYlGn\");\n" - ] - }, - { - "cell_type": "markdown", - "id": "PsFk38RYeSVR", - "metadata": {}, - "source": [ - "Above we show the input image and the computed \"edge image\" side by side. The edge image is color-coded: red is negative, green is positive, and yellow is zero. By comparing with the input, you can see that it highlights strong *vertical edges* in the input image, where green corresponds to dark-light transitions, and red corresponds to light-dark transitions." + "ax['edges'].imshow(I_u, cmap=\"RdYlGn\");" ] }, { @@ -283,7 +274,7 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": null, "id": "fLKgndaTdO4n", "metadata": {}, "outputs": [ @@ -299,7 +290,7 @@ "source": [ "simple = torch.tensor([[3,3,3,5,5,5,5,2,2,2]], dtype=float)\n", "print(np.vstack([simple.numpy(),\n", - "diffdrive.conv2(simple, sobel_u).numpy()]))\n" + "diffdrive.conv2(simple, sobel_u).numpy()]))" ] }, { @@ -307,6 +298,7 @@ "id": "CnnniLsMxSQf", "metadata": {}, "source": [ + "{raw:tex}`\\noindent`\n", "The first line above shows the pixel values for the original $1 \\times 10$ image,\n", "and the second line shows the \"edge\" image. 
\n", "Every value in the edge image is computed from three values in the original image.\n", @@ -333,11 +325,13 @@ "id": "qqUTmf61OIok", "metadata": {}, "source": [ + "```{index} kernel\n", + "```\n", "For the simple 1D example above, we could write this with the simple formula\n", "\\begin{equation}\n", "h[i] = \\sum_{k=-1}^1 g[k] f[i+k]\n", "\\end{equation}\n", - "where $f$ is the 1D input image, $g$ is the 1D filter or *kernel*, and $h$ is the output edge image. Note that we index into the kernel $g$ with coordinates $k\\in[-1,0,1]$. By adding $k$ to the output coordinate $i$, we automatically take the weighted sum of pixels in the input image $f$ centered around $i$." + "where $f$ is the 1D input image, $g$ is the 1D filter or **kernel**, and $h$ is the output edge image. Note that we index into the kernel $g$ with coordinates $k\\in[-1,0,1]$. By adding $k$ to the output coordinate $i$, we automatically take the weighted sum of pixels in the input image $f$ centered around $i$." ] }, { @@ -368,6 +362,8 @@ "id": "HyFnjTEApkOK", "metadata": {}, "source": [ + "```{index} correlation, convolution\n", + "```\n", "**Correlation vs. Convolution**: we use the term convolution above, but the formula above is really *correlation*. The correct formula for *convolution*, a term from the signal processing literature, is \n", "\\begin{equation}\n", "h[i] = \\sum_{k=-1}^1 g[k] f[i-k]\n", @@ -414,12 +410,13 @@ "\\begin{equation}\n", "g = \\begin{pmatrix}-1 \\\\ 0 \\\\ 1\\end{pmatrix}\n", "\\end{equation}\n", - "The code below applies the horizontal Sobel edge detector to our original image from above." + "The code in Figure [4](#fig:gray_image_and_horizontal_edges_by_robot) applies the horizontal Sobel edge detector to our original image from above, and shows the result alongside the original.\n", + "Note that above we defined the filter such that a positive transition is defined as having dark then light for an increasing value of the *row* coordinate. This explains why above the strong edge with the sky shows up as *negative*, perhaps counter to your intuition." ] }, { "cell_type": "code", - "execution_count": 9, + "execution_count": null, "id": "m_rTUBt-O0Kc", "metadata": {}, "outputs": [ @@ -441,15 +438,7 @@ "I_v = diffdrive.conv2(grayscale, sobel_v)\n", "fig, ax = plt.subplot_mosaic([['input', 'edges']], figsize=(14, 7))\n", "ax['input'].imshow(grayscale, cmap=\"gray\")\n", - "ax['edges'].imshow(I_v, cmap=\"RdYlGn\");\n" - ] - }, - { - "cell_type": "markdown", - "id": "4xFyDXa2o54j", - "metadata": {}, - "source": [ - "Note that above we defined the filter such that a positive transition is defined as having dark then light for an increasing value of the *row* coordinate. This explains why above the strong edge with the sky shows up as *negative*, perhaps counter to your intuition." + "ax['edges'].imshow(I_v, cmap=\"RdYlGn\");" ] }, { @@ -459,17 +448,17 @@ "source": [ "## Gradients vs. Edges\n", "\n", - "> In fact, the Sobel operator is a gradient operator, not an edge detector.\n", + "> The Sobel operator is a gradient operator, not an edge detector.\n", "\n", - "Actually, above we told a small white lie: the Sobel filters actually approximate the image *gradient*, i.e., the spatial derivatives of the image values in the horizontal or vertical directions. 
We associate high gradient values with edges, but actually the two concepts are not the same: saying that there is an edge in the image can be regarded as a binary classification decision: either there is one, or not.\n", + "Above we told a small white lie: the Sobel filters actually approximate the image *gradient*, i.e., the spatial derivatives of the image values in the horizontal or vertical directions. We associate high gradient values with edges, but actually the two concepts are not the same: saying that there is an edge in the image can be regarded as a binary classification decision: either there is an edge, or not.\n", "Could we use our Sobel gradient operators to construct an edge detector?\n", "\n", - "The gradient magnitude (i.e., the Euclidean norm of the gradient) is a positive number that combines both horizontal and vertical gradient values. We can calculate and visualize it as follows:" + "The gradient magnitude (i.e., the Euclidean norm of the gradient) is a positive number that combines both horizontal and vertical gradient values. We can calculate and visualize it in Figure [5](#fig:gray_image_and_gradient_magnitude_by_robot)." ] }, { "cell_type": "code", - "execution_count": 10, + "execution_count": null, "id": "TDsUl6zA_LNf", "metadata": {}, "outputs": [ @@ -490,7 +479,7 @@ "I_m = torch.sqrt(torch.square(I_u)+torch.square(I_v))\n", "fig, ax = plt.subplot_mosaic([['input', 'edges']], figsize=(14, 7))\n", "ax['input'].imshow(grayscale, cmap=\"gray\")\n", - "ax['edges'].imshow(I_m, cmap=\"Greys\");\n" + "ax['edges'].imshow(I_m, cmap=\"Greys\");" ] }, { @@ -498,12 +487,14 @@ "id": "LeJN3FmLJfOk", "metadata": {}, "source": [ - "Above, it seems that edges have a high magnitude, and non-edges have a low magnitude, so a very simple idea is to simply threshold and get a *binary edge image*:" + "```{index} binary edge image, threshold\n", + "```\n", + "From the figure, you should notice that edges seem to have a high magnitude, and non-edges have a low magnitude. Hence, a simple idea is to threshold the gradient to obtain a *binary edge image*, as we do in Figure [6](#fig:gray_image_and_thresholded_gradient_magnitude_by_robot). We used a threshold $\\theta=50$, but feel free to play with this threshold a bit and see what you like best. You might experience that it is not so easy to make this simple, hand-designed edge detector to do what we *really* want, which is to detect edges as we think of them, and not react to all the noise in the image. Image processing, and computer vision in general, is messy and hard!" ] }, { "cell_type": "code", - "execution_count": 11, + "execution_count": null, "id": "UWyEgnXrE7Qd", "metadata": {}, "outputs": [ @@ -524,15 +515,7 @@ "edges = torch.threshold(I_m,50,0)>0\n", "fig, ax = plt.subplot_mosaic([['input', 'edges']], figsize=(14, 7))\n", "ax['input'].imshow(grayscale, cmap=\"gray\")\n", - "ax['edges'].imshow(edges, cmap=\"Greys\");\n" - ] - }, - { - "cell_type": "markdown", - "id": "W21OChIJQGZQ", - "metadata": {}, - "source": [ - "In the above we used a threshold $\\theta=50$, but feel free to play with this threshold a bit and see what you like best. You can see that it is not so easy to make this simple, hand-designed edge detector to do what we *really* want, which is to detect edges as we think of them, and not react to all the noise in the image. Image processing, and computer vision in general, is messy and hard!" 
+ "ax['edges'].imshow(edges, cmap=\"Greys\");" ] }, { @@ -540,7 +523,7 @@ "id": "pMtAOsVHcrSA", "metadata": {}, "source": [ - "```{index} neural networks, Perceptron\n", + "```{index} neural network, Perceptron\n", "```\n", "## Fully Connected Neural Networks\n", "\n", @@ -548,7 +531,7 @@ "\n", "Above we looked at edge detection as a classification problem. Our solution was to calculate the output of two different filters (both Sobel operators), combine these with a non-linear operation (the norm), and then apply a threshold at some hand-tuned level. It is only natural to ask whether this idea of taking a linear combination of pixels and feeding it into some \"decision maker\" could solve other tasks, including the a main goal of computer vision, detecting and recognizing objects, a capability which seems effortless to people yet which eluded computer vision researchers for a long time.\n", "\n", - "Inspired by the way neurons in the brain appear to be connected, **neural networks** were first proposed by [Frank Rosenblatt](https://en.wikipedia.org/wiki/Frank_Rosenblatt) in the 50s, who with his collaborators proposed the **Perceptron**. The mathematical equation for a perceptron is simple:\n", + "Inspired by the way neurons in the brain appear to be connected, **neural networks** were first proposed by [Frank Rosenblatt](https://en.wikipedia.org/wiki/Frank_Rosenblatt) in the 1950s, who with his collaborators proposed the **Perceptron**. The mathematical equation for a perceptron is simple:\n", "\\begin{equation}\n", "f(x) = \\theta \\begin{pmatrix}\\sum_k w[k] x[k] + b\\end{pmatrix} = \\theta(w \\cdot x + b)\n", "\\end{equation}\n", @@ -593,13 +576,25 @@ "The threshold function $\\theta: \\mathbb{R}^{n_o} \\rightarrow \\{0,1\\}^{n_o}$\n", "applies a threshold operation to each entry of a vector (merely a vector version of the threshold\n", "operation above).\n", - "This is called going \"wide\" as we now create multi-dimensional outputs.\n", - "\n", + "This is called going \"wide\" as we now create multi-dimensional outputs." + ] + }, + { + "cell_type": "markdown", + "id": "f65ec42f", + "metadata": {}, + "source": [ + "```{index} output features, layer\n", + "```\n", + "```{index} pair: multi-layer perceptron; MLP\n", + "```\n", "Going *deep* is taking the output from one (multi-dimensional) perceptron and feeding it into a subsequent perceptron.\n", - "We call each stage a \"layer\" in the neural network. \n", - "Multi-layer perceptrons or MLPs can capture increasingly complex concepts present in the input signal. \n", + "We call each stage a **layer** in the neural network. \n", + "**Multi-layer perceptrons** or **MLPs** can capture increasingly complex concepts present in the input signal. \n", "The idea is that the output of the first layer learns simpler concepts from the input signal, and the next layer combines these concepts into more complex concepts\n", "\n", + "```{index} feature\n", + "```\n", "The notion of \"simple concepts\" computed at each layer is very useful, and we have already introduced the term *feature* above to denote these concepts. A feature can be hand-designed, much like the output of the Sobel operator we introduced above, or *learned*. While we will postpone the discussion of *how* to learn these features from data until section 5.6, it is important to know that almost all successful vision pipelines these days learn the feature representations from data, and which features are learned very heavily depends on the task." 
   ]
  },
  {
@@ -608,10 +603,10 @@
   "id": "Ia6qTjOPERWv",
   "metadata": {},
   "source": [
+    "The MLP architecture is very powerful, but it is also expensive. For every MLP layer with $n$ input features and $n_o$ output features, we need a *weight matrix* $W$ of size $n_o \\times n$. That seems doable in the 1D case, but when thinking about images this becomes rather expensive. Even for relatively low-resolution images, say $256\\times 256$, the number of input features $n =256^2=65,536$. Even if we wanted to only compute a relatively modest number of features, say $32$, that still requires over $2$ million weights to be specified. However, even if we had infinite compute and storage, there is another issue with having that many weights when they are to be learned: there might simply not be enough *data* to nail down the weights in a principled manner. \n",
+    "\n",
    "```{index} convolutional neural networks\n",
    "```\n",
-    "The MLP architecture is very powerful, but it is also *very* expensive. For every MLP layer with $n$ input features and $n_o$, we need a *weight matrix* $W$ of size $n_o \\times n$. That seems doable in the 1D case, but when thinking about images this becomes rather expensive. Even for relatively low-resolution images, say $256\\times 256$, the number of input features $n =256^2=65,536$. Even if we wanted to only compute a relatively modest number of features, say $32$, that still requires over $2$ million weights to be specified. However, even if we had infinite compute and storage, there is another issue with having that many weights when they are to be learned: there might simply not be enough *data* to nail down the weights in a principled manner. \n",
-    "\n",
    "For computer vision applications, a different class of neural networks called\n",
    "**convolutional neural networks** alleviates both concerns\n",
    "by combining notions of multi-layer networks with the earlier introduced concept of convolutions."
   ]
  },
  {
@@ -622,20 +617,20 @@
   "id": "priszrPPox8p",
   "metadata": {},
   "source": [
-    "```{index} pair: convolutional neural network; CNN\n",
-    "```\n",
    "## Convolutional Neural Networks\n",
    "\n",
    "> Convolve, \"threshold\", repeat...\n",
    "\n",
+    "```{index} pair: convolutional neural network; CNN\n",
+    "```\n",
    "Three separate ideas gave rise to **convolutional neural networks** or **CNN**s, which replace fully connected or *dense* layers with *convolutional* layers:\n",
    "\n",
-    "- Linear filtering: Convolution and correlation, and linear filtering in general, are primordial concepts from signal and image processing where they have proved immensely useful.\n",
-    "- Shared weights: The idea to replace a very large weight matrix $W$ with a much smaller $kernel$, which we will denote by $g$, was attractive from a computational resources point of view.\n",
-    "- Translation invariance: Intuitively, a useful feature at one location in the image (or 1D signal) should also be useful at another location. \n",
+    "- Linear filtering: convolution and correlation, and linear filtering in general, are primordial concepts from signal and image processing where they have proved immensely useful.\n",
+    "- Shared weights: The idea to replace a very large weight matrix $W$ with a much smaller $kernel$, which we will denote by $g$, is attractive from a computational resources point of view.\n",
+    "- Translation invariance: Intuitively, a useful feature at one location in the image (or 1D signal) should also be useful at other locations.\n",
    "\n",
    "The concept of translation invariance is important,\n",
-    "and warrants some more explanation. \n",
+    "and warrants some more explanation.\n",
    "Suppose we apply kernel $g$ to input $x$ to obtain the output $f$.\n",
    "Translation invariance implies that if we shift the input by $t$, the resulting output\n",
    "will merely be the original $f$ also shifted by $t$.\n",
@@ -652,10 +647,10 @@
   "id": "ZkRfQQ_lnZjY",
   "metadata": {},
   "source": [
-    "```{index} convolutional layer\n",
-    "```\n",
    "### Going Wide in CNNs\n",
    "\n",
+    "```{index} convolutional layer\n",
+    "```\n",
    "As with fully connected neural networks, we can go wide with multi-layer CNNs.\n",
    "For **convolutional layer** $l$, denote the number of input channels by $n_{l,i}$ and the number\n",
    "of output channels by $n_{l,o}$.\n",
@@ -672,12 +667,12 @@
   "id": "UIY-fsIbZ-N5",
   "metadata": {},
   "source": [
+    "### Going Deep in CNNs\n",
+    "\n",
    "```{index} activation function, sigmoid function\n",
    "```\n",
    "```{index} pair: rectified linear unit; ReLU\n",
    "```\n",
-    "### Going Deep in CNNS\n",
-    "\n",
    "To go *deep*, we specify a nonlinear operation or **activation function** that is applied after the linear convolution step. \n",
    "This activation function plays the role of $\\theta$ in our fully connected networks above.\n",
    "Indeed, *without* an activation function, it makes little sense to have two successive linear layers with weights $A$ and $B$: one could just as easily replace them with a *single* linear layer with weights $W = A B$. In addition, from the multi-layer perceptron work we know that the *threshold* operation is a crucial step in *activating* a feature, i.e., deciding when a feature is really present or whether the generated signal is just due to noise. Think back to our primitive edge detector above as well: both the thresholding and the threshold value (which in a perceptron is encoded in a bias $b$) are important for the final result.\n",
@@ -699,10 +694,10 @@
   "id": "E7MdHRzRv0ZM",
   "metadata": {},
   "source": [
-    "```{index} pooling layer\n",
-    "```\n",
    "### Pooling Layers\n",
    "\n",
+    "```{index} pooling layer\n",
+    "```\n",
    "Finally, CNNs also frequently have **pooling layers**. A downside of a convolutional layer is that each output layer is as large as the previous one: the convolution, even when in `valid` mode, only slightly reduces the image size, and not at all when using `same` mode with zero-padding. So-called *pooling* layers were again inspired by the human visual system, where experimentalists observed that while early processing layers were \"retinotopic\", i.e., had a one-to-one mapping to locations on the imaging surface, successive layers gradually became coarser *and* activated by wider \"receptive fields\", defined as the area on the retina that was able to influence the activation of a neuron at a given processing stage.
There are also computational reasons to wanting to \"downscale\" the resolution in deeper layers, as many neural net architectures increase the number of features with depth, and hence reducing the resolution correspondingly yielded in approximately the same amount of computation per layer.\n", "\n", "Formally, a pooling layer is most often an averaging or maximization operation over an input window of a given size. For example, a \"max-pooling\" layer in 2D implements the following equation,\n", @@ -726,7 +721,7 @@ "\n", "> A historically important example.\n", "\n", - "Convolutional neural networks were pioneered by [Kunihiko Fukushima](https://en.wikipedia.org/wiki/Kunihiko_Fukushima) in the 70s, and and [Yann LeCun](https://en.wikipedia.org/wiki/Yann_LeCun) in the 80s. The latter created several CNN-style neural networks for the task of handwritten digit recognition, motivated by an application for the US Postal Service. This work took part from the late 80s to well into the 90s, and is described in a [highly cited 1998 overview paper](https://ieeexplore.ieee.org/abstract/document/726791). Below we show the architecture of LeNet-5 described in that paper.\n", + "Convolutional neural networks were pioneered by [Kunihiko Fukushima](https://en.wikipedia.org/wiki/Kunihiko_Fukushima) in the 70s, and and [Yann LeCun](https://en.wikipedia.org/wiki/Yann_LeCun) in the 1980s. The latter created several CNN-style neural networks for the task of handwritten digit recognition, motivated by an application for the US Postal Service. This work took part from the late 80s to well into the 90s, and is described in a [highly cited 1998 overview paper](https://ieeexplore.ieee.org/abstract/document/726791) {cite:p}`Lecun98ieee_LeNet`. Below we show the architecture of LeNet-5 described in that paper.\n", "\n", "LeNet-5 takes a single-channel, $32\\times 32$ grayscale image as input, and has the following layers:\n", "- 6-channel, $28\\times 28$ convolutional layer with $5\\times 5$ kernel;\n", @@ -751,10 +746,10 @@ "id": "O_Uicl2-Lohl", "metadata": {}, "source": [ - "```{index} semantic segmentation\n", - "```\n", "## Semantic Segmentation\n", "\n", + "```{index} semantic segmentation\n", + "```\n", "> What can neural networks do for robots?\n", "\n", "In section 5.6 we will see some other applications, but a very relevant task for robotics is **semantic segmentation**. In this task, every pixel in the image is classified into a finite set of *classes*, such as road, vegetation, building, sky, etc. Think back to the trash sorting robot's need to classify pieces of trash into different categories. Semantic segmentation is similar, but we now do this for *every pixel* in the image. This is a very useful capability for a *mobile* robot, e.g., it can help plan a path over drivable surfaces.\n", @@ -764,7 +759,7 @@ }, { "cell_type": "code", - "execution_count": 12, + "execution_count": null, "id": "Kq53rmYbQeMg", "metadata": { "tags": [ @@ -783,7 +778,7 @@ "source": [ "model = torch.hub.load('pytorch/vision:v0.10.0', 'deeplabv3_resnet50', pretrained=True);\n", "model.to(DEVICE) # DEVICE will be equal to 'cuda' if GPU is available\n", - "model.eval();\n" + "model.eval();" ] }, { @@ -791,12 +786,12 @@ "id": "uv3VayhXCXmd", "metadata": {}, "source": [ - "We load a highway driving example to test it:" + "In Figure [7](#fig:highway_image_by_robot) we load a highway driving example for the purposes of testing." 
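+    "\n",
+    "Note that the model-loading cell above assumes a `DEVICE` variable has already been defined by the chapter's setup code, which is not shown in this section. A minimal sketch of one common way to choose it (our assumption, not necessarily the book's exact setup):\n",
+    "\n",
+    "```python\n",
+    "import torch\n",
+    "\n",
+    "# Prefer the GPU when one is available, otherwise fall back to the CPU.\n",
+    "DEVICE = \"cuda\" if torch.cuda.is_available() else \"cpu\"\n",
+    "```"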
] }, { "cell_type": "code", - "execution_count": 13, + "execution_count": null, "id": "AkF9cgxxLwo0", "metadata": {}, "outputs": [ @@ -825,7 +820,7 @@ "highway_image = diffdrive.read_image(image_name)\n", "print(f\"resolution = {highway_image.width}x{highway_image.height}\")\n", "highway_image = highway_image.convert(\"RGB\")\n", - "plt.imshow(highway_image);\n" + "plt.imshow(highway_image);" ] }, { @@ -838,7 +833,7 @@ }, { "cell_type": "code", - "execution_count": 14, + "execution_count": null, "id": "obT1inxwsSsp", "metadata": {}, "outputs": [], @@ -856,7 +851,7 @@ "\n", "with torch.no_grad():\n", " output = model(input_batch)['out'][0]\n", - "output_predictions = output.argmax(0)\n" + "output_predictions = output.argmax(0)" ] }, { @@ -864,12 +859,12 @@ "id": "Yqj60w4k7cG8", "metadata": {}, "source": [ - "Again, we use example code from PyTorch Hub to display the result as a color-coded per-pixel segmentation image. Note the method calls `.cpu()` and `.numpy()` which respectively transfer a PyTorch tensor to the CPU (if it's not already there) and then convert the tensor to a `numpy` array, to play nicely with `matplotlib`:" + "Again, we use example code from PyTorch Hub to display the result as a color-coded per-pixel segmentation image. Note the method calls `.cpu()` and `.numpy()` which respectively transfer a PyTorch tensor to the CPU (if it's not already there) and then convert the tensor to a `numpy` array, to play nicely with `matplotlib`. The result is shown in Figure [8](#fig:semantic_segmentation_by_robot)." ] }, { "cell_type": "code", - "execution_count": 15, + "execution_count": null, "id": "ZQVQ8IXeK-oA", "metadata": {}, "outputs": [ @@ -897,7 +892,7 @@ "r.putpalette(colors)\n", "\n", "import matplotlib.pyplot as plt\n", - "plt.imshow(r);\n" + "plt.imshow(r);" ] }, { @@ -914,7 +909,7 @@ }, { "cell_type": "code", - "execution_count": 16, + "execution_count": null, "id": "WsOJPfbHbft6", "metadata": { "tags": [ @@ -938,10 +933,10 @@ } ], "source": [ - "model_type = \"MiDaS_small\" # MiDaS v2.1 - Small (lowest accuracy, highest inference speed)\n", + "model_type = \"MiDaS_small\" # MiDaS v2.1 - Small\n", "midas = torch.hub.load(\"intel-isl/MiDaS\", model_type, verbose=False);\n", "midas.to(DEVICE);\n", - "midas.eval();\n" + "midas.eval();" ] }, { @@ -954,7 +949,7 @@ }, { "cell_type": "code", - "execution_count": 17, + "execution_count": null, "id": "dL1NePYbXCxn", "metadata": {}, "outputs": [], @@ -963,7 +958,7 @@ "normalized = (np.asarray(resized)-[0.485, 0.456, 0.406])/[0.229, 0.224, 0.225]\n", "transposed = np.transpose(normalized, (2, 0, 1))\n", "image32 = np.ascontiguousarray(transposed).astype(np.float32)\n", - "input_batch = torch.from_numpy(np.expand_dims(image32,axis=0))\n" + "input_batch = torch.from_numpy(np.expand_dims(image32,axis=0))" ] }, { @@ -971,12 +966,12 @@ "id": "NL6LwIQxNkEO", "metadata": {}, "source": [ - "After evaluating the network, we show the result using matplotlib:" + "After evaluating the network, we show the result using matplotlib, in Figure [9](#fig:depth_estimation_by_robot). The output of this network, which was selected for its high inference speed, is not particularly accurate." 
] }, { "cell_type": "code", - "execution_count": 18, + "execution_count": null, "id": "gOmgNJsAxZRy", "metadata": {}, "outputs": [ @@ -998,7 +993,7 @@ " prediction = midas(input_batch)\n", "\n", "output = prediction.cpu().squeeze().numpy()\n", - "plt.imshow(output);\n" + "plt.imshow(output);" ] } ], diff --git a/S57_diffdrive_summary.ipynb b/S57_diffdrive_summary.ipynb index 7bb3b4eb..a601f8fe 100644 --- a/S57_diffdrive_summary.ipynb +++ b/S57_diffdrive_summary.ipynb @@ -288,14 +288,16 @@ "and [Planning Algorithms](https://lavalle.pl/planning/) by LaValle {cite:p}`LaValle06book_planning`\n", "provided updated treatments of the rapidly expanding field.\n", "\n", - "\n", "Excellent introductions to the material on machine learning can be found in\n", "[Deep Learning](https://www.deeplearningbook.org/) by Goodfellow, Bengio, and Courville {cite:p}`Goodfellow16book_dl`\n", "and\n", "[Dive into Deep Learning](https://d2l.ai/) by Zhang et al. {cite:p}`Zhang23book_d2l`.\n", + "The historically important papers references in Section 5.4 are the Neocognitron paper by {cite:t}`Fukushima80bc_neocognitron`,\n", + "and the LeNet paper by {cite:t}`Lecun98ieee_LeNet`.\n", + "\n", "The seminal reference for transformer-bassed architectures is the famous \"Attention is all you need\" paper by\n", - "{cite:t}`Vaswani17neurips_attention`, and for vision-transformers the equivalent is the \"An Image is Worth 16x16 Words\" paper by {cite:t}`Dosovitskiy21iclr_VIT`.\n", - "A seminal reference for vision-language-action models is the RT-2 paper from Google {cite:p}`Brohan23_rt2_vla`." + "{cite:t}`Vaswani17neurips_attention`, and for vision-transformers the equivalent is the \"An Image is Worth 16x16 Words\" paper by {cite:t}`Dosovitskiy21iclr_VIT`. A VIT architecture of note is the \"Segment Anything\" model by {cite:t}`Kirillov23iccv_SAM`.\n", + "Finally, a seminal reference for vision-language-action models is the RT-2 paper from Google {cite:p}`Brohan23_rt2_vla`." ] } ], diff --git a/references.bib b/references.bib index ccfed43c..188ae664 100644 --- a/references.bib +++ b/references.bib @@ -509,12 +509,14 @@ @article{Dosovitskiy21iclr_VIT year={2021}, url={https://openreview.net/forum?id=YicbFdNTTy} } -@article{Kirillov23_sam, +@inproceedings{Kirillov23iccv_SAM, title={Segment Anything}, author={Kirillov, Alexander and Mintun, Eric and Ravi, Nikhila and Mao, Hanzi and Rolland, Chloe and Gustafson, Laura and Xiao, Tete and Whitehead, Spencer and Berg, Alexander C. 
and Lo, Wan-Yen and Doll{\'a}r, Piotr and Girshick, Ross}, - journal={arXiv preprint arXiv:2304.02643}, + booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision}, + pages={1--10}, year={2023} } + @inproceedings{Brohan23_rt2_vla, title={RT-2: Vision-Language-Action Models Transfer Web Knowledge to Robotic Control}, author={Anthony Brohan and Noah Brown and Justice Carbajal and Yevgen Chebotar and Xi Chen and Krzysztof Choromanski and Tianli Ding and Danny Driess and Avinava Dubey and Chelsea Finn and Pete Florence and Chuyuan Fu and Montse Gonzalez Arenas and Keerthana Gopalakrishnan and Kehang Han and Karol Hausman and Alex Herzog and Jasmine Hsu and Brian Ichter and Alex Irpan and Nikhil Joshi and Ryan Julian and Dmitry Kalashnikov and Yuheng Kuang and Isabel Leal and Lisa Lee and Tsang-Wei Edward Lee and Sergey Levine and Yao Lu and Henryk Michalewski and Igor Mordatch and Karl Pertsch and Kanishka Rao and Krista Reymann and Michael Ryoo and Grecia Salazar and Pannag Sanketi and Pierre Sermanet and Jaspiar Singh and Anikait Singh and Radu Soricut and Huong Tran and Vincent Vanhoucke and Quan Vuong and Ayzaan Wahid and Stefan Welker and Paul Wohlhart and Jialin Wu and Fei Xia and Ted Xiao and Peng Xu and Sichun Xu and Tianhe Yu and Brianna Zitkovich}, @@ -522,3 +524,27 @@ @inproceedings{Brohan23_rt2_vla year={2023}, url={https://robotics-transformer2.github.io/} } + +@article{Fukushima80bc_neocognitron, + title={Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position}, + author={Fukushima, Kunihiko}, + journal={Biological Cybernetics}, + volume={36}, + number={4}, + pages={193--202}, + year={1980}, + publisher={Springer}, + url={https://en.wikipedia.org/wiki/Neocognitron} +} + +@article{Lecun98ieee_LeNet, + title={Gradient-based learning applied to document recognition}, + author={LeCun, Yann and Bottou, L{\'e}on and Bengio, Yoshua and Haffner, Patrick}, + journal={Proceedings of the IEEE}, + volume={86}, + number={11}, + pages={2278--2324}, + year={1998}, + publisher={IEEE}, + url={https://ieeexplore.ieee.org/abstract/document/726791} +} \ No newline at end of file From 1ec3550c1bc57f9a6cc81b01fe0100a2fa0c3903 Mon Sep 17 00:00:00 2001 From: Frank Dellaert Date: Sun, 9 Feb 2025 19:31:42 -0500 Subject: [PATCH 9/9] Sections 5.5 and 5.6 --- S55_diffdrive_planning.ipynb | 120 +++++++++++++++++------------------ S56_diffdrive_learning.ipynb | 86 +++++++++++++------------ 2 files changed, 103 insertions(+), 103 deletions(-) diff --git a/S55_diffdrive_planning.ipynb b/S55_diffdrive_planning.ipynb index 518ef22b..55d6e2ad 100644 --- a/S55_diffdrive_planning.ipynb +++ b/S55_diffdrive_planning.ipynb @@ -23,7 +23,7 @@ }, { "cell_type": "code", - "execution_count": 1, + "execution_count": null, "id": "DoqquVwwHP7X", "metadata": { "tags": [ @@ -40,12 +40,12 @@ } ], "source": [ - "%pip install -q -U gtbook\n" + "%pip install -q -U gtbook" ] }, { "cell_type": "code", - "execution_count": 2, + "execution_count": null, "id": "4IY_SDk0UjoN", "metadata": { "tags": [ @@ -66,7 +66,7 @@ " import plotly.io as pio\n", " pio.renderers.default = \"png\"\n", "\n", - "import gtsam\n" + "import gtsam" ] }, { @@ -98,13 +98,14 @@ "id": "OCASg5LQbAo9", "metadata": {}, "source": [ - "```{index} complete\n", + "```{index} complete, path planning\n", "```\n", "Path planning is the problem of finding a collision-free path for\n", "the robot from its starting configuration to a goal configuration.\n", "This is one of the oldest 
fundamental problems in robotics.\n", "Ideally, a path planning algorithm would guarantee to find a collision-free path whenever such a path\n", - "exists. Such algorithms are said to be **complete**.\n", + "exists, and to terminate in finite time when no such path exists.\n", + "Such algorithms are said to be **complete**.\n", "Unfortunately, it has been shown that the path planning problem is NP complete.\n", "Numerous hardness results have been obtained for different versions of the problem,\n", "but the sad fact is that planning collision-free paths is generally intractable\n", @@ -116,7 +117,7 @@ "\n", "In this section, we will describe several approaches to path planning,\n", "all of which operate in the configuration space, to illustrate\n", - "the range of trade-offs that exist in this domain.\n" + "the range of trade-offs that exist in this domain." ] }, { @@ -126,6 +127,8 @@ "source": [ "## Configuration Space Obstacles\n", "\n", + "```{index} free configuration space, configuration space obstacle region\n", + "```\n", "Although the robot moves physically in its workspace, the path planning problem is more easily addressed\n", "if we work directly in the robot's configuration space.\n", "As in Section 5.2, we will denote a robot configuration by $q$ and the configuration space of the robot by ${\\cal Q}$.\n", @@ -162,6 +165,8 @@ "source": [ "## Value Iteration\n", "\n", + "```{index} value iteration, value function\n", + "```\n", "In Chapter 4 we saw how the value function could be used to plan a path that led a robot with stochastic actions to a goal while avoiding obstacles.\n", "We can apply this same method to the problem of planning collision-free paths in the configuration space.\n", "We merely place a large negative reward along the configuration space obstacle boundaries, and a large positive reward at the goal configuration.\n", @@ -194,10 +199,11 @@ "source": [ "## Artificial Potential Fields\n", "\n", + "```{index} artificial potential field, artificial potential function\n", + "```\n", "Instead of exhaustively applying value iteration to a 2D grid representation of the configuration space,\n", "we could try to construct a function that can be expressed in closed-form, such that following\n", - "the gradient of this function would lead to the goal configuration while avoiding any configuration\n", - "space obstacles.\n", + "the gradient of this function would lead to the goal configuration while avoiding any obstacles.\n", "Path planners that use artificial potential functions aim to do just this.\n", "\n", "The basic idea is simple: define a potential function on ${\\cal Q}_\\mathrm{free}$ with a single\n", @@ -205,7 +211,7 @@ "the boundary of ${\\cal Q}_\\mathrm{obst}$. 
\n", "If this function were convex,\n", "since the potential increases arbitrarily at obstacle boundaries,\n", - "gradient descent will achieve the goal of constructing a collision-free path to the goal.\n", + "gradient descent would achieve the goal of constructing a collision-free path to the goal.\n", "Unfortunately, we can almost never construct such a function.\n", "Convexity is the problem; at the moment we introduce obstacles, it becomes very difficult to\n", "construct a convex potential function with the desired behavior.\n", @@ -225,7 +231,7 @@ "can be captured by using a parabolic well for $U_\\mathrm{attr}$, and defining $U_\\mathrm{rep}$\n", "in terms of the inverse distance to the nearest obstacle:\n", "\\begin{equation}\n", - "U_\\mathrm{attr}(q) = \\frac{1}{2} \\| q - q_\\mathrm{goal} \\|^2 ~~~~~~~~~~~~ U_\\mathrm{rep}(q) = \\frac{1}{d(q)}\n", + "U_\\mathrm{attr}(q) = \\frac{1}{2} \\| q - q_\\mathrm{goal} \\|^2 \\,\\,\\,\\,\\,\\,\\,\\,\\,\\,\\,\\, U_\\mathrm{rep}(q) = \\frac{1}{d(q)}\n", "\\end{equation}\n", "in which $d(q)$ is defined as the minimum distance from configuration $q$ to the boundary of\n", "${\\cal Q}_\\mathrm{obst}$.\n", @@ -263,7 +269,7 @@ "Note that the basic idea behind potential field planning is similar to the basic idea behind\n", "using the value function to construct a path:\n", "Create a function whose optimal value is at the goal\n", - "(this is a maximum of the value function, but a minimum of the potential fiels),\n", + "(this is a maximum of the value function, but a minimum of the potential fields),\n", "and assign high cost (or, in the case of the value function, negative reward)\n", "along the boundary of ${\\cal Q}_\\mathrm{obst}$.\n", "Then use gradient descent to find the path to the goal.\n", @@ -283,10 +289,12 @@ "source": [ "## Probabilistic Road Maps (PRMs)\n", "\n", + "```{index} pair: probabilistic road maps; PRM\n", + "```\n", "As discussed above, the value function is guaranteed to find a path because it essentially explores the entire configuration space (applying dynamic programming outward from the goal configuration), while potential field planning is efficient because it focuses computation on the search for an individual path.\n", "A compromise approach would be to build a global representation, but to encode only a small number of\n", "paths in that representation.\n", - "Probabilistic Road Maps (PRMs) do just this." + "**Probabilistic Road Maps** (**PRMs**) do just this." 
] }, { @@ -348,12 +356,14 @@ "id": "IFVQktcvDFMu", "metadata": {}, "source": [ - "## RRT\n", + "## Rapidly-Exploring Random Trees (RRTs)\n", "\n", + "```{index} pair: rapidly-exploring random tree; RRT\n", + "```\n", "An alternative to building a single, global PRM is to grow a random tree from the initial\n", "configuration $q_\\mathrm{init}$ until one of the leaf nodes in the tree can be connected\n", "to $q_\\mathrm{goal}$.\n", - "This is the approach taken with Rapidly-Exploring Random Trees (RRTs).\n", + "This is the approach taken with **Rapidly-Exploring Random Trees** (**RRTs**).\n", "\n", "An RRT is constructed by iteratively adding randomly generated nodes to an existing\n", "tree.\n", @@ -399,14 +409,14 @@ "\n", "RRTs have become increasingly popular in the robot motion planning community, for reasons that will become more apparent when we revisit RRT-style planning for aerial drones in Chapter 7.\n", "For now, we will use a very simple implementation of RRTs to construct motion plans for our DDR.\n", - "Because the DDR can turn in place, we have will build an RRT for the *position* of the DDR.\n", + "Because the DDR can turn in place, we will build an RRT for the *position* of the DDR.\n", "\n", "First, we need to be able to generate new nodes and calculate the distance between them:" ] }, { "cell_type": "code", - "execution_count": 3, + "execution_count": null, "id": "8cIpmgm0HUe4", "metadata": { "colab": { @@ -431,17 +441,15 @@ " \"\"\"Generate a random node in a square around the origin.\"\"\"\n", " return rng.uniform(-10, 10, size=(2,))\n", "\n", - "\n", "def distance(p1: gtsam.Point2, p2: gtsam.Point2) -> float:\n", " \"\"\"Calculate the distance between 2 nodes.\"\"\"\n", " return np.linalg.norm(p2 - p1)\n", "\n", - "\n", "def find_nearest_node(rrt: List[gtsam.Point2], node: gtsam.Point2):\n", " \"\"\"Find nearest point in RRT to given node (linear time).\"\"\"\n", " distances = np.linalg.norm(np.array(rrt) - node, axis=1)\n", " i = np.argmin(distances)\n", - " return rrt[i], i\n" + " return rrt[i], i" ] }, { @@ -455,7 +463,7 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": null, "id": "phpW0ht3AVhN", "metadata": {}, "outputs": [], @@ -463,7 +471,7 @@ "def steer(parent: gtsam.Point2, target: gtsam.Point2, fraction = 0.1):\n", " \"\"\"Steer towards the target point, going a fraction of the displacement.\"\"\"\n", " displacement = target - parent\n", - " return parent + displacement * fraction\n" + " return parent + displacement * fraction" ] }, { @@ -476,7 +484,7 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": null, "id": "7_7cl_1KfeBa", "metadata": {}, "outputs": [], @@ -494,7 +502,7 @@ " if (distance(new_node, goal) < 1.0):\n", " print(\"Found motion.\")\n", " return rrt, parents\n", - " return rrt, parents\n" + " return rrt, parents" ] }, { @@ -507,7 +515,7 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "id": "UNkKEB6UDZVf", "metadata": {}, "outputs": [ @@ -523,18 +531,36 @@ "start = gtsam.Point2(0,0)\n", "goal = gtsam.Point2(9,9)\n", "print(start, goal)\n", - "rrt, parents = rapidly_exploring_random_tree(start, goal, fraction=0.1)\n" + "rrt, parents = rapidly_exploring_random_tree(start, goal, fraction=0.1)" + ] + }, + { + "cell_type": "markdown", + "id": "63ec1c79", + "metadata": {}, + "source": [ + "In Figure [1](#fig:rrt_example) we use plotly to visualize the RRT.\n", + "You can see that the RRT increasingly \"fills\" the entire area of interest.\n", + "The algorithm terminates when 
it finds a leaf vertex that is near the goal." ] }, { "cell_type": "code", - "execution_count": 7, - "id": "jDMJhH5r2weS", + "execution_count": null, + "id": "sNy4-jMNTonp", "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAArwAAAH0CAYAAADfWf7fAAAgAElEQVR4XuydCbgU1Zn+36q+wAVFuLgAisI1yuIGuIIL4IoYA4GMMJC4kajRTMzoZDRj/nESk5hMJtFEJzrRRBIdF9RgcEHAhUXD4sJiVBY1XFBZNAiIst6u+j9fFUXX7dv3dnV/Vd3V1W89j0+it86pc37f6XPeOvWd7zNs27bBiwRIgARIgARIgARIgAQSSsCg4E2oZdktEiABEiABEiABEiABhwAFLwcCCZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAWvbww0ptMwDROmaTQbGVs/2wb5e12njhw1JEACJEACJEACJEACFUSAgnePsbbv2IVxV/0QV37tS7jw3MF7Tbht+w7c+JPf4cW/Lnb+23FHfQF3/uRaHNClUwWZmU0lARIgARIgARIggeolQMEL4Jf/OxmTHnnWGQX/9f2rmgje3z/0DB57ajYeuPP7aF/bFld/73bUH9YdP75hYvWOGvacBEiABEiABEiABCqIAAUvgM1bPsOOXbsw4Zof4/orxzYRvP90xX9i+LCTcMVXL3TMOmP2K7j+h3fhzVmTYBjNXR8qyPZsKgmQAAmQAAmQAAlUBQEKXp+Zh4//d3x74pgmgvekEd/ET278uiN65Xp7ZQMuuvKHmPfUb9Gp4z5VMUjYSRIgARIgARIgARKoZAIUvK0IXtu2ccyZl+Oun12HoYP7O3e+1/AhRl72fTw/+Vfo3nX/SrY9204CJEACJEACJEACVUGAgjfADu9Pv/cNnDf0xJw7vGs3bq+KgcJOFkagY/sawDCwddvuwgry7sQTOHj/9k4fOXck3tQFd5DzRsHIqqqAN3dUVadD7CwFbx7BKz685595Mr4x4YvOndk+vFy0QhyNCaqKC1eCjBlyVyh4QwaaoOo4byTImBF0hYJXB5WCF3Di69qWjQsv+Q9885KRuPCcwWjTpsYhe++DT+Pxp+c4URo6tG+Hb954W5MoDRS8ugGY1NJcuJJqWX2/KHj1DJNaA+eNpFo2nH5R8Oo4UvACTtQF2bn1X0/f/zNH2H6+bQe+e8vdmLtgqfPnY/rU486ffgcHHdDZ+XcKXt0ATGppLlxJtay+XxS8eoZJrYHzRlItG06/KHh1HCl4A/LbsvVz7N7d2CzhBAVvQIBVdhsXriozeAHdpeAtAFaV3cp5o8oMXmB3KXgLBJZ1OwWvjh93eJX8klqcC1dSLavvFwWvnmFSa+C8kVTLhtMvCl4dRwpeHT8KXiW/pBbnwpVUy+r7RcGrZ5jUGjhvJNWy4fSLglfHkYJXx4+CV8kvqcW5cCXVsvp+UfDqGSa1Bs4bSbVsOP2i4NVxpODV8aPgVfJLanEuXEm1rL5fFLx6hkmtgfNGUi0bTr8oeHUcKXh1/Ch4lfySWpwLV1Itq+8XBa+eYVJr4LyRVMuG0y8KXh1HCl4dPwpeJb+kFufClVTL6vtFwatnmNQaOG8k1bLh9IuCV8eRglfHj4JXyS+pxblwJdWy+n5R8OoZJrUGzhtJtWw4/aLg1XGk4NXxo+BV8ktqcS5cSbWsvl8UvHqGSa2B80ZSLRtOvyh4dRwpeHX8KHiV/JJanAtXUi2r7xcFr55hUmvgvJFUy4bTLwpeHUcKXh0/Cl4lv6QW58KVVMvq+0XBq2eY1Bo4byTVsuH0i4JXx5GCV8ePglfJL6nFuXAl1bL6flHw6hkmtQbOG5Vh2S07N+OHL9+IGX9/CvL/x/a7GD88/b/QqV3nSDtAwavDS8Gr40fBq+SX1OJcuJJqWX2/KHj1DJNaA+eNyrDsxGfGYsaqp5s09qK+X8Ovz7kn0g5Q8OrwUvDq+FHwKvkltTgXrqRaVt8vCl49w6TWwHmjMix7yP90aNZQ2d19+4q1kXaAgleHl4JXx4+CV8kvqcW5cCXVsvp+UfDqGSa1Bs4blWHZXIJ3v7adsOzKdZF2gIJXh5eCV8ePglfJL6nFuXAVZ9k1awz06GHDNIsrXwmlKHgrwUrlaSPnjfJwL/SpdGkolFg87qfgV
dph7cbtyhpYPIkEuHAVbtU33zLw6J9TGDbExlnD0oVXUCElKHgrxFBlaCbnjTJAL+KRclDtuuevxPwPX8Knu7ZA/Hd/dMYveGitCJalLELBq6RNwasEmNDiXLgKN+xtd6SwebOBMaPSGNDfLryCCilBwVshhipDMzlvlAF6BT2SLg06Y1Hw6vjRpUHJL6nFuXAVbtnf/s7Ehg0mTh1k4fzzrMIrqJASFLwVYqgyNJPzRhmgV9AjKXh1xqLg1fGj4FXyS2pxLlyFW3bdegN335PaW7BXTxv1vWzU97TQrRuwfQewZTNg2cBnnxs4tIeNumjDXhbeiQAlKHgDQKrSWzhvVKnhA3abgjcgqBZuo+DV8aPgVfJLanEuXMVZ9vVFBuYtSOHjfwQrX1tro1tXOMK4W1cbXbva6FIXrGy57qLgLRf5+D+X80b8bVTOFlLw6uhT8Or4UfAq+SW1OBcunWV37ABWNRhoWGNgxQoTn34KdOwItGtno7YWaF8LrNsAx+c3+zr6aAvjvhJflwgKXt3YSHJpzhtJtq6+bxS8OoYUvDp+FLxKfkktzoWrNJb1hPH6DQbWrzewbIUrgAcOEHcIC0f1s9G2TWnaEvQpFLxBSVXffZw3qs/mhfSYgrcQWs3vpeDV8aPgVfJLanEuXOWxrPgBT7rfxI4drvA9c6jl/BOni4I3TtYI3pYZz4l/uY2TT7RQF9Bt5v0PDOzfxUaH5om5cj6Y80Zwe1TjnRS8OqtT8Or4UfAq+SW1OBeu8ln24ckpZ6dXDr1JiLPOMTvYRsFbvrGhefKPf9YGu3e74fJOON7CqAtbf5FqbARuubUG4mc+YWwavXrlfzrnjfyMqvkOCl6d9Sl4dfwoeJX8klqcC1dpLfuPjcC775notJ+Nhx9NoXNnG9dcmXb8feN2UfDGzSLB2rNpEzB7bgqLl7pfD2769zRq27ceL/rue0ysW28Gji3NeSOYLar1LgpeneUpeHX8KHiV/JJanAtXaS379DQTr7yWyUc8sL+N0aPima2Ngre0YyPsp931uxTEZ3zEcAuDT2m+y/vOuyY++NB96qw57pi85ebGQM3gvBEIU9XeRMGrMz0Fr44fBa+SX1KLc+EqrWU//NDAG28ZWLXKcMQId3hLy7+anjZlagpL9uzyyjg7a6iFvn1szF9oYv5CY6//uMdEwuVdc1Wwly/OG6UZSR+uNVBTA3Q9qLIyOlLw6sYHBa+OHwWvkl9Si3PhKp9lvR24fn1sjB8XTGiUsrXc4S0l7WieJYL3xTlmzrB44rMrUUJq22GvAB4/No1+ffOLK84b0djLX6ttA//5Y9e3+uor0oEPIEbfsvxPoODNz6i1Oyh4dfyqWvDu2Am8+pqJ0wanYZrN46Eq0VZ0cS5c5TOf+FrefW/K2Wk76ECgQwfbidsru3GSma1P7+Cn7KPoBQVvFFTLU6cnfD/dauCk4y0c3c9qcjjtL0+aWLQk42pz4AHAzl02Rl5oofcRzQUw543S2NF7KT72GBsXjYnfS3FLFCh4deODglfHr6oF7213pJwdjgsvsJxQPbwyBLhwlWc0SFiyZ2eYaFjd+gtYXWcbA/rbZQlZRsFbnrFRjqe+93dg3gITH//DaLYbLH7mw4Y03WHkvBG9lcSvWv6RHd7LL7HQvVv+nffoWxXsCRS8wTi1dBcFr45f1QreJ6a6p5Xj7CupNK2qOBcuFb6iCr/0VxPPveDupslidv65Frp3B3busLF9h4HNWwwsW27sFcPiv/etb5Z+d4eCtyjzJqLQ28tMrFhp7I30IJ3yC1/OG9GaefpM03kBkevyS9Oo71k5YlfaTMGrGx8UvDp+VSN4H/tzCh986O6abd9hO5+LK9EHSmnuwMW5cAVGFdqNT00zHRcbSTQhp+dzhSRLp4Ef/bTGeWaupBRr1xto3w6oq4tuIaTgDc3kFVtRdogz6YjEjR58koGTTzSwddvuiu1bHBsurifTZmQS0owZZWFA/8r7KknBqxtdFLw6flUheC0L+PHPaiBiwX8NHmRhxHmVN2koTR6oOAVvIEyh3yRj1cy4TOas/7nnTbw0z73pyCNsXDwhDUlRLF8tJGFF1642vhXwVH0xHaDgLYZaMst4wnfVaux1eeh9BPDFEY0VdZgqztYRxrff6b7kSsSM885J44gvxLnFLbeNgldnNwpeHb+qELyCKG0Bn36agXX7He4p1+uvjWdwf6VZ1cUpeNUII61g+QoTU6a6IaTkINumze7jZEx/cYSF/sdyhzdSA7DyJgRee93Ek880fVM7dZCF87mhoBopDQ3AQ4+6B1jjHJs7aCcpeIOSyn0fBa+OX9UI3mxM9/0p5fhC5vosrESaiOIUvPE3o3zmlJiq3tW3t+UkE6iri7bt3OGNlm8l1v7gwyZWvGNi9IXAh+uAV153e+HF+ZUDlps2GU4mQTMzZEvSVXk5lB3o4edaqKRgPBIXWQ6wyiW+ul+dkEbbNiVBFtlDKHh1aCl4dfyqVvAuX2nioUfcyeTqK9MVddJVafK9xVe+Y2LmC8ClX7XRsWPTHUEK3rAoR1ePHGCTNMTiOznx0tIdXqPgjc6mlVrz7kZgyxagvkcNYBhY8+FuTHrAjYLjv4YNlUQX0Y9VSdW9e7eB2XNMx81HrmuvacQBB8SfsN89SVqbpJ1yCl7d+KPg1fGrWsEr2LwTr7ILMfgUG926WqjvpQRaQcVfmGVizksmcvkyU/DG35DeDlCpfdEpeOM/NsrVwux5IzvBxVe+nEb/46Jzt5F+b9li4Fe/yWwji5vPmUNljo//eY102sYvb6/B59vcQ9VjRtno2yf+7Q463ih4g5LKfR8Fr45fVQteQecF8PYwllo8KM2nKi7+Yffd7/oy33RD010XCl4V2pIUnjXbwKy5Keez8eUXly7jEgVvScxbkQ/JNW988gnwyGOpvSmzox6rO3cCzzybwpI33J3d675dOQfoVjUYmHR/6X/TpRpsFLw60hS8On5VL3jlMMCCV4AXZ7s7Al/6ooWTTkjOG3W+4XHrL9wDERMvaWySYYmCNx+58v992zYDf3zAdISEnN4Wt4ZcoczCbmkxglfG2LyFBk4blEa7dsxqGLZN4lJfS/OGfKaXcxMyVuUFbfzY6BMmeHPbTTc0luR3EYYNvKQSSd14oeDVjRIKXh2/qhe8gm/xEgNPPOm+VUvUhmq6np1pYv4C0/EDPX6AjZ6HuYeeKHgrYxSIkLjrHtdX0vPlXbvO/RzaJaLDa8UIXi+r4TlnWxhyWvW8UFbGKAqvlfnmDe+wsIzPQSfbqK+3I0ue4H29u+LyNA49NFo3irAIenzGj02jX9/KaHMhfafgLYRW83speHX8KHgB/M/dNfjo49yB/JV4Y1/8/fcN3Dup6bHpoUNsfHmE4Rw+YQD52JsQEqcz+4CQ7PheE1Es3kIF78OT3fjAzGoY/7GkbWE+wSv1T/lLxt3g8Hobl10czSaD95J15cQ0evSoDPF48y1uvN2kHqSm4NX9wih4dfwoeAFI2JqHJru5yasxLu9HH5l4bxXQ0GA4wkQS
H/zqp6DgVf62Sll83XoDd9/jvrjITu/JJ1o45uhoFvlCBK/3iZZZDUs5Gsr3rCCCV1rnuRvIS5m8nIV9eckacp1PCPtZYdUnv+F7/5BC4x79L23v1hU4+igbp5yUjK8iFLy60ULBq+NHwbuHn/cpqV8fG0OHWDi4e/iTsNJUJSnuLUS3/qeN9u1N7vCWhHo4DymVz2JQwbtqtYFJf3JF+IRxVqJOm4djseTVElTwSuIfSZYicaPDjp7g/+LRt4+NCeOi2UEO03r+mLvyJURclcTv3bvkv132tTS6dAnzqaWvi4JXx5yCV8ePgncPPzlMIT5f3jXoZAsXnJ+Mt+pChogn/Cd+DTj2GLo0FMKunPfu2g38+o4afPY5cN21jU72taiuIIJXRMe996Xw2edM7hKVHeJYb1DB++ifU3jzLVfQSQaxYUPCizLi+e7KzvFFY9I48MA4knLb1FrMXfkNrV5j4MU5puOjP2yIhbOGVfaaRMGrG4sUvDp+FLw+fq8uMvHU024yiguGWxhUAXEbleZvVtw7xFZXZ2PEOQb69msM+xGsLwICkpFJdonkijpQfRDB+8TUFBYvNSBfTMZXwA5bBCapyiqDCl6BIzF6p80wnZ1M+Xwvc65kZNNcDz6SwoqVrr941OHPNO2UsvIF5OHJmf63FnP3nfdMHN7TQsp18a3Yi4JXZzoKXh0/Cl4fP+9wTTUv0rKjMPkx09mZk2ufDsC4i9KOXyiv+BKQ+J3PTDfx0Ueu3Y452sK5Z0WTZjif4PV84qUdlRQDNb7WrZyWFSJ4pVeyizl7rvtyJNdBB9n4+qVptG9feJ+ff9HE3JdN5wzCd74V/9i7Egpz9lw3pKBEZYg6JXjhRMMvQcGrY0rBq+NHwbuHn+dDVQk7A0qTByq+eFENnnjavfXE4y2MvLCyP6UF6nQF37RmDfB/j7gxlf1X2J+Lpe6WBK8kMlm20nTC/Ek7ovDPrGATVUXTCxW8HhS/D6vs9vbrAwwdkg4cWs87pCb1nX+e5XzliPMlv5UX56TQsNoVvBKC7KQTLey7T5xbrW8bBa+OIQWvjh8FL4CP/wHceZf7rYiHa9wBdf8DNXh3FSri06DyJ5CI4vJ5eMrUlLMTP3iQjeXLjb27ZtJBEb5DTrew//76nXq/4H3jTQMfrjX2ilwPZvduthNaiVd1EShG8H76KfC/v6/BZ581ZyUbEGcNze/q4LnQyDgfPSqe4078dRcvdV8I5cxI9tW2DfD//iPZLmQUvLr5gIJXx4+CF0AlTJZKMxdUXJJwyKQsfrxyMrgaPrUVBCimN+/eDbRpk2lc9ufidu1sXHOl3p6dO9Ri3is2/jKtscmOsuxUSSKBvr1t1PfSC+uYYmazWiFQjODdtQv45W9SqD9MdmddsSpuDqtWwzmsJZcIXxGzA45r7qbjD8kXNxeaxrSN6TNSMExg6Rvulw+5ZBd78Ck2eh9p4+OPgY2bDOxfZ6t9mOM+OCl4dRai4NXxo+AF4AUoj9tkqTRtUcXfWmY6Prxy/du1Bjp13l1UPSwUHwIifOUlRj6falx2ZIdq1lwTS5ea2Lbd7Z+I3IEDRIxYFZO+NT6WSV5LihG8rVGQLxdelALvPhG+A/unnVTo/kyDcXOhESH+3Asm3n0vs5srX2BOHWRXbYg+Cl7db56CV8ev6gWv+FLdd39NVaYVzjV0ZAGRFwDZifjWFUDX7sn+xKb8+VRUcS9cUyGiV8aDHIgTH0sRzN7V5wgDp5262xEdvEjAIxC24PXqlXl68dLM4Tb57zKOO3eCMy4POhD4l6vjMVdJW+cvdLMLepeTCOYoq+p/LxS8urmCglfHr+oFr+fOcOZQC/IPL8DLjnVId+Cgrja2bbNhWUDPnsBpp1R+aJxqtbGIV4mzLP6DsjM78dJ0i7uynsBYtqJpAPzBgyycfmIb9DnSrPq5o1rHUWv9jkrwes/Mjk3rCV9JT9ylrrwW8R9Ek5aI24J8/ThzCL9+eJah4NWNUQpeHb+qX7ToztB8APlPTGf/9ewzLQw9gy8Gyp9d2YrnE71i+1lzMr6G0lD5DCunyD23hXxhycrWOT647ASiFrz+DnrzlIxPeXkr15VL6Ip/rmSQq60tV6vi+VwKXp1dKHh1/Kpa8KbTwI9+6kZnuOmGxqqfnEQMzZ5rYt4C14f35BOAQw5Jo66zjc1bDHyy2YB8mkt66BzlTyr2xd3UqzXYvDkThUMa7fn55hK5/k5R8MbexGVrYCkFr8xXt/6ifPO3uFK8ODvj6uMdRKPQbXn4UfDqfpoUvDp+VS14BZ2XSreaXRo+/NDAg49kkk0IlwkXGTjpBGDrNh5aU/7EYll862du6l/vFLws1l7Gq9YyPklnKHhjadJYNKqUgtc/f0viBvkKUYpLDqPNnmPu9dGl0A1OnYI3OKtcd1Lw6vhVtOCV0FmSavG4Y4qf6LxDazJp3XRD+T6LKc1YdHGZvCfdn0lvecABwLlnWziubwowDAreoslWRkHPh11am8+v1+sRBW9l2LYcrSy14PXOG4hv+YjzonW1yha6Ht9qypSmHVMUvDqCFLw6fhUreL1YsdJ9LzNPfS8LffvY2LTZwLTpJrZuNXDp11pPMenP0CNBvyX4d7VcfrGb7QdX6oWrWpjHsZ+eaJC2jRklQf5bFw4UvHG0YjzaVOp54+e/rMG2be4akO/LRLGEWtrRre9pYcqTma8kXhzqnoe5saglxTGvpgQoeHUjgoJXx6/iBK8I1IcfdU+ayyQnhwK8z7ItoRAxd/wACzIReUkUZGd33YZM1ptyH3xQmrHg4u9/YOCBh9ydXTlJPHpk093tUi9cBXeABUIlUIjopeANFX2iKiv1vDF/gelkFPQyl0lKYUktHMYl4fgWLMy4Lkid4vrm99HduBH44//VYMuWpk+UjZOvX56GZBzklSFAwasbDRS8On4VJXhlYlm71vU1lBiM48dazoQiInj5ShOvLzLxj43AQQfa6N7dBSNuD/5LBK/cn+sKmsZSiTwWxT2B06+PjfHjmrtylHrhigWUKm+EX/Refmka9T1zL9YUvFU+UFrpfrnmDX9kmcLiTBt48y0D4tO+YyeweZMrnjdtbtpJcZnIF15MNlFWrXYFuGzCyIbM5Ze4axQvlwAFr24kUPDq+FWM4PWnj5S0pXJIIWjIF8nWs6rBnYi8q64z0Lev5QQub19rN8nmIxNmfU9g2BB9GlaleSIr/uxME7I70lJ2onItXJF1mBUHIvDw5EzAfO8F0P9lxL9ord24J91aoJp5UzUQKOe84UYfcV0MRGxeMFzcc3KLTRGny1a6X/i8dL8t2ecro9Pof2xhotX7HVH0NqVKwaubBSh4dfwqQvD6fU1zfX4PisCLuTvxksacGW9EGC9a0jSjlEx0MuEl7frDpBRWv29gwri04/ecfZVz4Uoa60rpj3+XzIva4LVd0rn27WPhyCNsdO5Qi/06GhUxd1QK+6S0Mw7zhv8gpozbEcPdzREJY7Z4qYlly40mWQPFnU02UeSezp1sdOtmQzZE5LJt5+xuUZd3zkR
+S+PHWS1+MSmq8gotRMGrMxwFr45f7BethgYDDz3asq9p0O57i7nsWl1/besCVnYKpj8nE6N76qCQT2RB21PO+yTlpexAyPW9f0ujwz4UvOW0Rxye7Xdn8EL05foy4rV1/y6u65CIA4nRvHIlMHiwjSGnheM/GQcmbEPhBOIgeKXVMnanzXDXDZm/JWSZfzfXy4I24Dg7UpcD/+Hq1tyECiddmSUoeHV2o+DV8Yu14N2x3cCt/+0KM83OrpUGfv6rlDP5tbS7m41RklK89LKJRT5/rNY+kSnNUNLiXuzhltwZpDFxWbhKCqZKH/bIYym8vczdxsoVpSE7nWuH9sC2FrwZvPTckpzkpBMpfqttSMVp3vAfcPbs4GUNlINnpbqmz8wk8wkSBaVU7SrHcyh4ddQpeHX8Yi145S19ytSUk9pUkzqyMQ38/r4UDu1h44sjCp/osj+Rdepk4/RTLbRtq4RfhuL+MGy33NzYYgvitHCVAVPVPPLV10089YwZOKyTvAgeelB7/GOjjVeW7nRCAMqn4I8+MvD64qbffq+7tnHvp+GqAVrlHe1gH6cAACAASURBVI3jvOH503bpDPzrtS3PeVGaLtcXlCif11rd8huWkGnFumpo2k3Bq6EHUPDq+MVa8HoTlYTMkh3ecl5+/0ZpxwnHWxh1YeHiuZx9kGd7TMW3bfSoll074rhwlZtd0p4vPo133eMe8hkzKt3iAZ/sfueK0mBZwNyXTcfnUS7TsHHG6RZMs0gHyKTBrpL+xHHekHEu5zfkC99//Hsa7duXZy3xi95c4dM+3wa89ZaJk0+KZl0RDrKOzXnJxH772bhiYhod9y3twKTg1fGm4NXxi63g9QSmrJc3frd8k5Qf7/sfAA8+ksK2be4iLqKxUiI5yKlkf5D0fJmJ4rhwKYc6i2cR8L5cFPoFhWHJOJRaIhDXeSMumydLlpqYMtU9GyIhIYcNtZwDdP6DdC2FitSMOgm19scHzL1rl9R18YS0cwi1lBcFr442Ba+OXywE70t/lZBhJo492sKA4ywniPd999c4PStk50mJInBxcbV4cY65N+GF/yRw4EpKdKOwnDY9E2pKsgHt1xEYdIqFI77Q8mQX14WrRNgS/xi/a8t13249G2E2DArexA+PojsY13mjlCmI88GTqEMPP5pZP7z75SCdXLITHWa6Yv+XHHm5PWtoGp06YW8SpnztDfPvFLw6mhS8On6xELz+2J/+7hw/0MKXvxTN5x0lNid5xey5qb2xfeOatMKb6GUyHXyK7WQKCnLFdeEK0nbek59AUNeWXDVR8ObnW613xHXekC9csolS6NeMqOzoHWTzokX06205oTL9sYTDiA7kP7gnIvqaq8obYpOCVzeiKHh1/GIheKULMiEtXprC394yUVNj4fgBNoafa5XFsb4QpDKhSOgZ+Swll2TV+eeL4pOwIl+CiZb6GteFqxDb8N7cBLzDoMUuqBS8HFmVNG/I2jJthpuOXg5q/egH5Tm45mf2wmwT8+ab+JdvNv+64hepxcTwlR1dSYss/Z2/0E2sIb/1y76WRpcu5R27FLw6/hS8On6xEbzKbpS9uN/NQSapMaPcQP3lvrwQZEHDsXntpeAtt+WKe37DGgO9Dsu4qshhMjmR7b+8BCzFugtR8BZnm2ooFad5Q1wHnp2RSSQkou+L51vo07u0fqvF2t3/5VP8eiXmdX1PC926oVmWUe9Amidw/c+UsnJAOWhm0mLbG6QcBW8QSi3fQ8Gr40fBq+TnLy6TjhwEksQOcnkB/EN8ROCqnOQZMzNt+ZerG3HQgYGLMw5vcFSxufO11008+YzpfLbt3t3GqlXuLk9LV6G+u149FLyxMXnsGhIHwSvz8LMzMu5mhbpzxQnq9OdSmDe/+W9Y+tS5kyt827Sx8f4HmRTJ8rduXeHMAd272oGjr5Si3xS8OsoUvDp+FLxKfrmK+8PPiPiQsGp1dRE8KEeVsqshSQQWvupOgDL5nX6qjSGnF7bbHIeFqzTEkvMUiYk76f7mh2Fa6qHseP3rtxphpgoLHUbBm5wxE3ZPyjlv5NrllE0HSTIRh93NYlnLS+v69cC6DQbWrWuaFtlfp8z1E8amHV/guF4UvDrLUPDq+FHwKvm1VNx/ElcSVYikGHuRhR4Hh/85TSZ6OUC3aIl7wtd/XTkxjR49Cn9mOReuiExSFdV6L1teyCPxKc++PB/Bj/9h4MbvNqK2XWFoKHgL41VNd0c1b0h8Z3+iBJlf5dzE5i1wROD6DU3nPhn/559Xuo2GUtnYO3wnL6tjRroH0NZtMB3XDbmGDbFx1rDyHkxrjQUFr26kUPDq+FHwKvm1VlyExW9+WwPxo5RL3sD79QHqe1noeZgdyq7v8hUmps0w9oZI8079SlxHSSgwfmzaySNf6BXVwlVoO3h/YQQ8v78gdt+120bbNoXt7kprKHgLs0k13R3FvLHyHRP/97CbDbB9e8OJZJDratcWOLzechI3fOHwZFKfPdfAi7NTOPvMNIaekZnX/fF9i/XNLwUxCl4dZQpeHT8KXiW/fMX//EQKa943sG27gZ07mwpPeUuv7+kK4OOOsWGm8tWW+Xt2dAgvvqL/c9Ynm4AuRbpSRLFwBe8d7yyWQLGHFAt5HgVvIbSq694o5o0PPjTw+z+mYPk2Lus6A716iR+r+KvaqO9lV7TbQtBRIgxeXWTixOMtpLLWC+/rTlxCr+XqEwVvUEvnvo+CV8ePglfJr5DiIlJXrzGwqsHEshVNP8F16AAMPcPCwP6t+5t5fmoyucklux4XDLdCP5gQxcJVCCveWzgB+cwrPrzi1hLlgUkK3sJtUy0lwp43ZL57aLIb9lHmui+PtHBUEV+sqoG/P4XyOWdaOPYYK5SviGGyo+DV0aTg1fGj4FXy0xQXgbJ8hQjgzEGEXAkstm8HlrxhNkk/Kc+NMsNb2AuXhhPLtk4g+1S6d/ehPWxcMTF8fz4KXo7IlgiEOW/449HKvDh+rOXEOefVMoGnnzXxyquZOIQifIecUdiB5Sj5UvDq6FLw6vhR8Cr5hVVcxO/sObLz6/pUygQvYWeyD2N4Pron9LdxUNfoJv8wF66wGLGepgSyd/vlr4MHWU4oIkl93a4d8K0IMitR8HIkRi14w844Vi0Wk8N9ixabjhvd4qUGDjnYwlXfoOBNiv0peJWWXLtxu7IGFg+TgBw2E3eFXPFTTx1kYdiQ0oTYoeAt3qqNjUBNTfHl85UUoTtvoYkFe7Ioebv9w4Y0PZWefbI9X71B/07BG5RU9d0Xxrzhj3Aj/rkTL41H0oRKs+a2bQZkLtpvv+g2Rgplwh3eQok1vZ+CV8ePO7xKflEVf+NNwwllduQRNmbPNTFvgekczihVLvQwFq6o2MSpXvGXXfKGgU2b0STRg+zQDz7FRt/e4frRzV9oYtacTJD51sKPRcWJgjcqspVfr3beWLXawMOTXT90OXw1YRzFbuWPikwPKHh11qTgzcPvhZcW4dof3NHsrkUz70W7tm0oeHXjrySlZUfvrntSTpixKA8j+TujXbhKAq
bMD5F00tNmuItza9eA/haO72+pAsJL/M1pM1J7d/5zReUoFQ4K3lKRrrznaOYNf2itgQPchD28kkWAgldnTwrePPyef+l1/Met9+Lxe3/U5M7DDjkIhmFQ8OrGX8lKewHHxYf36iuiD6iuWbhKBqWMD/r7KgN/fMCNCyTiU2Idy//WdXbDI4kYlmgc4kfnXRJCacxoC70OC/6JMTtdtewcXzDcRt8+5fPLo+At48CL+aOLnTfky4WXPKFUL/UxR5nI5lHw6sxKwRtA8P7oV3/ES3+5M+ed9OHVDcBSlvaSCpQizmKxC1cpeZTzWZJM5IVZJg7tgVbFpwhWicQhh8hkh36fDnBOTXfvamHtehN9jrSw//65eyK+3PN9frpxEQIUvOUcefF+djHzxpNPm3htkRtZYMRwNxUwr2QSoODV2ZWCN4Dg/c4P7sSo4aehXbu2OLF/HwwfdhJq9kStpuDVDcBSlvbHWYw6m04xC1cpWVTis+77YwoNa5q7P8iurYSY69vHdsIuZbsvxC1NKgVvJY6+4G2WlzkzE9kqeEEAhc4bkinyocnuw0ZdaOGE4yl2CwJeYTdT8OoMRsGbh9/flq/CjNmvoFPHfbB2w0Y8+uQsTBh9Nr7/nYudklu3N+oswNIlJfDm28Af7rfRvhb4wfeA9rWFp4YN0uB2NSbk1NzO3VyAgvAKcs/mT4EFrwDbd9hYu9a13bt/t7F9R6a0vIem97gu1tXZmPBPBo74QjQ2DtLmXPc4ooZzR7H4Yl3uv26z0WjZuP5fjKLmlkLmjU8+Af7nXgubNhkYfo6B88+JNRo2LgQC3twRQlVVWQUFb4FmnzJtLn7wi/uw9IU/OLu8W7ftLrAG3l5uAvc9APztbRFMwCHdRUABhglceZmNjvuGI47atpFdFwO7dvPgSNT2FtH76uuGY1NP/B5zlI2vXxyOLcNuf8cObVzBy7kjbLRlr+/u3wMr3wOGnwOcf3bhzSlk3vjlHcCH64BjjwImuvsvvBJOwJs7Et7NyLpHwVsg2pcW/g3fvPFXeH3GPaht15aH1grkF4fbn5iaanIYymvT2WemMfSM4AeiWutLoZ8m48ClktvgP7TTti0w9isWeh8Zz911ujRU8khrve3e4VjDAG6+qRF7PN8CdzjovCH+6fKPuPNcfnH0h3ADd4A3RkqALg06vBS8efg99MQL6POFQ3FU717YsvUz/Pst/4s2NSncd/uNTkn68OoGYKlLSwai2+90PylPvMR1R2lXa2DjRuDoo2zIQhXGFXThCuNZ1V6HxB6d9Cc34sOEcVZZIzAEsQUFbxBKlXvPzbe488stNxfu7hZk3vD77V59ZZrpgit3qBTccgregpE1KUDBm4ffbb97FH94eNreu4476gv47x98Ez26H0jBqxt7ZSnt7e7KIafRo6JzNwiycJUFQMIeKi8wd9+bcmL5xiUKQz7EFLz5CFX236MUvLKD/NCjlTXeK9ua8Wo9Ba/OHhS8Afjt2LkLH2/cjI77dEDnTvs2KcEd3gAAY3TLbXe4CSiu+3Yj6uqiaxgFb3RsvZr9CUUkEsP4cdG9wITZGwreMGnGr64wBe+GDQb+3mCgocHAqtVokqRFwiseP8DCgP7huGHFjyRblE2Aglc3Jih4dfzo0qDkV8riksxgytSUk+BA8stHeVHwRknXrfu+P6XQsNooacroMHpFwRsGxfjWoRG8NUYKf3vLRMMHaSxekkmB7fVW0qPXdQaWrfAlZIkoDXd8CVdvyyh4dban4NXxo+BV8itlcW93N+oYvNInCt5oLTt9pol5Cyrz0A4Fb7Rjo9y1//yXKWzbZgRKAiFfKZYtN7D+IwPLl5vYtLlp6yWeb58jbfTqJbGmLScLoVziyrN6TSYhi1dq2BALZw2L52HNctslCc+n4NVZkYJXx4+CV8mvVMW93V051Xz9tdHu7lLwRmtVf0SGSjy0Q8Eb7fgod+1eBAVph5wVGNg/jV693FaJwF0lLgprDKxaZWD9hqanZCVU4hGHG+hxaNr5EiWJVPJd2Wm4ZRd4/FhGbsjHrRL/TsGrsxoFr44fBa+SX6mK//qOFD7ZbKAUu7sUvOFb9bXXTWz9DNixE5i/oLLTqFLwhj8+4lajX/R6bRMhmi1wa2ttdOsK9OtrO+myj+1XAwkVU0yM5nXrDTz8qJuCW65SzXVxY5/k9lDw6qxLwavjR8Gr5Feq4hq/umLaSJeGYqjlLrN7N/Djn7mhnvxXpURlyG43BW94YyOuNXnRYETkykuaJ0KlvbJzW9/LRn1Pa+/Or9cP7bwhO8jPzmgeZ7xfHwvDhgbbMY4rU7YLoODVjQIKXh0/Cl4lv1IUl52Pu+9JlfRwk3bhKgWXSnrGgoXmnixqBj762MZbb7u7vOKicsFwO/axd/2sKXgraeQV3lZ/rG8vGsw/NgKffWY4Yre1K6x5Qw68zX4p5fj6+i/5vZw1lJEdCrdqPEpQ8OrsQMGr40fBq+RXiuJyKOThR0sTnSGsnZpScKnkZ0g80ilPuiHm5BJfyWFDKsNvMUzB29gIvDDLxFF9bRx6aH5/z0q2eZRtt2zADCnpjCbWd1iC189KRO+CV03ncJz3exHhK24UnTvBcaXwfIyjZMy69QQoeHUMKXh1/Ch4lfz8xT/5BOjUCQWn48zXBM+frpSfwKNYuPL1sxr/7j/AJv0fMdzC4FPCO6X+3t+BRYtTTr377huOoAxL8MrnaxFXEqLqoAMlU6CFujob3Q4CugU47FSN4yVXnxe+auKZZ01n91Xi2vY8zHZidD8z3XQE4WmDg4+nXLu7hXCOet6QA24vzsn4+frb1qWzjauusNC+fTjjvJB+895gBCh4g3Fq6S4KXh0/Cl4lP6/4Bx8YuOc+1+0g7BPGz840nYNOYYuh1roe9cIVEvZEVCMiY/bcjN+i7F6NGZk5GV9oJ6W+2loDDauBefNNrH4/WIipoM8JQ/BKetkpU5vHafXaIIehLr/Eck75f/ChhL0ysU97C+1rbeyzL3DgAUFbm/z7Vr5j4v8edl1kvEti3UqIMBF///HvwaO6aHZ35dmlmjfkC8mq1a7wXbceew/TnTrIwvnnBRf4yR8d8eohBa/OHhS8On4UvEp+Uty/U+VVJxPvKSfJjpX+AV6CgomXNJbs012pFi49neTUICJw2ozMZ9ti3Bw895dsKueencYZp4Wz86URvCLGn3jSTbghl+xKyo72zp0i0AysX29g3YbMISkRvpJ2Ofv6t+80Ol9TeGUIZIf38v4S9MuQdne3lII32+7eOQf575UY6q9axjEFr87SFLw6fhS8Sn7+UDqyOPfrAyxe6i7QslN3VD+gR3cLh/RwxcaWrMDs8t9kod+8xS3jnIjeZDgHnLZskf/NLPiXX5J2TkeX4qLgLQXl3M/wh4SSMXXB8PyHdCRt66zZGSHpiUkZL0ceYaPHIeGNm2IEr7wUivuG9E2ufP3KZtCvL7BjO5zfxX4dgS+PbERNTUhOq+UzdSRPtm03scOKd0w8O8PlPXCAjdEjW9/p9XZ3Nbuk5Zw3vDFz9
plpDD0jvPEeiZGqtFIKXp3hKXh1/Ch4C+AnnwhlB0piUYpAXbcu8ynN78qQ/Ym6gEe0eqv/M29YdbZUTzkXrqj7Vgn153JzGD/W/cTvv+TT7otzMkJXxsjgU2wMOslC+w7R9LQQwStCd/kKA9NmmHt3amXnesTw9N6sWy21UtyE2rQBunaleCnWkvJC/oc/prBrl/sCLrv8n33m1iYv17J7LmNN7CTzmoyfq68o/vBkOecNEfpvLzecA5AG34WKHTKRlqPg1eGl4NXxq0rBu7sRaJMVFlUmy82bZVe16Y5rtrDNhVsm1x/9oLHZn7yT+LKwtG/nzsCdOjVfvMXtQRaa2nbyv0BtO8k3b6NTZ0AyF8l/e3iye7hHrgnjrMjDWJVz4VIO6UQVzz6k47k5yDjNJXTFPcBL3xoViNYE78JXDLz5tokP1xqQCAz+S9wXxA89SPatqNpejfXKWPnt71I5XUOyeYgovuziNLoU6YrFeaMaR1jwPlPwBmeVU2vYtkgVXsUSWLtxe7FFK7Lc22+beORxc28YqPbt3U+tc182kc5ztkNEaX1PoHOdGw5HFnBJr1nTxsbJJ0Y/DKfPNDFvT5auoH55xRqJC1ex5KIp5//EX1ODvWLS29EthdD1etaS4JUsWcuWNz08Je3ruC9w7tmVFWs4GiuWp9YlS+WAoGuXfn3svREwvJdr+d/OneEcfNu6NeNbLREf+vZ1X8SDXpw3gpKqzvsoeHV25w6vjl/V7fCuWm3g4cmZz6vt2gI7d7kQ5WSz7MDKjmvnTvbe3dYDDwQO2N/993JffuETpejlwlVuSzd/vnx6nj4zs9MvfpkjzsvvGhB2T3IJXjmIJskCROCeOdTGwP7R7zSH3a8k1ucXu/nmi4YGA68vMbH0jYw/wInHWxh5YfCoB5w3kjiKwusTBa+OJQWvjl/VCV7Bleuk+FlDiw8DpTRBwcX9i1iQwygFP6CE4YWKaVu1l/m/h0ysfNcsaeY9P/NswfvkUyZeW2w6YtcLJVbtNopD/wsRu9nt9acWnnhp8JcqCt44WD6+baDg1dmGglfHryoFryBb+Y6Bx6akcMHwtHOCudIu/061uFZMGBd8UQrSVy5cQSiV5x45YHTXPW6WNs2J+mJbny14f/bLFLZvMxgOqligEZTTiF1pzrvvGbj/wZTTMvHrvWi0FSgTHueNCIyZoCopeHXGpODV8atawavEFovi/pBoYSe84MIVCxO32Ah/3NFSHGLM3uH9dCvw2NM7HTcGL04u45/GY8xoxa7XC/kSNukB98VqQH8bY0blT2DBeSMeYyCuraDg1VmGglfHj4JXya/cxf2LkuzEXH6xG1JIDrdJSKeTTgjuf+fvCxeucls2//O9tMTaUFL5n5S5Q8bbKwvb4q+vZMaVl9VL7hr7lTSOObryvpgUwiDO94YldveK3s3A7XfUOO4qN91AwRtn21dC2yh4dVai4NXxo+BV8otDcfnELdnYvDiaEkFC/r9ct9zcPFxakDZT8AahVP57vHB1+RI5aFsqIfYWL82kP5b6JETawP6u77v/MOWYUZIoo7gXLW07q7l82GLXYynJKzp2BE4/Nb9NOW9U8wjM33cK3vyMWruDglfHj4JXyS9Oxe/6nSt65RIBdNnFFg7uXtxuW5gLlwjyp55J4f0Pge98K42U6xrIKwQCEkrvyaczQlTCTp1/XvGJA/xNEpG7bKXZxG1B/n7OMBPnDDGxCzua9ICiNwSDtlDFn59IoedhNk5s4YtNVGK30B6FOW8U+mzeH38CFLw6G1Hw6vhR8Cr5xaH4qgbg4UebB5bPF4aotbaHtXBt/czAvfeJH6Arwv/tXxvRri3TIIU9bvwJKrS7vctXmJg918DadRk7SZ1yuFN2+fr0bO80P1cMb7/oHTrEwtnD8u8Khs0iafXJgUA5GCiX323J62dcxK60J6x5I2k2ZH9cAhS8upFAwavjR8Gr5BeH4vPmG5j+XMoRlMcPBDrtZ0M+Q8olO36jRxUewSGshcvbdQ77UF0cuMetDdnpiAvd7c2VplhEbr/eluO24F35Ugv7Re9JJ1r40gUUvdqxIi8h02YYzgEyubyMe6vXZJJKaF5wte3zyoc1b4TVHtYTLwIUvDp7UPDq+FHwKvnFpbgkz5AkGt7lj+CQa1coX7vDWLhe+quB515IObtS11xZuOjO10b+PTcB2e2dNsNNriLszxoqPrW5XVvE3UQOv81fmIm2kG+HOJ/glVZJ2LxJf3J3JRm9IbyROvWpFF5f3PwLSRzErvQyjHkjPFqsKW4EKHh1FqHg1fGj4FXyi3NxfwQHETHjx1mo7xnMpzeMhevHt9ZgdyMFTznGSLOsbP1tDBuS8e2V3dz5CzNZ26SNQbOkBRG8Up+XvEB2msePy3/CvxycKvGZ/t+1Z7errwjHb1vLI4x5Q9sGlo8vAQpenW0oeHX8KHiV/OJeXHbwRHgsW+HuCo0YbmHwKfk/MWsXLi9OrOwwXn8txU65xkn2bm+/vnazQ2iSuKSQTINBBe+O7QZuu9PdaQ467srFqZKeO32m6YQdlEt+X+Lm0PtIG18bX/7fmXbeqCQ7sK2FE6DgLZyZvwQFr44fBa+SX6UU9/tVBsnOpV24vOeJr6H4EPMqHwHZEZRDjV4ED2nJ/l2AzVsMfOPyRhxycLBdf68HQQWv3C++pw9NdsXZ976bRocOhT2rfNTi+eQnnkw5LyxyeeHfRPweeogdKBNa1L3SzhtRt4/1l5cABa+OPwWvjh8Fr5JfJRX3EhVIm/OlI9YuXLfd4WZomnhJY5MDT6XktXs3nOQbvAD/jvuYkW7s3GKvQgSvPGPOSwZWvGPi0q+m0a5dsU+t7nLZsbYvv8RC927xe3nQzhvVbeXk956CV2djCl4dPwpeJb9KKx70MFuxC5d8QpcdJ283sVyfsp9/0cTcl00MOtnCBefnd+GoNDsW2t4wd9wLFbyFtpX3NyXg36EXF4bxY+MpdqXVxc4btHl1EKDg1dmZglfHj4JXya8Si/sXUDmo1PsIYPTIRqRqMqe/C124ZPd41pymJ/3Fd1OuUro1fLDWwHPPm1jV4D77oq9YOPZoCl7JxNew2sD4sWmIH6/mouDV0CusbEupwwurpXR3FzpvlK5lfFIcCFDw6qxAwavjR8Gr5FepxeUT6a2/qNnb/OOOsfFPYzK+tkEWrlwhrSTervgISxgs2e2dMrXlgPlRsHvyaROvLXJ9RsvpThFF3zR13nyLa+tiU037n03Bq7FE8LLyNWbS/e6hP/ldTbw0/qH9gswbwQnwzqQRoODVWZSCV8ePglfJr1KL2zYcMbprJ/ZGcPDH621t4coldFs66Z8dGm3MKBt9+0S34/rGmwYen+KmYf36ZTwsJ+Nz2XLDObQmNhLRpL0oeLUE85f3H/arpLBuFLz5bVvNd1Dw6qxPwavjR8Gr5JeE4n6/XumP+N2eN8wEDANbt+3e28VChK6fS3ZotCiC5H+yCXjlNROLFmfcKm74tzT23Uf3+T4J9vXi4YbFnYI32lHhP1waJKJKtK0p
rHYK3sJ4VdvdFLw6i1Pw6vhR8Cr5Jam4J4ykT8ceBXz5QgNtandDdmkXvGpi/p7Yn/L3QmO3Shl/aDQpP3pk8cHyJXHCug0mGhoMrFoN57Ovd0kkgBMG2jj3nDRSrndDVV9exIywMp5R8BY3nGbNSaHTfhaO6mejtjZ3Hf7fSFgvKMW1trhSFLzFcauWUhS8OktT8Or4UfAq+SWtuHxKnTbDcEKK1dXZ6HZQxuVB+iqfV4cNLf6UuKScfWKq6dQf9MS57BDLIbSGNQbWrTOcw1f5rnwpdfOVT8rf5WXl9jtrHNZhJQCh4C18dHy+zcB//dL1Z/deGOt72ajvaaFbN8lyB+SKsVv4k8pbgoK3vPzj/nQKXp2FKHh1/Ch4lfySWFxE0qOP1+DDdZneHXO0jXPPKn5H1s8pOxFCdugycbEQUbt5C7B8uYlNm5tTrusM9O1roXMnOD7B8u/btwMrVhp4cY4rqOWqduHrfR4PM1IGBW9xv/rXXjexaKmBDz5o/sKWShlIp133m6FD3Mx3Rv73uuIaEmEpCt4I4SagagpenREpeHX8KHiV/JJY3O9DKFm4JKbudd9uxH77hdtbf4pUEWTOTm6We4I8UUKndesKyI6YnFaX/23pk7DXQokQQeELeOHIxoxKO5EzwrgoeHUU5YVvtXytWG84Xy78GfC8ms8eZmHokOgOd+p60HJpCt6oyCajXgpenR0peHX8KHiV/JJW3C9CR3/JwMCBmUNrUfTVH7rMq18Ebr++cMSt+PpqMkpVo/D9+yrg5Xkm+vYGnn7WdWKWF5a6unAsSMEbDkd/Ld6Lidjo4O4WThtko0ePcF5QL29HcQAAIABJREFUwm8tBW8pmSbpWRS8OmtS8Or4UfAq+SWluD+SggjOr15k4pij0SRKQxR93bUL+MnP3RixkhQhyO5tMe2oJuErqXxfmJXxF92nA3DtNY1o36EYcs3LUPCGw9Ffi/dVZfAgCyPOq7ydXa8v3OENf2wkqUYKXp01KXh1/Ch4lfySUDxX6tLe9almYcmi6KtlAT/6aQ0kLvBNNzTmdVXQtiGX8BV3Crl27HSjPVhpoGcvG1/oZYW2K6ptd6HlX11k4qmnMyEqZLf8mqv0MXilHRS8hVoj//3i3nD3PSnHfeemG8KxU/6nhn8HBW/4TJNUIwWvzpoUvDp+FLxKfpVePDubk+yyymfVUi5cYaa9DWqPXK4UucqKGB42JJzDekHbFsZ9jzxu4u23TZgpYL+OwObNwJFHWKhJGfj0M2DkF4uPtEHBG4aFmtdx6y9SzgvXddc2OocwK/Eq5bxRiXyqvc0UvLoRQMGr40fBq+RXScXlQNinWwz0P87d0Vyy1MSUqe4uoIQbGz0qk7q0lAvXszPdGL+ljDsq/Zb+yyWfkWvbuaGhatvZ2LHTwPr1BhYvzRyTF+Fb38tNmRz3ywtvJbuFl19iYckbRpMYytL+w+ttXHZxcTuJFLzRjICHJ6ecrIfZUUuieVo0tZZy3oimB6w1SgIUvDq6FLw6fhS8Sn6VVPzHt9Zgd6ObNOLg7jbm7UkkkSubUykXrpXvmPi/h82Sfc6V8E8/+mkb53kiCFs6FCeuHrPnppoI37iHOcsWu9K3zz438OabBtq1s1HX2ca0GSknIcdVV1Dwxun3K+mfJQ20uJ9c8fVGtKmpvLhkpZw34mQ7tiUYAQreYJxauouCV8ePglfJr5KKi7B8+lk3qYR3tbSbVOqF6xe3pfDZZwZOPMFyPrdHfUmGtvbtga5d8+/YemGkssOc1fdErNwdcondXBzFX1oT45U7vOGPTvFl/+FP3MObcp05NI0zh+Yfm+G3RFdjqecNXWtZutQEKHh1xCl4dfwoeJX8Kqm4+Os+/GgmKcMZp9k49+zcu3ylXri8QzvC8xuXpXHYYfFc7MX3d9ESs0m2N3F3GNg/jV69gA0bDLz0VxMbPwGOO9bG4FOiF+/CLKjYDWO8UvCGQbFpHV5GPPmvMmZOOclCly7hPyfqGks9b0TdH9YfLgEKXh1PCl4dPwpeJb9KKS5hj2bNMZxDMfLJ9OijLAweZKNtm9w9KMfC5fnyVoIPY0vuDv7dcyH7ox80qnZTg4yvUopdaQ8FbxCrFHbPXb9LOQkowsyIV1gLwrm7HPNGOC1nLaUgQMGro0zBq+NHwavkF/fiEl/32ZkpLF7iujHk8tfN1YdyLFwNDcB999c4PsYTLy3Ov7TU9hDhu+QNE4uXys65+3Rh3Le3hV2NBnofEe1OdanFLgVv+CNs1hx5GTWdNNjXXJk5OBr+k6KtUX4Lj01pg/PPBQ47LNqENdH2hLVHRYCCV0eWglfHj4JXyS/OxcVN4ImpprNzJAe0xoyy0bdPsE/s5RC8wvL2O2vQrm14MWNLYZ+3lplYsNBw0sX26W3hq/8cjLG2beUQuxS8Wqs1Le93Zbj6yrQqq2C4LSusNunHpAdSzvmAs4YaGDaUgrcwgtVxNwWvzs4UvDp+FLxKfnEtLiG3ps3IuDB48XWDtrdcgld7oCpo/8K4b/t2QE7WN6xuepq+FOHVyiV2KXjDGDmZOm67wxWJQb+8hPv0cGrzi93uXYEbrjMiz9AYTstZS6kJUPDqiFPw6vglUvCuXSepVQ2893cTfY600auX7Xwmbyn8lBJhrIoX68KQ3YlyCd5YwczTmKeeMfHq6244tcGn2Ni5E3tDvUlRGXOSKrm+p+UcaAvr8uK15gurFtbzsuuhD284ZKfPNJ3xIq4M119bGS48uXr+hz+mnK8b0o9vX2Ggy/4UvOGMkOTVQsGrsykFr45f4gTvxo3Ab36bCe/jxyMCoV8foG8fG/36luazs9I8BRX3pwiWvl4w3MaA/sX1k4I3P/oPPzSw4h13d06SVsi1arWBZ6e7biTZlyeA5dBgv77F+fZ6WenKJXalTxS8+cdGvjtknEz6U8q57bpvN1ZsCmtp/zPPprDwVcNJHDPyfLMkKcnz8eXf40mAgldnFwpeHb/ECV6JZ/nIYya6d3OF7YYNwKoGE5JlzH+C/pabG5Xk9MVFoEoa3zCu5Sska1rxLgzZbaDg1VlFdtpXNRhoWGNg1SojpwAW4Vtfb6PXYe5OsCeaW3pyHMQuBa9uXEhpGRt33eO6MpTC/UXb4q2fAR33zV2LP9TheeekMeLsFAWvFniCy1Pw6oxLwavjlzjB2xqOBx5K4Z134xH6Z85cEy/MNnHyiRYuvKC4XVivr96nUfn37BTBxQ4PCt5iyeUu5wlg2fl1hHCW36+Uak0Ax0XsUvDqx8UTU93sfWLva66KtyvD87NMzH0pd9pv/0u2UPnSFy0MOMqkS4N+iDg1yMvEypUGzjjdgulmQa/4i4JXZ0IKXh2/qhK8/3O3iY8+bjp5b9osOy4GNm40cMzROuFZiCneXm7ikUfdWUxib4r/myyA7WtttKs1cvobi2jatNkVS5u3AJs3Gc7OtbRfrjDj11LwFmLN4u6VMGyrVpt5BfC6da7Ny+nG4O8hXRpatvf0mSmsXQe0rwW6dcu4rUhK556H2Y6v65SplePKILG
7Z81x29vSJSmrd+7MuPCcfAJw2qmV7aZR3C86nFIidGfPMbFshcv0y19K4/iBxblAhdOi8Gqh4NWxpODV8asqwevP5iXiwROKHkIRnZdfnA7NzaAl04hwdRNB5H9tl3Z27gRH4Ga3N7t+af/wc9ykEtqLgldLsPDy+QTwIQfbuOob5d8RpOBt2ba33FqDxla8pbx5J8yX08JHWv4S4m4l4t0TXbnmS0lPPfxcy/Fhl7G7eKm7c+1d8iI/bIg7n762yMQrrxpOfO18rjv5W5fMO7KFrvRy8CALw8/hDm8yLV54ryh4C2fWpMTajduVNVRW8RnPmfjr/IzQrOsMyA7Fjp0ZH1+ZqGXnd9SX0tg/xPSentCdv9D1tZVLXBBkJ8h5/iYD23cAW7YYzvNbugYOEBHs7ghL2dWrDbw4J5My+LprGyH90lwUvBp64ZQVEfHQo6m92fFOHWxjwHH6lxlt6yh4Wya4axewdq37JWbzloz487uxxDmxSvbLuBeBRHyNs69cIQR3b6/BjBeBV17P3C3zlHeIU1w45N95ZQjI7/zFOU3DG4rQPXNI5jBsUnhxh1dnSQpeHb+q2uH1UH38DwMd9216SMgJ5zWj6Q7FV0an0f9Y/eScS+jKoie7PPlCpYnw/XybgQO62JCYnSKUZZdEymdfS5Ya2LbdjRqgvSh4tQT15WXc3PoLN+JIHA5Zej2i4C3OtvJV59kZprNrN+I8/W+0uFa0XErmj2kzzL0v4/LiP2J4YTuy3ryx5sPdmD031eSwsLyojx5Z/i8UYXMrtj7xgZ63IOPP36YNcOIJyRS62XNHscyqvRwFr3IEVNsObz5c3qKkzWnvP6Tk39EVoXrW0HRRcVmnP5fCvPnurtFNN8hCpBfjLfGg4M03UqL/uwgQ8feM244gBW9xtvdSCMctMoPsME6bkdq7Cxv0ZTwXhVzzxpNPmXhtsftVLWjfN20yUFcX3fxWnAULK7VmjYE33jTw0cfunC1f7uTK/nrn7aIPPiV5O7rZxLjDW9gYyr6bglfHryp3eFtD5p2glkmoW1cEThwgAnfZcgPrP8odhko+410wvDih67V31y4bP/l5mz2CtzFSXzgKXuUPK4Ti3liMm78nBW9xxo2b4M3205UzABK7O2j68aCCV+7zNhLk/+eLJOMldKn0w1qevVsaLZ062Th1kCStid9uf3EjPH8pCt78jFq7g4JXx4+C18fPn9c+F1bPL3b7juYH3nLd74WZOuxQG0f3C2e34uZbSvOJm4JX+cMKobiXdjZuiQkoeIszblwEbyF+uoX2tLV5Q5JtPDzZdZlo6YCwfw6O27gvhIW4K7w4242/LV8LB/ZPo9OecxXa8xWFtCNu91Lw6ixCwavjR8Hr4+fPaz9siOWEi2otbqofvcRJ7NN7TwrjruGmkvWe4/l0yu6zuDREeVHwRkk3f91eRJE4pp2l4M1vv1x3eIJXdjglA6KEIMy+JCRhtqtStkASwbh2nSQrcXI87L28T+Xy95073Lq375ADsu4t3iE6v4tVMX66rfU+37whgnbSA27SDRnbkpLba58c3PWStGhdyoqzkL6UG30ncyhZbDl+nIX6HGcu9E+rvBooeHU2o+DV8aPg3cPvlddMPD2t9bz2n24F0mk3xmY5Qut4IuiQ7jauuoKCVzn0Y13cE0dxXPgpeAsbOhLHdlUDsH5D/tCChdVc/N0aP12N4JWyInAfmtw0KkGuOuPmytNav+X36n+R8MR8NbkrBBmNFLxBKLV8DwWvjh8F7x5+a9cZ+N97U87uStS7p8WazPODkzBqR/WVRBOFnaAu5Ln5dmoKqYv3Fk7Ay6w2ZlQaA/qH4w5TeCtyl6DgLYzkD39SA0l5Lpe4OXXvDshOZ65LxKA/iYPc01qIQn8d3k6wzA/eC7nzct7evUtCGaYbgYMPAY7qG43faNB5Y/0GE3f9zt1gkJc6uaTNte1sNKw298bzLSQ2uvTNSAGmb9e7MEsVd/cdv63BPza6ZeVFQvxyNX7QxbWiMkpR8OrsRMGr40fB6+N36y/csF9hxLFVmqVZcckE95vfNs14JIvBpV9NY//9w34aEHThCv/JrFEIeL7aN90Q7eHEYmhT8AajJqL2iSczO5kSLvD8kMKRLV5iOHXLDqLshMblKmTekHjoRxxuo2uOuLwSocQfWzxfdIfnX0xh7stuhsqrr4z265eftdhAbCEi+7KLG4uKvhMX25WiHRS8OsoUvDp+FLw+ft7kFcdPadu2GXjgQRPHHevG7vXCCJ022HKyHYV9FbJwhf3saq9Pon08/Gj8wpF5dqHgzT9C5dDSlKmuL6d8NZowVhehJfuJcTkAl92uMOeN7Njo8oI/fmzz2OUSVu2++93DvIcfbuOyr0UveOVlRn6jcsZD7Dvunyx84fB4fYnJP0pLfwcFr445Ba+OHwWvj5/nMiCHSv55bLrJgRAl5kiKr1hpor7eQls3UlmoV5gLV6gNq4LKvHBk+Xa1yoWCgrd18tNnSkIBN+5svhBcxdqwGgSvx0bOLjz8aCaTpLgN5MpGecrJFr54fvgv/9k2+uwz4De/rcHOnXBcMnKJ8GLtmvRyFLw6C1Pw6vhR8Pr4LVps4i9PZdIOnzkkjTOHVedbOwWv8odVZPHlK9xsV3KK3cum5wSsN4AR56XRt0/5xyMFb27j+nf95I4ovxS9PM/AzOdTscvaFuW80VJcW9lh9UKdXXNldOcaPKv/7S0Dj/055YjdUjyvyKkklsUoeHVmoeDV8aPg3cNvyVL5BOmKXUnxmErZGHWhhaOPKr/AUJq4qOJRLlxFNSjBheST7LKVpuMLKAt3S5f4f4aRNlqLkoK3OUG/C0Mpdv0WvGJi2nQzdln4op435EVw7Xrx1bXgD9d21+9c9wLZUR821MLmzW4YNnGLkHBnmzcZzuG/zVuAXbsMJ2V8fS+ryYFQuXf7nhBubWoM7Luv7ZRZv97YW5fUKRE3pB1xTRGt/X1HWZ6CV0eXglfHj4IXgF/sxvUzstLMBRePeuEquEEJKJC2gPfXAOs2SPB9dyFdtbppqCrZ1XX+tsFwFnRxrZEdrPbtDNS2j8fLVxDBa6UBs+kZywRYsHkXHD/Tme7BJbmicmHIfnIpY3IXYrhyzRuyu373ve6h41JdcUv5Xap+a55DwauhB1Dw6vhVveD1fyaj2M0MpnItXMrhHOvi/317DbZubd5EEbQDB9jo19tNWCJiRpKgyOJdiWHJlrxhYMpfUpF+0o+Dof2+pWLDM4eWNk2sF7pu4iXxiQ5QznnD2+Xt1Ak4uJvthGOTUGwS7kz+t3NnoK6zje3bgeUrTby+yMRHH2dGkvNiWesK5q2fAY2Ncr8bSq5znY3adm49Uod3SC6OUVTi8NtoqQ0UvDrrUPDq+CVW8L69zMCGjzJv+/JZy3v737Hd/XTl7aQJwij97ZQmKkvxci5cZelwCR7q/+zarZvtLJ4HHwx0Paj5zq18In9osuns7l59RRp1dSVoYMBH5NvhjeuBqoDdC3Sbd8BVbhZBNHpU8+gBgSpS3PTsTB
Pz9xyOCzPkmaJJZQtnKGHMpkx1PymcOTTtvHxEeXkvG1w3CqNMwVsYr+y7KXh1/BIpeL0YlYWgkeDnXx4Z/8gMhfRJcy8Fr4Ze7rLzFxh4YXYK117TiP32y1//w5NTWLbC9UscPy76UEv5W+Te0Zrg3d0IPPq4CYkgUgrhEbTNYd0nL8kSRUPsIpcITUlDXo7Mi41p4KmnU3uTNMiZgxOOjz5KQWssyzVveGnhxTf3zGFpdIn4BdET2P7EGdlcDjrAxtFHu8Jb/PRnv5RyEn98/fL4/JbD+l0ErYeCNyip3PdR8Or4JU7w+sWuiFiZkLxLdtTkksXJy2P/5tsmJK2wXOeebeGM08q7YCjNGVrxci1coXUgARXF1bWhNcH7y1+n8OmnTf0oZQfU+TTczUbHfYETT6ic39j2bUCqDZzQf9kuDGNGxSOjlvivLn4jhSGnp1FTZr/pcswbL8xKYc5LhjPXX39tacTke+8Bf3rQjf3b2uWtP3LITa6DDgT+5erGfMUS+3cKXp1pKXh1/BIneKfPTGHeAgOF+ON6nycPPdTGFVX89u0fSuVYuJRDOZHF4+ja0Jrg/fHParB7N3DQQTY+/bTpgTzPQCKAx4+Nl5tGS4PH74bi7epWUvtL/aMox7zxq1/XYMunQKl9mSVSRk0NcvrlC/fFSw0nmoO7yVJ6H+9S2z7I8yh4g1Bq+R4KXh2/xAlewSGhZPwha4Ig8lK53nJz9b59U/AGGSmlvydurg0tCV7vxTF7p012RiVMlESe8IuAuPidtmRR+Qz9yOM12LYtc4cIl29eEf0n89KPsnCeWErBu6rBwPKVxl4/5ssuTuPw+mh9dwulVA3+7IUwoeAthFbzeyl4dfwSKXiLQULB25RaKReuYuxVTWXi5tqQS/DKZ/Xb73Q/8U4YZ6Fvn9xuC7nSxV4wPB6uAdJ2ad/ipSYkvXPD6oxrhuzqSgxX7+BrnyMtfLrVgMRGru8VL5FVzt9GqeaN55438dK8TJIg6fO4i9I4ul+8bLFgoekkkpEoLKNHlsbdopz2z/dsCt58hFr/OwWvjh8F7x5+t99R4+wMX3dtY8G7w0oTxLJ4qRauWHY+ho2Kk2tDLsHr7UKL3/zoUfkXdhHIkx5I7f3kKwfzxMe3vqcbmq3UV67kH164uAHH2ejezYa0efbczCExaeN551g4/dTK8UmOmmsp5g1/KElJ/iDh/Oq6GOi0X7zErrCWrxt33+NmZRszMl2WsR21zQupn4K3EFrc4dXRylF67cbtoddZiRV6C7bkRe/XlwtYKRauShwn5WxzXFwbsgWv/8T65RcX5pvrD+/lZys7qt27wcmG1bUrHMEZxSXPz97NlYQC/fraGNg/d/QFfwrhuLtlRMGstTqjnjeenWFCbCbXmFGSKS3+c7X39dDjJuNLvgrIGJf/LUeEj1KPC+95FLw68tzh1fHjDu8efl5cxa+NT6P3kdEsrkpTlbR41AtXSTuTkIfFxbXBL3j9O7XFJsnwBHOPHjYad7u+vrmusISC7LpJcgx/Kufs5B/5hoy3cyf3FXJANl+9lf73KOcNf6zdShG7Ys+33jaw8h0T69bnHtuy+9u9qxvF5JijLRx4QKWPgpbbT8Grsy0Fb0B+Wz/bhsZ0GnWdOjYpwR1eF4d3Gls+p36hl4VTTi5PbM2A5oz8tigXrsgbn+AHxMG1wS94JSatHETTxgr+ZBOaxE4VF4NVq00n/fK6Ddjr+uA3rQiF+p6uUOjeNb8rhLCTCC5+39x8u7mtDaV58w1Mf86NA1aqlMJxH9pRzhterN1KTvbgphR3x7YcuvOPRbFthw42vvfd/C5BcR8HLbWPgldnOQrePPy2bd+BG3/yO7z418XOnccd9QXc+ZNrcUCXTs6/U/DC8c3zDtx4OA/YH/jq+Ebs30U3QCu1dJQLV6UyiUu7y+3a4C1aL87b6WSDk+u6bzdGmg1OfqOy8yv/5BIKnm1EwHbvLgLYxsp3DbRpYzgZ7eYvNPYeOCt0N7c1u69abeCJqaYjyI852sbYryRXrAQZ/1HNG/Pmm5j+nFnSWLtB+hvGPf4oJoccbCf6CyMFr27ERCZ4/zh5Onod2g2nn3IsalJljuatYPT7h57BY0/NxgN3fh/ta9vi6u/djvrDuuPHN0yk4N3D1S8gxCdsyVITK94xcMP1aeeNuxqvqBauamQZdp/9rg3iQ9q2HTDsDAtm00PrYT92b32yaG3bbuPmn+92hF65dtxEKMgOmbML3MLnYj8E8ZmU0/It+eYWC0zE+IznTUiWL/H9reZLO2/s2G5g0xbZ0Qc2b3FfcJYtz0THkBeaCePSVeX3mqTxRMGrs2ZkgvdHt/0Jjz45C10PrMOlY8/Hl4efjk777aNrbRlK/9MV/4nhw07CFV+90Hn6jNmv4Pof3oU3Z02CYRhVvcMrC6bszng+g9+8ohEHdy+DkWL4SO3CFcMuJaZJ2eNWOlaKtLJbdm7GD1++ETNXPYXj/vFLHLnrEuzXKY3vfic+Is/vCuElihB3g8Gn8IR8KX4Amnlj1pwUZs3J7b8tu/JeSDhxZYlTKLtScE3KMyh4dZaMTPBKs/627O94ZOqL+Mv0l51Wjh15Jv551Fno84VDda0uYemTRnwTP7nx647olevtlQ246MofYt5Tv0WnjvtUreBtaDBw3/3uzr1MoBKdIaqT4CU0d2iP0ixcoTWCFTUjMPnPJt56q/lWrufPOmxIYVESCkE88ZmxmLHqaXRp7I/Rn77qFN16/L/j9gt/Vkg1JbnXtoGf/XfKEUkMNVgS5M5Dipk3sl/gJGlQp04SxQCoq3NjNMt/80fHcF/y0jjh+Pi8bJWOcuU+iYJXZ7tIBa/XtE82b8XU6S/jgT/PxIaPN+GkAX1x8VfOw9BT+8fa3cG2bRxz5uW462fXYejg/k533mv4ECMv+z6en/wrdO+6v45+BZf+60ILkx5K47RTTIwbbaJD+9w7CxXcRTY9gQSu+/5upNPAOcNSGHCMgVcWWXh1sYWNn2Q62+cIwxnXp54cno+DuDCcfMt3sK/VE713XoK2dmcsav9jrKq7E5tu3BQ70k/PSOMv09yQVeKHP2pEKlQesetwhTZoyd9s/M/v3eyWYqeJE1Loc2Tr43bSg2n89RULI0ekMPL88MZ4hSJks6uIQEkE75ZPP8eTM/+KSZOfdQRvh/a1kMNgXTp3xDcvGYWvjjkntshlh/en3/sGzht6otPG7B3e2Da8BA3btRto26YED+IjSCAkApIcZb+OQPaxgjUf2HhhjuUIAe8SAdH3CBNfOt/EAfsX9kInAnfxGzbeX2tj3kIL27LCdW9JvYvHOx2FTu06YfP3NofUu/Cq+XSrjVkv25j3Snrvy8Chhxj4zxvcbHC84kFg9ssWHnw8jS+dn8I5Q41AGw+PTEnj+TkUvPGwIFtRSgKRCt43V6zC5KmzMGXaXKdPZ502EBNGn4NTjj8KK95bgwcen4kFi97Gi4/dXso+F/Qs8eE9/8yT8Y0JX3TK0Ye3IHxVe3MxnyarFlaMOi6ffVevMfDiH
[... base64-encoded PNG output of the RRT figure omitted ...]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
-    "# Use plotly to visualize RRT\n",
-    "\n",
+    "#| caption: Rapidly Exploring Random Tree (RRT) from start to goal. The green points are the start and goal.\n",
+    "#| label: fig:rrt_example\n",
     "xs, ys, group = [], [], []\n",
     "for i, node in enumerate(rrt[1:]):\n",
     "    # create line from parent to next node\n",
@@ -553,36 +579,8 @@
     "           mode=\"markers\", marker=dict(color=\"green\"))\n",
     "fig.update_layout(showlegend=False)\n",
     "fig.update_xaxes(range=[-10, 10], autorange=False,scaleratio = 1)\n",
-    "fig.update_yaxes(range=[-10, 10], autorange=False,scaleratio = 1);\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "id": "sNy4-jMNTonp",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png":
"iVBORw0KGgoAAAANSUhEUgAAArwAAAH0CAYAAADfWf7fAAAgAElEQVR4XuydCbgU1Zn+36q+wAVFuLgAisI1yuIGuIIL4IoYA4GMMJC4kajRTMzoZDRj/nESk5hMJtFEJzrRRBIdF9RgcEHAhUXD4sJiVBY1XFBZNAiIst6u+j9fFUXX7dv3dnV/Vd3V1W89j0+it86pc37f6XPeOvWd7zNs27bBiwRIgARIgARIgARIgAQSSsCg4E2oZdktEiABEiABEiABEiABhwAFLwcCCZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAUvxwAJkAAJkAAJkAAJkECiCVDwJtq87BwJkAAJkAAJkAAJkAAFL8cACZAACZAACZAACZBAoglQ8CbavOwcCZAACZAACZAACZAABS/HAAmQAAmQAAmQAAmQQKIJUPAm2rzsHAmQAAmQAAmQAAmQAAWvbww0ptMwDROmaTQbGVs/2wb5e12njhw1JEACJEACJEACJEACFUSAgnePsbbv2IVxV/0QV37tS7jw3MF7Tbht+w7c+JPf4cW/Lnb+23FHfQF3/uRaHNClUwWZmU0lARIgARIgARIggeolQMEL4Jf/OxmTHnnWGQX/9f2rmgje3z/0DB57ajYeuPP7aF/bFld/73bUH9YdP75hYvWOGvacBEiABEiABEiABCqIAAUvgM1bPsOOXbsw4Zof4/orxzYRvP90xX9i+LCTcMVXL3TMOmP2K7j+h3fhzVmTYBjNXR8qyPZsKgmQAAmQAAmQAAlUBQEKXp+Zh4//d3x74pgmgvekEd/ET278uiN65Xp7ZQMuuvKHmPfUb9Gp4z5VMUjYSRIgARIgARIgARKoZAIUvK0IXtu2ccyZl+Oun12HoYP7O3e+1/AhRl72fTw/+Vfo3nX/SrY9204CJEACJEACJEACVUGAgjfADu9Pv/cNnDf0xJw7vGs3bq+KgcJOFkagY/sawDCwddvuwgry7sQTOHj/9k4fOXck3tQFd5DzRsHIqqqAN3dUVadD7CwFbx7BKz685595Mr4x4YvOndk+vFy0QhyNCaqKC1eCjBlyVyh4QwaaoOo4byTImBF0hYJXB5WCF3Di69qWjQsv+Q9885KRuPCcwWjTpsYhe++DT+Pxp+c4URo6tG+Hb954W5MoDRS8ugGY1NJcuJJqWX2/KHj1DJNaA+eNpFo2nH5R8Oo4UvACTtQF2bn1X0/f/zNH2H6+bQe+e8vdmLtgqfPnY/rU486ffgcHHdDZ+XcKXt0ATGppLlxJtay+XxS8eoZJrYHzRlItG06/KHh1HCl4A/LbsvVz7N7d2CzhBAVvQIBVdhsXriozeAHdpeAtAFaV3cp5o8oMXmB3KXgLBJZ1OwWvjh93eJX8klqcC1dSLavvFwWvnmFSa+C8kVTLhtMvCl4dRwpeHT8KXiW/pBbnwpVUy+r7RcGrZ5jUGjhvJNWy4fSLglfHkYJXx4+CV8kvqcW5cCXVsvp+UfDqGSa1Bs4bSbVsOP2i4NVxpODV8aPgVfJLanEuXEm1rL5fFLx6hkmtgfNGUi0bTr8oeHUcKXh1/Ch4lfySWpwLV1Itq+8XBa+eYVJr4LyRVMuG0y8KXh1HCl4dPwpeJb+kFufClVTL6vtFwatnmNQaOG8k1bLh9IuCV8eRglfHj4JXyS+pxblwJdWy+n5R8OoZJrUGzhtJtWw4/aLg1XGk4NXxo+BV8ktqcS5cSbWsvl8UvHqGSa2B80ZSLRtOvyh4dRwpeHX8KHiV/JJanAtXUi2r7xcFr55hUmvgvJFUy4bTLwpeHUcKXh0/Cl4lv6QW58KVVMvq+0XBq2eY1Bo4byTVsuH0i4JXx5GCV8ePglfJL6nFuXAl1bL6flHw6hkmtQbOG5Vh2S07N+OHL9+IGX9/CvL/x/a7GD88/b/QqV3nSDtAwavDS8Gr40fBq+SX1OJcuJJqWX2/KHj1DJNaA+eNyrDsxGfGYsaqp5s09qK+X8Ovz7kn0g5Q8OrwUvDq+FHwKvkltTgXrqRaVt8vCl49w6TWwHmjMix7yP90aNZQ2d19+4q1kXaAgleHl4JXx4+CV8kvqcW5cCXVsvp+UfDqGSa1Bs4blWHZXIJ3v7adsOzKdZF2gIJXh5eCV8ePglfJL6nFuXAVZ9k1awz06GHDNIsrXwmlKHgrwUrlaSPnjfJwL/SpdGkolFg87qfgVdph7cbtyhpYPIkEuHAVbtU33zLw6J9TGDbExlnD0oVXUCElKHgrxFBlaCbnjTJAL+KRclDtuuevxPwPX8Knu7ZA/Hd/dMYveGitCJalLELBq6RNwasEmNDiXLgKN+xtd6SwebOBMaPSGNDfLryCCilBwVshhipDMzlvlAF6BT2SLg06Y1Hw6vjRpUHJL6nFuXAVbtnf/s7Ehg0mTh1k4fzzrMIrqJASFLwVYqgyNJPzRhmgV9AjKXh1xqLg1f
Gj4FXyS2pxLlyFW3bdegN335PaW7BXTxv1vWzU97TQrRuwfQewZTNg2cBnnxs4tIeNumjDXhbeiQAlKHgDQKrSWzhvVKnhA3abgjcgqBZuo+DV8aPgVfJLanEuXMVZ9vVFBuYtSOHjfwQrX1tro1tXOMK4W1cbXbva6FIXrGy57qLgLRf5+D+X80b8bVTOFlLw6uhT8Or4UfAq+SW1OBcunWV37ABWNRhoWGNgxQoTn34KdOwItGtno7YWaF8LrNsAx+c3+zr6aAvjvhJflwgKXt3YSHJpzhtJtq6+bxS8OoYUvDp+FLxKfkktzoWrNJb1hPH6DQbWrzewbIUrgAcOEHcIC0f1s9G2TWnaEvQpFLxBSVXffZw3qs/mhfSYgrcQWs3vpeDV8aPgVfJLanEuXOWxrPgBT7rfxI4drvA9c6jl/BOni4I3TtYI3pYZz4l/uY2TT7RQF9Bt5v0PDOzfxUaH5om5cj6Y80Zwe1TjnRS8OqtT8Or4UfAq+SW1OBeu8ln24ckpZ6dXDr1JiLPOMTvYRsFbvrGhefKPf9YGu3e74fJOON7CqAtbf5FqbARuubUG4mc+YWwavXrlfzrnjfyMqvkOCl6d9Sl4dfwoeJX8klqcC1dpLfuPjcC775notJ+Nhx9NoXNnG9dcmXb8feN2UfDGzSLB2rNpEzB7bgqLl7pfD2769zRq27ceL/rue0ysW28Gji3NeSOYLar1LgpeneUpeHX8KHiV/JJanAtXaS379DQTr7yWyUc8sL+N0aPima2Ngre0YyPsp931uxTEZ3zEcAuDT2m+y/vOuyY++NB96qw57pi85ebGQM3gvBEIU9XeRMGrMz0Fr44fBa+SX1KLc+EqrWU//NDAG28ZWLXKcMQId3hLy7+anjZlagpL9uzyyjg7a6iFvn1szF9oYv5CY6//uMdEwuVdc1Wwly/OG6UZSR+uNVBTA3Q9qLIyOlLw6sYHBa+OHwWvkl9Si3PhKp9lvR24fn1sjB8XTGiUsrXc4S0l7WieJYL3xTlmzrB44rMrUUJq22GvAB4/No1+ffOLK84b0djLX6ttA//5Y9e3+uor0oEPIEbfsvxPoODNz6i1Oyh4dfyqWvDu2Am8+pqJ0wanYZrN46Eq0VZ0cS5c5TOf+FrefW/K2Wk76ECgQwfbidsru3GSma1P7+Cn7KPoBQVvFFTLU6cnfD/dauCk4y0c3c9qcjjtL0+aWLQk42pz4AHAzl02Rl5oofcRzQUw543S2NF7KT72GBsXjYnfS3FLFCh4deODglfHr6oF7213pJwdjgsvsJxQPbwyBLhwlWc0SFiyZ2eYaFjd+gtYXWcbA/rbZQlZRsFbnrFRjqe+93dg3gITH//DaLYbLH7mw4Y03WHkvBG9lcSvWv6RHd7LL7HQvVv+nffoWxXsCRS8wTi1dBcFr45f1QreJ6a6p5Xj7CupNK2qOBcuFb6iCr/0VxPPveDupslidv65Frp3B3busLF9h4HNWwwsW27sFcPiv/etb5Z+d4eCtyjzJqLQ28tMrFhp7I30IJ3yC1/OG9GaefpM03kBkevyS9Oo71k5YlfaTMGrGx8UvDp+VSN4H/tzCh986O6abd9hO5+LK9EHSmnuwMW5cAVGFdqNT00zHRcbSTQhp+dzhSRLp4Ef/bTGeWaupBRr1xto3w6oq4tuIaTgDc3kFVtRdogz6YjEjR58koGTTzSwddvuiu1bHBsurifTZmQS0owZZWFA/8r7KknBqxtdFLw6flUheC0L+PHPaiBiwX8NHmRhxHmVN2koTR6oOAVvIEyh3yRj1cy4TOas/7nnTbw0z73pyCNsXDwhDUlRLF8tJGFF1642vhXwVH0xHaDgLYZaMst4wnfVaux1eeh9BPDFEY0VdZgqztYRxrff6b7kSsSM885J44gvxLnFLbeNgldnNwpeHb+qELyCKG0Bn36agXX7He4p1+uvjWdwf6VZ1cUpeNUII61g+QoTU6a6IaTkINumze7jZEx/cYSF/sdyhzdSA7DyJgRee93Ek880fVM7dZCF87mhoBopDQ3AQ4+6B1jjHJs7aCcpeIOSyn0fBa+OX9UI3mxM9/0p5fhC5vosrESaiOIUvPE3o3zmlJiq3tW3t+UkE6iri7bt3OGNlm8l1v7gwyZWvGNi9IXAh+uAV153e+HF+ZUDlps2GU4mQTMzZEvSVXk5lB3o4edaqKRgPBIXWQ6wyiW+ul+dkEbbNiVBFtlDKHh1aCl4dfyqVvAuX2nioUfcyeTqK9MVddJVafK9xVe+Y2LmC8ClX7XRsWPTHUEK3rAoR1ePHGCTNMTiOznx0tIdXqPgjc6mlVrz7kZgyxagvkcNYBhY8+FuTHrAjYLjv4YNlUQX0Y9VSdW9e7eB2XNMx81HrmuvacQBB8SfsN89SVqbpJ1yCl7d+KPg1fGrWsEr2LwTr7ILMfgUG926WqjvpQRaQcVfmGVizksmcvkyU/DG35DeDlCpfdEpeOM/NsrVwux5IzvBxVe+nEb/46Jzt5F+b9li4Fe/yWwji5vPmUNljo//eY102sYvb6/B59vcQ9VjRtno2yf+7Q463ih4g5LKfR8Fr45fVQteQecF8PYwllo8KM2nKi7+Yffd7/oy33RD010XCl4V2pIUnjXbwKy5Keez8eUXly7jEgVvScxbkQ/JNW988gnwyGOpvSmzox6rO3cCzzybwpI33J3d675dOQfoVjUYmHR/6X/TpRpsFLw60hS8On5VL3jlMMCCV4AXZ7s7Al/6ooWTTkjOG3W+4XHrL9wDERMvaWySYYmCNx+58v992zYDf3zAdISEnN4Wt4ZcoczCbmkxglfG2LyFBk4blEa7dsxqGLZN4lJfS/OGfKaXcxMyVuUFbfzY6BMmeHPbTTc0luR3EYYNvKQSSd14oeDVjRIKXh2/qhe8gm/xEgNPPOm+VUvUhmq6np1pYv4C0/EDPX6AjZ6HuYeeKHgrYxSIkLjrHtdX0vPlXbvO/RzaJaLDa8UIXi+r4TlnWxhyWvW8UFbGKAqvlfnmDe+wsIzPQSfbqK+3I0ue4H29u+LyNA49NFo3irAIenzGj02jX9/KaHMhfafgLYRW83speHX8KHgB/M/dNfjo49yB/JV4Y1/8/fcN3Dup6bHpoUNsfHmE4Rw+YQD52JsQEqcz+4CQ7PheE1Es3kIF78OT3fjAzGoY/7GkbWE+wSv1T/lLxt3g8Hobl10czSaD95J15cQ0evSoDPF48y1uvN2kHqSm4NX9wih4dfwoeAFI2JqHJru5yasxLu9HH5l4bxXQ0GA4wkQSH/zqp6DgVf62Sll83XoDd9/jvrjITu/JJ1o45uhoFvlCBK/3iZZZDUs5Gsr3rCCCV1rnuRvIS5m8nIV9eckacp1PCPtZYdUnv+F7/5BC4x79L23v1hU4+igbp5yUjK8iFLy60ULBq+NHwbuHn/cpqV8fG0OHWDi4e/iTsNJUJSnuLUS3/qeN9u1N7vCWhHo4DymVz2JQwbtqtYFJf3JF+IRxVqJOm4djseTVElTwSuIfSZYicaPDjp7g/+LRt
4+NCeOi2UEO03r+mLvyJURclcTv3bvkv132tTS6dAnzqaWvi4JXx5yCV8ePgncPPzlMIT5f3jXoZAsXnJ+Mt+pChogn/Cd+DTj2GLo0FMKunPfu2g38+o4afPY5cN21jU72taiuIIJXRMe996Xw2edM7hKVHeJYb1DB++ifU3jzLVfQSQaxYUPCizLi+e7KzvFFY9I48MA4knLb1FrMXfkNrV5j4MU5puOjP2yIhbOGVfaaRMGrG4sUvDp+FLw+fq8uMvHU024yiguGWxhUAXEbleZvVtw7xFZXZ2PEOQb69msM+xGsLwICkpFJdonkijpQfRDB+8TUFBYvNSBfTMZXwA5bBCapyiqDCl6BIzF6p80wnZ1M+Xwvc65kZNNcDz6SwoqVrr941OHPNO2UsvIF5OHJmf63FnP3nfdMHN7TQsp18a3Yi4JXZzoKXh0/Cl4fP+9wTTUv0rKjMPkx09mZk2ufDsC4i9KOXyiv+BKQ+J3PTDfx0Ueu3Y452sK5Z0WTZjif4PV84qUdlRQDNb7WrZyWFSJ4pVeyizl7rvtyJNdBB9n4+qVptG9feJ+ff9HE3JdN5wzCd74V/9i7Egpz9lw3pKBEZYg6JXjhRMMvQcGrY0rBq+NHwbuHn+dDVQk7A0qTByq+eFENnnjavfXE4y2MvLCyP6UF6nQF37RmDfB/j7gxlf1X2J+Lpe6WBK8kMlm20nTC/Ek7ovDPrGATVUXTCxW8HhS/D6vs9vbrAwwdkg4cWs87pCb1nX+e5XzliPMlv5UX56TQsNoVvBKC7KQTLey7T5xbrW8bBa+OIQWvjh8FL4CP/wHceZf7rYiHa9wBdf8DNXh3FSri06DyJ5CI4vJ5eMrUlLMTP3iQjeXLjb27ZtJBEb5DTrew//76nXq/4H3jTQMfrjX2ilwPZvduthNaiVd1EShG8H76KfC/v6/BZ581ZyUbEGcNze/q4LnQyDgfPSqe4078dRcvdV8I5cxI9tW2DfD//iPZLmQUvLr5gIJXx4+CF0AlTJZKMxdUXJJwyKQsfrxyMrgaPrUVBCimN+/eDbRpk2lc9ufidu1sXHOl3p6dO9Ri3is2/jKtscmOsuxUSSKBvr1t1PfSC+uYYmazWiFQjODdtQv45W9SqD9MdmddsSpuDqtWwzmsJZcIXxGzA45r7qbjD8kXNxeaxrSN6TNSMExg6Rvulw+5ZBd78Ck2eh9p4+OPgY2bDOxfZ6t9mOM+OCl4dRai4NXxo+AF4AUoj9tkqTRtUcXfWmY6Prxy/du1Bjp13l1UPSwUHwIifOUlRj6falx2ZIdq1lwTS5ea2Lbd7Z+I3IEDRIxYFZO+NT6WSV5LihG8rVGQLxdelALvPhG+A/unnVTo/kyDcXOhESH+3Asm3n0vs5srX2BOHWRXbYg+Cl7db56CV8ev6gWv+FLdd39NVaYVzjV0ZAGRFwDZifjWFUDX7sn+xKb8+VRUcS9cUyGiV8aDHIgTH0sRzN7V5wgDp5262xEdvEjAIxC24PXqlXl68dLM4Tb57zKOO3eCMy4POhD4l6vjMVdJW+cvdLMLepeTCOYoq+p/LxS8urmCglfHr+oFr+fOcOZQC/IPL8DLjnVId+Cgrja2bbNhWUDPnsBpp1R+aJxqtbGIV4mzLP6DsjM78dJ0i7uynsBYtqJpAPzBgyycfmIb9DnSrPq5o1rHUWv9jkrwes/Mjk3rCV9JT9ylrrwW8R9Ek5aI24J8/ThzCL9+eJah4NWNUQpeHb+qX7ToztB8APlPTGf/9ewzLQw9gy8Gyp9d2YrnE71i+1lzMr6G0lD5DCunyD23hXxhycrWOT647ASiFrz+DnrzlIxPeXkr15VL6Ip/rmSQq60tV6vi+VwKXp1dKHh1/Kpa8KbTwI9+6kZnuOmGxqqfnEQMzZ5rYt4C14f35BOAQw5Jo66zjc1bDHyy2YB8mkt66BzlTyr2xd3UqzXYvDkThUMa7fn55hK5/k5R8MbexGVrYCkFr8xXt/6ifPO3uFK8ODvj6uMdRKPQbXn4UfDqfpoUvDp+VS14BZ2XSreaXRo+/NDAg49kkk0IlwkXGTjpBGDrNh5aU/7EYll862du6l/vFLws1l7Gq9YyPklnKHhjadJYNKqUgtc/f0viBvkKUYpLDqPNnmPu9dGl0A1OnYI3OKtcd1Lw6vhVtOCV0FmSavG4Y4qf6LxDazJp3XRD+T6LKc1YdHGZvCfdn0lvecABwLlnWziubwowDAreoslWRkHPh11am8+v1+sRBW9l2LYcrSy14PXOG4hv+YjzonW1yha6Ht9qypSmHVMUvDqCFLw6fhUreL1YsdJ9LzNPfS8LffvY2LTZwLTpJrZuNXDp11pPMenP0CNBvyX4d7VcfrGb7QdX6oWrWpjHsZ+eaJC2jRklQf5bFw4UvHG0YjzaVOp54+e/rMG2be4akO/LRLGEWtrRre9pYcqTma8kXhzqnoe5saglxTGvpgQoeHUjgoJXx6/iBK8I1IcfdU+ayyQnhwK8z7ItoRAxd/wACzIReUkUZGd33YZM1ptyH3xQmrHg4u9/YOCBh9ydXTlJPHpk093tUi9cBXeABUIlUIjopeANFX2iKiv1vDF/gelkFPQyl0lKYUktHMYl4fgWLMy4Lkid4vrm99HduBH44//VYMuWpk+UjZOvX56GZBzklSFAwasbDRS8On4VJXhlYlm71vU1lBiM48dazoQiInj5ShOvLzLxj43AQQfa6N7dBSNuD/5LBK/cn+sKmsZSiTwWxT2B06+PjfHjmrtylHrhigWUKm+EX/Refmka9T1zL9YUvFU+UFrpfrnmDX9kmcLiTBt48y0D4tO+YyeweZMrnjdtbtpJcZnIF15MNlFWrXYFuGzCyIbM5Ze4axQvlwAFr24kUPDq+FWM4PWnj5S0pXJIIWjIF8nWs6rBnYi8q64z0Lev5QQub19rN8nmIxNmfU9g2BB9GlaleSIr/uxME7I70lJ2onItXJF1mBUHIvDw5EzAfO8F0P9lxL9ord24J91aoJp5UzUQKOe84UYfcV0MRGxeMFzcc3KLTRGny1a6X/i8dL8t2ecro9Pof2xhotX7HVH0NqVKwaubBSh4dfwqQvD6fU1zfX4PisCLuTvxksacGW9EGC9a0jSjlEx0MuEl7frDpBRWv29gwri04/ecfZVz4Uoa60rpj3+XzIva4LVd0rn27WPhyCNsdO5Qi/06GhUxd1QK+6S0Mw7zhv8gpozbEcPdzREJY7Z4qYlly40mWQPFnU02UeSezp1sdOtmQzZE5LJt5+xuUZd3zkR+S+PHWS1+MSmq8gotRMGrMxwFr45f7BethgYDDz3asq9p0O57i7nsWl1/besCVnYKpj8nE6N76qCQT2RB21PO+yTlpexAyPW9f0ujwz4UvOW0Rxye7Xdn8EL05foy4rV1/y6u65CIA4nRvHIlMHiwjSGnheM/GQcmbEPhBOIgeKXVMnanzXDXDZm/JWSZfzfXy4I24Dg7UpcD/+Hq1tyECiddmSUoeHV2o+DV8Yu14N2x3cCt/+0KM83OrpUG
fv6rlDP5tbS7m41RklK89LKJRT5/rNY+kSnNUNLiXuzhltwZpDFxWbhKCqZKH/bIYym8vczdxsoVpSE7nWuH9sC2FrwZvPTckpzkpBMpfqttSMVp3vAfcPbs4GUNlINnpbqmz8wk8wkSBaVU7SrHcyh4ddQpeHX8Yi145S19ytSUk9pUkzqyMQ38/r4UDu1h44sjCp/osj+Rdepk4/RTLbRtq4RfhuL+MGy33NzYYgvitHCVAVPVPPLV10089YwZOKyTvAgeelB7/GOjjVeW7nRCAMqn4I8+MvD64qbffq+7tnHvp+GqAVrlHe1gH6cAACAASURBVI3jvOH503bpDPzrtS3PeVGaLtcXlCif11rd8huWkGnFumpo2k3Bq6EHUPDq+MVa8HoTlYTMkh3ecl5+/0ZpxwnHWxh1YeHiuZx9kGd7TMW3bfSoll074rhwlZtd0p4vPo133eMe8hkzKt3iAZ/sfueK0mBZwNyXTcfnUS7TsHHG6RZMs0gHyKTBrpL+xHHekHEu5zfkC99//Hsa7duXZy3xi95c4dM+3wa89ZaJk0+KZl0RDrKOzXnJxH772bhiYhod9y3twKTg1fGm4NXxi63g9QSmrJc3frd8k5Qf7/sfAA8+ksK2be4iLqKxUiI5yKlkf5D0fJmJ4rhwKYc6i2cR8L5cFPoFhWHJOJRaIhDXeSMumydLlpqYMtU9GyIhIYcNtZwDdP6DdC2FitSMOgm19scHzL1rl9R18YS0cwi1lBcFr442Ba+OXywE70t/lZBhJo492sKA4ywniPd999c4PStk50mJInBxcbV4cY65N+GF/yRw4EpKdKOwnDY9E2pKsgHt1xEYdIqFI77Q8mQX14WrRNgS/xi/a8t13249G2E2DArexA+PojsY13mjlCmI88GTqEMPP5pZP7z75SCdXLITHWa6Yv+XHHm5PWtoGp06YW8SpnztDfPvFLw6mhS8On6xELz+2J/+7hw/0MKXvxTN5x0lNid5xey5qb2xfeOatMKb6GUyHXyK7WQKCnLFdeEK0nbek59AUNeWXDVR8ObnW613xHXekC9csolS6NeMqOzoHWTzokX06205oTL9sYTDiA7kP7gnIvqaq8obYpOCVzeiKHh1/GIheKULMiEtXprC394yUVNj4fgBNoafa5XFsb4QpDKhSOgZ+Swll2TV+eeL4pOwIl+CiZb6GteFqxDb8N7cBLzDoMUuqBS8HFmVNG/I2jJthpuOXg5q/egH5Tm45mf2wmwT8+ab+JdvNv+64hepxcTwlR1dSYss/Z2/0E2sIb/1y76WRpcu5R27FLw6/hS8On6xEbzKbpS9uN/NQSapMaPcQP3lvrwQZEHDsXntpeAtt+WKe37DGgO9Dsu4qshhMjmR7b+8BCzFugtR8BZnm2ooFad5Q1wHnp2RSSQkou+L51vo07u0fqvF2t3/5VP8eiXmdX1PC926oVmWUe9Amidw/c+UsnJAOWhm0mLbG6QcBW8QSi3fQ8Gr40fBq+TnLy6TjhwEksQOcnkB/EN8ROCqnOQZMzNt+ZerG3HQgYGLMw5vcFSxufO11008+YzpfLbt3t3GqlXuLk9LV6G+u149FLyxMXnsGhIHwSvz8LMzMu5mhbpzxQnq9OdSmDe/+W9Y+tS5kyt827Sx8f4HmRTJ8rduXeHMAd272oGjr5Si3xS8OsoUvDp+FLxKfrmK+8PPiPiQsGp1dRE8KEeVsqshSQQWvupOgDL5nX6qjSGnF7bbHIeFqzTEkvMUiYk76f7mh2Fa6qHseP3rtxphpgoLHUbBm5wxE3ZPyjlv5NrllE0HSTIRh93NYlnLS+v69cC6DQbWrWuaFtlfp8z1E8amHV/guF4UvDrLUPDq+FHwKvm1VNx/ElcSVYikGHuRhR4Hh/85TSZ6OUC3aIl7wtd/XTkxjR49Cn9mOReuiExSFdV6L1teyCPxKc++PB/Bj/9h4MbvNqK2XWFoKHgL41VNd0c1b0h8Z3+iBJlf5dzE5i1wROD6DU3nPhn/559Xuo2GUtnYO3wnL6tjRroH0NZtMB3XDbmGDbFx1rDyHkxrjQUFr26kUPDq+FHwKvm1VlyExW9+WwPxo5RL3sD79QHqe1noeZgdyq7v8hUmps0w9oZI8079SlxHSSgwfmzaySNf6BXVwlVoO3h/YQQ8v78gdt+120bbNoXt7kprKHgLs0k13R3FvLHyHRP/97CbDbB9e8OJZJDratcWOLzechI3fOHwZFKfPdfAi7NTOPvMNIaekZnX/fF9i/XNLwUxCl4dZQpeHT8KXiW/fMX//EQKa943sG27gZ07mwpPeUuv7+kK4OOOsWGm8tWW+Xt2dAgvvqL/c9Ynm4AuRbpSRLFwBe8d7yyWQLGHFAt5HgVvIbSq694o5o0PPjTw+z+mYPk2Lus6A716iR+r+KvaqO9lV7TbQtBRIgxeXWTixOMtpLLWC+/rTlxCr+XqEwVvUEvnvo+CV8ePglfJr5DiIlJXrzGwqsHEshVNP8F16AAMPcPCwP6t+5t5fmoyucklux4XDLdCP5gQxcJVCCveWzgB+cwrPrzi1hLlgUkK3sJtUy0lwp43ZL57aLIb9lHmui+PtHBUEV+sqoG/P4XyOWdaOPYYK5SviGGyo+DV0aTg1fGj4FXy0xQXgbJ8hQjgzEGEXAkstm8HlrxhNkk/Kc+NMsNb2AuXhhPLtk4g+1S6d/ehPWxcMTF8fz4KXo7IlgiEOW/449HKvDh+rOXEOefVMoGnnzXxyquZOIQifIecUdiB5Sj5UvDq6FLw6vhR8Cr5hVVcxO/sObLz6/pUygQvYWeyD2N4Pron9LdxUNfoJv8wF66wGLGepgSyd/vlr4MHWU4oIkl93a4d8K0IMitR8HIkRi14w844Vi0Wk8N9ixabjhvd4qUGDjnYwlXfoOBNiv0peJWWXLtxu7IGFg+TgBw2E3eFXPFTTx1kYdiQ0oTYoeAt3qqNjUBNTfHl85UUoTtvoYkFe7Ioebv9w4Y0PZWefbI9X71B/07BG5RU9d0Xxrzhj3Aj/rkTL41H0oRKs+a2bQZkLtpvv+g2Rgplwh3eQok1vZ+CV8ePO7xKflEVf+NNwwllduQRNmbPNTFvgekczihVLvQwFq6o2MSpXvGXXfKGgU2b0STRg+zQDz7FRt/e4frRzV9oYtacTJD51sKPRcWJgjcqspVfr3beWLXawMOTXT90OXw1YRzFbuWPikwPKHh11qTgzcPvhZcW4dof3NHsrkUz70W7tm0oeHXjrySlZUfvrntSTpixKA8j+TujXbhKAqbMD5F00tNmuItza9eA/haO72+pAsJL/M1pM1J7d/5zReUoFQ4K3lKRrrznaOYNf2itgQPchD28kkWAgldnTwrePPyef+l1/Met9+Lxe3/U5M7DDjkIhmFQ8OrGX8lKewHHxYf36iuiD6iuWbhKBqWMD/r7KgN/fMCNCyTiU2Idy//WdXbDI4kYlmgc4kfnXRJCacxoC70OC/6JMTtdtewcXzDcRt8+5fPLo+At48CL+aOLnTfky4WXPKFUL/U
xR5nI5lHw6sxKwRtA8P7oV3/ES3+5M+ed9OHVDcBSlvaSCpQizmKxC1cpeZTzWZJM5IVZJg7tgVbFpwhWicQhh8hkh36fDnBOTXfvamHtehN9jrSw//65eyK+3PN9frpxEQIUvOUcefF+djHzxpNPm3htkRtZYMRwNxUwr2QSoODV2ZWCN4Dg/c4P7sSo4aehXbu2OLF/HwwfdhJq9kStpuDVDcBSlvbHWYw6m04xC1cpWVTis+77YwoNa5q7P8iurYSY69vHdsIuZbsvxC1NKgVvJY6+4G2WlzkzE9kqeEEAhc4bkinyocnuw0ZdaOGE4yl2CwJeYTdT8OoMRsGbh9/flq/CjNmvoFPHfbB2w0Y8+uQsTBh9Nr7/nYudklu3N+oswNIlJfDm28Af7rfRvhb4wfeA9rWFp4YN0uB2NSbk1NzO3VyAgvAKcs/mT4EFrwDbd9hYu9a13bt/t7F9R6a0vIem97gu1tXZmPBPBo74QjQ2DtLmXPc4ooZzR7H4Yl3uv26z0WjZuP5fjKLmlkLmjU8+Af7nXgubNhkYfo6B88+JNRo2LgQC3twRQlVVWQUFb4FmnzJtLn7wi/uw9IU/OLu8W7ftLrAG3l5uAvc9APztbRFMwCHdRUABhglceZmNjvuGI47atpFdFwO7dvPgSNT2FtH76uuGY1NP/B5zlI2vXxyOLcNuf8cObVzBy7kjbLRlr+/u3wMr3wOGnwOcf3bhzSlk3vjlHcCH64BjjwImuvsvvBJOwJs7Et7NyLpHwVsg2pcW/g3fvPFXeH3GPaht15aH1grkF4fbn5iaanIYymvT2WemMfSM4AeiWutLoZ8m48ClktvgP7TTti0w9isWeh8Zz911ujRU8khrve3e4VjDAG6+qRF7PN8CdzjovCH+6fKPuPNcfnH0h3ADd4A3RkqALg06vBS8efg99MQL6POFQ3FU717YsvUz/Pst/4s2NSncd/uNTkn68OoGYKlLSwai2+90PylPvMR1R2lXa2DjRuDoo2zIQhXGFXThCuNZ1V6HxB6d9Cc34sOEcVZZIzAEsQUFbxBKlXvPzbe488stNxfu7hZk3vD77V59ZZrpgit3qBTccgregpE1KUDBm4ffbb97FH94eNreu4476gv47x98Ez26H0jBqxt7ZSnt7e7KIafRo6JzNwiycJUFQMIeKi8wd9+bcmL5xiUKQz7EFLz5CFX236MUvLKD/NCjlTXeK9ua8Wo9Ba/OHhS8Afjt2LkLH2/cjI77dEDnTvs2KcEd3gAAY3TLbXe4CSiu+3Yj6uqiaxgFb3RsvZr9CUUkEsP4cdG9wITZGwreMGnGr64wBe+GDQb+3mCgocHAqtVokqRFwiseP8DCgP7huGHFjyRblE2Aglc3Jih4dfzo0qDkV8riksxgytSUk+BA8stHeVHwRknXrfu+P6XQsNooacroMHpFwRsGxfjWoRG8NUYKf3vLRMMHaSxekkmB7fVW0qPXdQaWrfAlZIkoDXd8CVdvyyh4dban4NXxo+BV8itlcW93N+oYvNInCt5oLTt9pol5Cyrz0A4Fb7Rjo9y1//yXKWzbZgRKAiFfKZYtN7D+IwPLl5vYtLlp6yWeb58jbfTqJbGmLScLoVziyrN6TSYhi1dq2BALZw2L52HNctslCc+n4NVZkYJXx4+CV8mvVMW93V051Xz9tdHu7lLwRmtVf0SGSjy0Q8Eb7fgod+1eBAVph5wVGNg/jV693FaJwF0lLgprDKxaZWD9hqanZCVU4hGHG+hxaNr5EiWJVPJd2Wm4ZRd4/FhGbsjHrRL/TsGrsxoFr44fBa+SX6mK//qOFD7ZbKAUu7sUvOFb9bXXTWz9DNixE5i/oLLTqFLwhj8+4lajX/R6bRMhmi1wa2ttdOsK9OtrO+myj+1XAwkVU0yM5nXrDTz8qJuCW65SzXVxY5/k9lDw6qxLwavjR8Gr5Feq4hq/umLaSJeGYqjlLrN7N/Djn7mhnvxXpURlyG43BW94YyOuNXnRYETkykuaJ0KlvbJzW9/LRn1Pa+/Or9cP7bwhO8jPzmgeZ7xfHwvDhgbbMY4rU7YLoODVjQIKXh0/Cl4lv1IUl52Pu+9JlfRwk3bhKgWXSnrGgoXmnixqBj762MZbb7u7vOKicsFwO/axd/2sKXgraeQV3lZ/rG8vGsw/NgKffWY4Yre1K6x5Qw68zX4p5fj6+i/5vZw1lJEdCrdqPEpQ8OrsQMGr40fBq+RXiuJyKOThR0sTnSGsnZpScKnkZ0g80ilPuiHm5BJfyWFDKsNvMUzB29gIvDDLxFF9bRx6aH5/z0q2eZRtt2zADCnpjCbWd1iC189KRO+CV03ncJz3exHhK24UnTvBcaXwfIyjZMy69QQoeHUMKXh1/Ch4lfz8xT/5BOjUCQWn48zXBM+frpSfwKNYuPL1sxr/7j/AJv0fMdzC4FPCO6X+3t+BRYtTTr377huOoAxL8MrnaxFXEqLqoAMlU6CFujob3Q4CugU47FSN4yVXnxe+auKZZ01n91Xi2vY8zHZidD8z3XQE4WmDg4+nXLu7hXCOet6QA24vzsn4+frb1qWzjauusNC+fTjjvJB+895gBCh4g3Fq6S4KXh0/Cl4lP6/4Bx8YuOc+1+0g7BPGz840nYNOYYuh1roe9cIVEvZEVCMiY/bcjN+i7F6NGZk5GV9oJ6W+2loDDauBefNNrH4/WIipoM8JQ/BKetkpU5vHafXaIIehLr/Eck75f/ChhL0ysU97C+1rbeyzL3DgAUFbm/z7Vr5j4v8edl1kvEti3UqIMBF///HvwaO6aHZ35dmlmjfkC8mq1a7wXbceew/TnTrIwvnnBRf4yR8d8eohBa/OHhS8On4UvEp+Uty/U+VVJxPvKSfJjpX+AV6CgomXNJbs012pFi49neTUICJw2ozMZ9ti3Bw895dsKueencYZp4Wz86URvCLGn3jSTbghl+xKyo72zp0i0AysX29g3YbMISkRvpJ2Ofv6t+80Ol9TeGUIZIf38v4S9MuQdne3lII32+7eOQf575UY6q9axjEFr87SFLw6fhS8Sn7+UDqyOPfrAyxe6i7QslN3VD+gR3cLh/RwxcaWrMDs8t9kod+8xS3jnIjeZDgHnLZskf/NLPiXX5J2TkeX4qLgLQXl3M/wh4SSMXXB8PyHdCRt66zZGSHpiUkZL0ceYaPHIeGNm2IEr7wUivuG9E2ufP3KZtCvL7BjO5zfxX4dgS+PbERNTUhOq+UzdSRPtm03scOKd0w8O8PlPXCAjdEjW9/p9XZ3Nbuk5Zw3vDFz9plpDD0jvPEeiZGqtFIKXp3hKXh1/Ch4C+AnnwhlB0piUYpAXbcu8ynN78qQ/Ym6gEe0eqv/M29YdbZUTzkXrqj7Vgn153JzGD/W/cTvv+TT7otzMkJXxsjgU2wMOslC+w7R9LQQwStCd/kKA9NmmHt3amXnesTw9N6sWy21UtyE2rQBunaleCnWkvJC/oc/prBrl/sCLrv8n33m1iYv17J7LmNN7CTzmoyfq68o/vBkOecNEfpvLzecA5AG34
WKHTKRlqPg1eGl4NXxq0rBu7sRaJMVFlUmy82bZVe16Y5rtrDNhVsm1x/9oLHZn7yT+LKwtG/nzsCdOjVfvMXtQRaa2nbyv0BtO8k3b6NTZ0AyF8l/e3iye7hHrgnjrMjDWJVz4VIO6UQVzz6k47k5yDjNJXTFPcBL3xoViNYE78JXDLz5tokP1xqQCAz+S9wXxA89SPatqNpejfXKWPnt71I5XUOyeYgovuziNLoU6YrFeaMaR1jwPlPwBmeVU2vYtkgVXsUSWLtxe7FFK7Lc22+beORxc28YqPbt3U+tc182kc5ztkNEaX1PoHOdGw5HFnBJr1nTxsbJJ0Y/DKfPNDFvT5auoH55xRqJC1ex5KIp5//EX1ODvWLS29EthdD1etaS4JUsWcuWNz08Je3ruC9w7tmVFWs4GiuWp9YlS+WAoGuXfn3svREwvJdr+d/OneEcfNu6NeNbLREf+vZ1X8SDXpw3gpKqzvsoeHV25w6vjl/V7fCuWm3g4cmZz6vt2gI7d7kQ5WSz7MDKjmvnTvbe3dYDDwQO2N/993JffuETpejlwlVuSzd/vnx6nj4zs9MvfpkjzsvvGhB2T3IJXjmIJskCROCeOdTGwP7R7zSH3a8k1ucXu/nmi4YGA68vMbH0jYw/wInHWxh5YfCoB5w3kjiKwusTBa+OJQWvjl/VCV7Bleuk+FlDiw8DpTRBwcX9i1iQwygFP6CE4YWKaVu1l/m/h0ysfNcsaeY9P/NswfvkUyZeW2w6YtcLJVbtNopD/wsRu9nt9acWnnhp8JcqCt44WD6+baDg1dmGglfHryoFryBb+Y6Bx6akcMHwtHOCudIu/061uFZMGBd8UQrSVy5cQSiV5x45YHTXPW6WNs2J+mJbny14f/bLFLZvMxgOqligEZTTiF1pzrvvGbj/wZTTMvHrvWi0FSgTHueNCIyZoCopeHXGpODV8atawavEFovi/pBoYSe84MIVCxO32Ah/3NFSHGLM3uH9dCvw2NM7HTcGL04u45/GY8xoxa7XC/kSNukB98VqQH8bY0blT2DBeSMeYyCuraDg1VmGglfHj4JXya/cxf2LkuzEXH6xG1JIDrdJSKeTTgjuf+fvCxeucls2//O9tMTaUFL5n5S5Q8bbKwvb4q+vZMaVl9VL7hr7lTSOObryvpgUwiDO94YldveK3s3A7XfUOO4qN91AwRtn21dC2yh4dVai4NXxo+BV8otDcfnELdnYvDiaEkFC/r9ct9zcPFxakDZT8AahVP57vHB1+RI5aFsqIfYWL82kP5b6JETawP6u77v/MOWYUZIoo7gXLW07q7l82GLXYynJKzp2BE4/Nb9NOW9U8wjM33cK3vyMWruDglfHj4JXyS9Oxe/6nSt65RIBdNnFFg7uXtxuW5gLlwjyp55J4f0Pge98K42U6xrIKwQCEkrvyaczQlTCTp1/XvGJA/xNEpG7bKXZxG1B/n7OMBPnDDGxCzua9ICiNwSDtlDFn59IoedhNk5s4YtNVGK30B6FOW8U+mzeH38CFLw6G1Hw6vhR8Cr5xaH4qgbg4UebB5bPF4aotbaHtXBt/czAvfeJH6Arwv/tXxvRri3TIIU9bvwJKrS7vctXmJg918DadRk7SZ1yuFN2+fr0bO80P1cMb7/oHTrEwtnD8u8Khs0iafXJgUA5GCiX323J62dcxK60J6x5I2k2ZH9cAhS8upFAwavjR8Gr5BeH4vPmG5j+XMoRlMcPBDrtZ0M+Q8olO36jRxUewSGshcvbdQ77UF0cuMetDdnpiAvd7c2VplhEbr/eluO24F35Ugv7Re9JJ1r40gUUvdqxIi8h02YYzgEyubyMe6vXZJJKaF5wte3zyoc1b4TVHtYTLwIUvDp7UPDq+FHwKvnFpbgkz5AkGt7lj+CQa1coX7vDWLhe+quB515IObtS11xZuOjO10b+PTcB2e2dNsNNriLszxoqPrW5XVvE3UQOv81fmIm2kG+HOJ/glVZJ2LxJf3J3JRm9IbyROvWpFF5f3PwLSRzErvQyjHkjPFqsKW4EKHh1FqHg1fGj4FXyi3NxfwQHETHjx1mo7xnMpzeMhevHt9ZgdyMFTznGSLOsbP1tDBuS8e2V3dz5CzNZ26SNQbOkBRG8Up+XvEB2msePy3/CvxycKvGZ/t+1Z7errwjHb1vLI4x5Q9sGlo8vAQpenW0oeHX8KHiV/OJeXHbwRHgsW+HuCo0YbmHwKfk/MWsXLi9OrOwwXn8txU65xkn2bm+/vnazQ2iSuKSQTINBBe+O7QZuu9PdaQ467srFqZKeO32m6YQdlEt+X+Lm0PtIG18bX/7fmXbeqCQ7sK2FE6DgLZyZvwQFr44fBa+SX6UU9/tVBsnOpV24vOeJr6H4EPMqHwHZEZRDjV4ED2nJ/l2AzVsMfOPyRhxycLBdf68HQQWv3C++pw9NdsXZ976bRocOhT2rfNTi+eQnnkw5LyxyeeHfRPweeogdKBNa1L3SzhtRt4/1l5cABa+OPwWvjh8Fr5JfJRX3EhVIm/OlI9YuXLfd4WZomnhJY5MDT6XktXs3nOQbvAD/jvuYkW7s3GKvQgSvPGPOSwZWvGPi0q+m0a5dsU+t7nLZsbYvv8RC927xe3nQzhvVbeXk956CV2djCl4dPwpeJb9KKx70MFuxC5d8QpcdJ283sVyfsp9/0cTcl00MOtnCBefnd+GoNDsW2t4wd9wLFbyFtpX3NyXg36EXF4bxY+MpdqXVxc4btHl1EKDg1dmZglfHj4JXya8Si/sXUDmo1PsIYPTIRqRqMqe/C124ZPd41pymJ/3Fd1OuUro1fLDWwHPPm1jV4D77oq9YOPZoCl7JxNew2sD4sWmIH6/mouDV0CusbEupwwurpXR3FzpvlK5lfFIcCFDw6qxAwavjR8Gr5FepxeUT6a2/qNnb/OOOsfFPYzK+tkEWrlwhrSTervgISxgs2e2dMrXlgPlRsHvyaROvLXJ9RsvpThFF3zR13nyLa+tiU037n03Bq7FE8LLyNWbS/e6hP/ldTbw0/qH9gswbwQnwzqQRoODVWZSCV8ePglfJr1KL2zYcMbprJ/ZGcPDH621t4coldFs66Z8dGm3MKBt9+0S34/rGmwYen+KmYf36ZTwsJ+Nz2XLDObQmNhLRpL0oeLUE85f3H/arpLBuFLz5bVvNd1Dw6qxPwavjR8Gr5JeE4n6/XumP+N2eN8wEDANbt+3e28VChK6fS3ZotCiC5H+yCXjlNROLFmfcKm74tzT23Uf3+T4J9vXi4YbFnYI32lHhP1waJKJKtK0prHYK3sJ4VdvdFLw6i1Pw6vhR8Cr5Jam4J4ykT8ceBXz5QgNtandDdmkXvGpi/p7Yn/L3QmO3Shl/aDQpP3pk8cHyJXHCug0mGhoMrFoN57Ovd0kkgBMG2jj3nDRSrndDVV9exIywMp5R8BY3nGbNSaHTfhaO6mejtjZ3Hf7fSFgvKMW1trhSFLzFcauWUhS8OktT8Or4UfAq+SWtuHxKnTbDcEKK1dXZ6HZQxuVB+iqfV4cNLf6UuKScfWKq6
dQf9MS57BDLIbSGNQbWrTOcw1f5rnwpdfOVT8rf5WXl9jtrHNZhJQCh4C18dHy+zcB//dL1Z/deGOt72ajvaaFbN8lyB+SKsVv4k8pbgoK3vPzj/nQKXp2FKHh1/Ch4lfySWFxE0qOP1+DDdZneHXO0jXPPKn5H1s8pOxFCdugycbEQUbt5C7B8uYlNm5tTrusM9O1roXMnOD7B8u/btwMrVhp4cY4rqOWqduHrfR4PM1IGBW9xv/rXXjexaKmBDz5o/sKWShlIp133m6FD3Mx3Rv73uuIaEmEpCt4I4SagagpenREpeHX8KHiV/JJY3O9DKFm4JKbudd9uxH77hdtbf4pUEWTOTm6We4I8UUKndesKyI6YnFaX/23pk7DXQokQQeELeOHIxoxKO5EzwrgoeHUU5YVvtXytWG84Xy78GfC8ms8eZmHokOgOd+p60HJpCt6oyCajXgpenR0peHX8KHiV/JJW3C9CR3/JwMCBmUNrUfTVH7rMq18Ebr++cMSt+PpqMkpVo/D9+yrg5Xkm+vYGnn7WdWKWF5a6unAsSMEbDkd/Ld6Lidjo4O4WThtko0ePcF5QL29HcQAAIABJREFUwm8tBW8pmSbpWRS8OmtS8Or4UfAq+SWluD+SggjOr15k4pij0SRKQxR93bUL+MnP3RixkhQhyO5tMe2oJuErqXxfmJXxF92nA3DtNY1o36EYcs3LUPCGw9Ffi/dVZfAgCyPOq7ydXa8v3OENf2wkqUYKXp01KXh1/Ch4lfySUDxX6tLe9almYcmi6KtlAT/6aQ0kLvBNNzTmdVXQtiGX8BV3Crl27HSjPVhpoGcvG1/oZYW2K6ptd6HlX11k4qmnMyEqZLf8mqv0MXilHRS8hVoj//3i3nD3PSnHfeemG8KxU/6nhn8HBW/4TJNUIwWvzpoUvDp+FLxKfpVePDubk+yyymfVUi5cYaa9DWqPXK4UucqKGB42JJzDekHbFsZ9jzxu4u23TZgpYL+OwObNwJFHWKhJGfj0M2DkF4uPtEHBG4aFmtdx6y9SzgvXddc2OocwK/Eq5bxRiXyqvc0UvLoRQMGr40fBq+RXScXlQNinWwz0P87d0Vyy1MSUqe4uoIQbGz0qk7q0lAvXszPdGL+ljDsq/Zb+yyWfkWvbuaGhatvZ2LHTwPr1BhYvzRyTF+Fb38tNmRz3ywtvJbuFl19iYckbRpMYytL+w+ttXHZxcTuJFLzRjICHJ6ecrIfZUUuieVo0tZZy3oimB6w1SgIUvDq6FLw6fhS8Sn6VVPzHt9Zgd6ObNOLg7jbm7UkkkSubUykXrpXvmPi/h82Sfc6V8E8/+mkb53kiCFs6FCeuHrPnppoI37iHOcsWu9K3zz438OabBtq1s1HX2ca0GSknIcdVV1Dwxun3K+mfJQ20uJ9c8fVGtKmpvLhkpZw34mQ7tiUYAQreYJxauouCV8ePglfJr5KKi7B8+lk3qYR3tbSbVOqF6xe3pfDZZwZOPMFyPrdHfUmGtvbtga5d8+/YemGkssOc1fdErNwdcondXBzFX1oT45U7vOGPTvFl/+FP3MObcp05NI0zh+Yfm+G3RFdjqecNXWtZutQEKHh1xCl4dfwoeJX8Kqm4+Os+/GgmKcMZp9k49+zcu3ylXri8QzvC8xuXpXHYYfFc7MX3d9ESs0m2N3F3GNg/jV69gA0bDLz0VxMbPwGOO9bG4FOiF+/CLKjYDWO8UvCGQbFpHV5GPPmvMmZOOclCly7hPyfqGks9b0TdH9YfLgEKXh1PCl4dPwpeJb9KKS5hj2bNMZxDMfLJ9OijLAweZKNtm9w9KMfC5fnyVoIPY0vuDv7dcyH7ox80qnZTg4yvUopdaQ8FbxCrFHbPXb9LOQkowsyIV1gLwrm7HPNGOC1nLaUgQMGro0zBq+NHwavkF/fiEl/32ZkpLF7iujHk8tfN1YdyLFwNDcB999c4PsYTLy3Ov7TU9hDhu+QNE4uXys65+3Rh3Le3hV2NBnofEe1OdanFLgVv+CNs1hx5GTWdNNjXXJk5OBr+k6KtUX4Lj01pg/PPBQ47LNqENdH2hLVHRYCCV0eWglfHj4JXyS/OxcVN4ImpprNzJAe0xoyy0bdPsE/s5RC8wvL2O2vQrm14MWNLYZ+3lplYsNBw0sX26W3hq/8cjLG2beUQuxS8Wqs1Le93Zbj6yrQqq2C4LSusNunHpAdSzvmAs4YaGDaUgrcwgtVxNwWvzs4UvDp+FLxKfnEtLiG3ps3IuDB48XWDtrdcgld7oCpo/8K4b/t2QE7WN6xuepq+FOHVyiV2KXjDGDmZOm67wxWJQb+8hPv0cGrzi93uXYEbrjMiz9AYTstZS6kJUPDqiFPw6vglUvCuXSepVQ2893cTfY600auX7Xwmbyn8lBJhrIoX68KQ3YlyCd5YwczTmKeeMfHq6244tcGn2Ni5E3tDvUlRGXOSKrm+p+UcaAvr8uK15gurFtbzsuuhD284ZKfPNJ3xIq4M119bGS48uXr+hz+mnK8b0o9vX2Ggy/4UvOGMkOTVQsGrsykFr45f4gTvxo3Ab36bCe/jxyMCoV8foG8fG/36luazs9I8BRX3pwiWvl4w3MaA/sX1k4I3P/oPPzSw4h13d06SVsi1arWBZ6e7biTZlyeA5dBgv77F+fZ6WenKJXalTxS8+cdGvjtknEz6U8q57bpvN1ZsCmtp/zPPprDwVcNJHDPyfLMkKcnz8eXf40mAgldnFwpeHb/ECV6JZ/nIYya6d3OF7YYNwKoGE5JlzH+C/pabG5Xk9MVFoEoa3zCu5Sska1rxLgzZbaDg1VlFdtpXNRhoWGNg1SojpwAW4Vtfb6PXYe5OsCeaW3pyHMQuBa9uXEhpGRt33eO6MpTC/UXb4q2fAR33zV2LP9TheeekMeLsFAWvFniCy1Pw6oxLwavjlzjB2xqOBx5K4Z134xH6Z85cEy/MNnHyiRYuvKC4XVivr96nUfn37BTBxQ4PCt5iyeUu5wlg2fl1hHCW36+Uak0Ax0XsUvDqx8UTU93sfWLva66KtyvD87NMzH0pd9pv/0u2UPnSFy0MOMqkS4N+iDg1yMvEypUGzjjdgulmQa/4i4JXZ0IKXh2/qhK8/3O3iY8+bjp5b9osOy4GNm40cMzROuFZiCneXm7ikUfdWUxib4r/myyA7WtttKs1cvobi2jatNkVS5u3AJs3Gc7OtbRfrjDj11LwFmLN4u6VMGyrVpt5BfC6da7Ny+nG4O8hXRpatvf0mSmsXQe0rwW6dcu4rUhK556H2Y6v65SplePKILG7Z81x29vSJSmrd+7MuPCcfAJw2qmV7aZR3C86nFIidGfPMbFshcv0y19K4/iBxblAhdOi8Gqh4NWxpODV8asqwevP5iXiwROKHkIRnZdfnA7NzaAl04hwdRNB5H9tl3Z27gRH4Ga3N7t+af/wc9ykEtqLgldLsPDy+QTwIQfbuOob5d8RpOBt2ba33FqDxla8pbx5J8yX08JHWv4S4m4l4t0TXbnmS0lPPfxcy/Fhl7G7eKm7c+1d8iI/bIg7
n762yMQrrxpOfO18rjv5W5fMO7KFrvRy8CALw8/hDm8yLV54ryh4C2fWpMTajduVNVRW8RnPmfjr/IzQrOsMyA7Fjp0ZH1+ZqGXnd9SX0tg/xPSentCdv9D1tZVLXBBkJ8h5/iYD23cAW7YYzvNbugYOEBHs7ghL2dWrDbw4J5My+LprGyH90lwUvBp64ZQVEfHQo6m92fFOHWxjwHH6lxlt6yh4Wya4axewdq37JWbzloz487uxxDmxSvbLuBeBRHyNs69cIQR3b6/BjBeBV17P3C3zlHeIU1w45N95ZQjI7/zFOU3DG4rQPXNI5jBsUnhxh1dnSQpeHb+q2uH1UH38DwMd9216SMgJ5zWj6Q7FV0an0f9Y/eScS+jKoie7PPlCpYnw/XybgQO62JCYnSKUZZdEymdfS5Ya2LbdjRqgvSh4tQT15WXc3PoLN+JIHA5Zej2i4C3OtvJV59kZprNrN+I8/W+0uFa0XErmj2kzzL0v4/LiP2J4YTuy3ryx5sPdmD031eSwsLyojx5Z/i8UYXMrtj7xgZ63IOPP36YNcOIJyRS62XNHscyqvRwFr3IEVNsObz5c3qKkzWnvP6Tk39EVoXrW0HRRcVmnP5fCvPnurtFNN8hCpBfjLfGg4M03UqL/uwgQ8feM244gBW9xtvdSCMctMoPsME6bkdq7Cxv0ZTwXhVzzxpNPmXhtsftVLWjfN20yUFcX3fxWnAULK7VmjYE33jTw0cfunC1f7uTK/nrn7aIPPiV5O7rZxLjDW9gYyr6bglfHryp3eFtD5p2glkmoW1cEThwgAnfZcgPrP8odhko+410wvDih67V31y4bP/l5mz2CtzFSXzgKXuUPK4Ti3liMm78nBW9xxo2b4M3205UzABK7O2j68aCCV+7zNhLk/+eLJOMldKn0w1qevVsaLZ062Th1kCStid9uf3EjPH8pCt78jFq7g4JXx4+C18fPn9c+F1bPL3b7juYH3nLd74WZOuxQG0f3C2e34uZbSvOJm4JX+cMKobiXdjZuiQkoeIszblwEbyF+uoX2tLV5Q5JtPDzZdZlo6YCwfw6O27gvhIW4K7w4242/LV8LB/ZPo9OecxXa8xWFtCNu91Lw6ixCwavjR8Hr4+fPaz9siOWEi2otbqofvcRJ7NN7TwrjruGmkvWe4/l0yu6zuDREeVHwRkk3f91eRJE4pp2l4M1vv1x3eIJXdjglA6KEIMy+JCRhtqtStkASwbh2nSQrcXI87L28T+Xy95073Lq375ADsu4t3iE6v4tVMX66rfU+37whgnbSA27SDRnbkpLba58c3PWStGhdyoqzkL6UG30ncyhZbDl+nIX6HGcu9E+rvBooeHU2o+DV8aPg3cPvlddMPD2t9bz2n24F0mk3xmY5Qut4IuiQ7jauuoKCVzn0Y13cE0dxXPgpeAsbOhLHdlUDsH5D/tCChdVc/N0aP12N4JWyInAfmtw0KkGuOuPmytNav+X36n+R8MR8NbkrBBmNFLxBKLV8DwWvjh8F7x5+a9cZ+N97U87uStS7p8WazPODkzBqR/WVRBOFnaAu5Ln5dmoKqYv3Fk7Ay6w2ZlQaA/qH4w5TeCtyl6DgLYzkD39SA0l5Lpe4OXXvDshOZ65LxKA/iYPc01qIQn8d3k6wzA/eC7nzct7evUtCGaYbgYMPAY7qG43faNB5Y/0GE3f9zt1gkJc6uaTNte1sNKw298bzLSQ2uvTNSAGmb9e7MEsVd/cdv63BPza6ZeVFQvxyNX7QxbWiMkpR8OrsRMGr40fB6+N36y/csF9hxLFVmqVZcckE95vfNs14JIvBpV9NY//9w34aEHThCv/JrFEIeL7aN90Q7eHEYmhT8AajJqL2iSczO5kSLvD8kMKRLV5iOHXLDqLshMblKmTekHjoRxxuo2uOuLwSocQfWzxfdIfnX0xh7stuhsqrr4z265eftdhAbCEi+7KLG4uKvhMX25WiHRS8OsoUvDp+FLw+ft7kFcdPadu2GXjgQRPHHevG7vXCCJ022HKyHYV9FbJwhf3saq9Pon08/Gj8wpF5dqHgzT9C5dDSlKmuL6d8NZowVhehJfuJcTkAl92uMOeN7Njo8oI/fmzz2OUSVu2++93DvIcfbuOyr0UveOVlRn6jcsZD7Dvunyx84fB4fYnJP0pLfwcFr445Ba+OHwWvj5/nMiCHSv55bLrJgRAl5kiKr1hpor7eQls3UlmoV5gLV6gNq4LKvHBk+Xa1yoWCgrd18tNnSkIBN+5svhBcxdqwGgSvx0bOLjz8aCaTpLgN5MpGecrJFr54fvgv/9k2+uwz4De/rcHOnXBcMnKJ8GLtmvRyFLw6C1Pw6vhR8Pr4LVps4i9PZdIOnzkkjTOHVedbOwWv8odVZPHlK9xsV3KK3cum5wSsN4AR56XRt0/5xyMFb27j+nf95I4ovxS9PM/AzOdTscvaFuW80VJcW9lh9UKdXXNldOcaPKv/7S0Dj/055YjdUjyvyKkklsUoeHVmoeDV8aPg3cNvyVL5BOmKXUnxmErZGHWhhaOPKr/AUJq4qOJRLlxFNSjBheST7LKVpuMLKAt3S5f4f4aRNlqLkoK3OUG/C0Mpdv0WvGJi2nQzdln4op435EVw7Xrx1bXgD9d21+9c9wLZUR821MLmzW4YNnGLkHBnmzcZzuG/zVuAXbsMJ2V8fS+ryYFQuXf7nhBubWoM7Luv7ZRZv97YW5fUKRE3pB1xTRGt/X1HWZ6CV0eXglfHj4IXgF/sxvUzstLMBRePeuEquEEJKJC2gPfXAOs2SPB9dyFdtbppqCrZ1XX+tsFwFnRxrZEdrPbtDNS2j8fLVxDBa6UBs+kZywRYsHkXHD/Tme7BJbmicmHIfnIpY3IXYrhyzRuyu373ve6h41JdcUv5Xap+a55DwauhB1Dw6vhVveD1fyaj2M0MpnItXMrhHOvi/317DbZubd5EEbQDB9jo19tNWCJiRpKgyOJdiWHJlrxhYMpfUpF+0o+Dof2+pWLDM4eWNk2sF7pu4iXxiQ5QznnD2+Xt1Ak4uJvthGOTUGwS7kz+t3NnoK6zje3bgeUrTby+yMRHH2dGkvNiWesK5q2fAY2Ncr8bSq5znY3adm49Uod3SC6OUVTi8NtoqQ0UvDrrUPDq+CVW8L69zMCGjzJv+/JZy3v737Hd/XTl7aQJwij97ZQmKkvxci5cZelwCR7q/+zarZvtLJ4HHwx0Paj5zq18In9osuns7l59RRp1dSVoYMBH5NvhjeuBqoDdC3Sbd8BVbhZBNHpU8+gBgSpS3PTsTBPz9xyOCzPkmaJJZQtnKGHMpkx1PymcOTTtvHxEeXkvG1w3CqNMwVsYr+y7KXh1/BIpeL0YlYWgkeDnXx4Z/8gMhfRJcy8Fr4Ze7rLzFxh4YXYK117TiP32y1//w5NTWLbC9UscPy76UEv5W+Te0Zrg3d0IPPq4CYkgUgrhEbTNYd0nL8kSRUPsIpcITUlDXo7Mi41p4KmnU3uTNMiZgxOOjz5KQWssyzVveGnhxTf3zGFpdIn4BdET2P7EGdl
cDjrAxtFHu8Jb/PRnv5RyEn98/fL4/JbD+l0ErYeCNyip3PdR8Or4JU7w+sWuiFiZkLxLdtTkksXJy2P/5tsmJK2wXOeebeGM08q7YCjNGVrxci1coXUgARXF1bWhNcH7y1+n8OmnTf0oZQfU+TTczUbHfYETT6ic39j2bUCqDZzQf9kuDGNGxSOjlvivLn4jhSGnp1FTZr/pcswbL8xKYc5LhjPXX39tacTke+8Bf3rQjf3b2uWtP3LITa6DDgT+5erGfMUS+3cKXp1pKXh1/BIneKfPTGHeAgOF+ON6nycPPdTGFVX89u0fSuVYuJRDOZHF4+ja0Jrg/fHParB7N3DQQTY+/bTpgTzPQCKAx4+Nl5tGS4PH74bi7epWUvtL/aMox7zxq1/XYMunQKl9mSVSRk0NcvrlC/fFSw0nmoO7yVJ6H+9S2z7I8yh4g1Bq+R4KXh2/xAlewSGhZPwha4Ig8lK53nJz9b59U/AGGSmlvydurg0tCV7vxTF7p012RiVMlESe8IuAuPidtmRR+Qz9yOM12LYtc4cIl29eEf0n89KPsnCeWErBu6rBwPKVxl4/5ssuTuPw+mh9dwulVA3+7IUwoeAthFbzeyl4dfwSKXiLQULB25RaKReuYuxVTWXi5tqQS/DKZ/Xb73Q/8U4YZ6Fvn9xuC7nSxV4wPB6uAdJ2ad/ipSYkvXPD6oxrhuzqSgxX7+BrnyMtfLrVgMRGru8VL5FVzt9GqeaN55438dK8TJIg6fO4i9I4ul+8bLFgoekkkpEoLKNHlsbdopz2z/dsCt58hFr/OwWvjh8F7x5+t99R4+wMX3dtY8G7w0oTxLJ4qRauWHY+ho2Kk2tDLsHr7UKL3/zoUfkXdhHIkx5I7f3kKwfzxMe3vqcbmq3UV67kH164uAHH2ejezYa0efbczCExaeN551g4/dTK8UmOmmsp5g1/KElJ/iDh/Oq6GOi0X7zErrCWrxt33+NmZRszMl2WsR21zQupn4K3EFrc4dXRylF67cbtoddZiRV6C7bkRe/XlwtYKRauShwn5WxzXFwbsgWv/8T65RcX5pvrD+/lZys7qt27wcmG1bUrHMEZxSXPz97NlYQC/fraGNg/d/QFfwrhuLtlRMGstTqjnjeenWFCbCbXmFGSKS3+c7X39dDjJuNLvgrIGJf/LUeEj1KPC+95FLw68tzh1fHjDu8efl5cxa+NT6P3kdEsrkpTlbR41AtXSTuTkIfFxbXBL3j9O7XFJsnwBHOPHjYad7u+vrmusISC7LpJcgx/Kufs5B/5hoy3cyf3FXJANl+9lf73KOcNf6zdShG7Ys+33jaw8h0T69bnHtuy+9u9qxvF5JijLRx4QKWPgpbbT8Grsy0Fb0B+Wz/bhsZ0GnWdOjYpwR1eF4d3Gls+p36hl4VTTi5PbM2A5oz8tigXrsgbn+AHxMG1wS94JSatHETTxgr+ZBOaxE4VF4NVq00n/fK6Ddjr+uA3rQiF+p6uUOjeNb8rhLCTCC5+39x8u7mtDaV58w1Mf86NA1aqlMJxH9pRzhterN1KTvbgphR3x7YcuvOPRbFthw42vvfd/C5BcR8HLbWPgldnOQrePPy2bd+BG3/yO7z418XOnccd9QXc+ZNrcUCXTs6/U/DC8c3zDtx4OA/YH/jq+Ebs30U3QCu1dJQLV6UyiUu7y+3a4C1aL87b6WSDk+u6bzdGmg1OfqOy8yv/5BIKnm1EwHbvLgLYxsp3DbRpYzgZ7eYvNPYeOCt0N7c1u69abeCJqaYjyI852sbYryRXrAQZ/1HNG/Pmm5j+nFnSWLtB+hvGPf4oJoccbCf6CyMFr27ERCZ4/zh5Onod2g2nn3IsalJljuatYPT7h57BY0/NxgN3fh/ta9vi6u/djvrDuuPHN0yk4N3D1S8gxCdsyVITK94xcMP1aeeNuxqvqBauamQZdp/9rg3iQ9q2HTDsDAtm00PrYT92b32yaG3bbuPmn+92hF65dtxEKMgOmbML3MLnYj8E8ZmU0/It+eYWC0zE+IznTUiWL/H9reZLO2/s2G5g0xbZ0Qc2b3FfcJYtz0THkBeaCePSVeX3mqTxRMGrs2ZkgvdHt/0Jjz45C10PrMOlY8/Hl4efjk777aNrbRlK/9MV/4nhw07CFV+90Hn6jNmv4Pof3oU3Z02CYRhVvcMrC6bszng+g9+8ohEHdy+DkWL4SO3CFcMuJaZJ2eNWOlaKtLJbdm7GD1++ETNXPYXj/vFLHLnrEuzXKY3vfic+Is/vCuElihB3g8Gn8IR8KX4Amnlj1pwUZs3J7b8tu/JeSDhxZYlTKLtScE3KMyh4dZaMTPBKs/627O94ZOqL+Mv0l51Wjh15Jv551Fno84VDda0uYemTRnwTP7nx647olevtlQ246MofYt5Tv0WnjvtUreBtaDBw3/3uzr1MoBKdIaqT4CU0d2iP0ixcoTWCFTUjMPnPJt56q/lWrufPOmxIYVESCkE88ZmxmLHqaXRp7I/Rn77qFN16/L/j9gt/Vkg1JbnXtoGf/XfKEUkMNVgS5M5Dipk3sl/gJGlQp04SxQCoq3NjNMt/80fHcF/y0jjh+Pi8bJWOcuU+iYJXZ7tIBa/XtE82b8XU6S/jgT/PxIaPN+GkAX1x8VfOw9BT+8fa3cG2bRxz5uW462fXYejg/k533mv4ECMv+z6en/wrdO+6v45+BZf+60ILkx5K47RTTIwbbaJD+9w7CxXcRTY9gQSu+/5upNPAOcNSGHCMgVcWWXh1sYWNn2Q62+cIwxnXp54cno+DuDCcfMt3sK/VE713XoK2dmcsav9jrKq7E5tu3BQ70k/PSOMv09yQVeKHP2pEKlQesetwhTZoyd9s/M/v3eyWYqeJE1Loc2Tr43bSg2n89RULI0ekMPL88MZ4hSJks6uIQEkE75ZPP8eTM/+KSZOfdQRvh/a1kMNgXTp3xDcvGYWvjjkntshlh/en3/sGzht6otPG7B3e2Da8BA3btRto26YED+IjSCAkApIcZb+OQPaxgjUf2HhhjuUIAe8SAdH3CBNfOt/EAfsX9kInAnfxGzbeX2tj3kIL27LCdW9JvYvHOx2FTu06YfP3NofUu/Cq+XSrjVkv25j3Snrvy8Chhxj4zxvcbHC84kFg9ssWHnw8jS+dn8I5Q41AGw+PTEnj+TkUvPGwIFtRSgKRCt43V6zC5KmzMGXaXKdPZ502EBNGn4NTjj8KK95bgwcen4kFi97Gi4/dXso+F/Qs8eE9/8yT8Y0JX3TK0Ye3IHxVe3MxnyarFlaMOi6ffVevMfDiHDdygHfJYZ/jB0ig/tyfgOUgnBP9YI2BVavcw0L+S3woN9W+iqW7JmOXsQXvtn0IttGIi/p+Db8+554YEWjaFHFteO11E09Nc3cCb7nZ3U3kFQ2BYuYNy0JBBy5fmG1izlw3YsM1V/IAWzSWjKZWujTouEYmeL1Da7KbKzu4F31pGA7p1jwi9Jatnzu+sHG97n3waTz+9BwnSkOH9u3wzRtvY5SGuBorRu0qZuGKUfPZFABygGvx0qapcP2+vl
u2AMtWmli3rnk8UBG44kMpUQecUF/dbMihteuevxIL1r6ELTu3OGL3R2f8Ap3adY41b+Fw3/01Tmara66q7rBhURuqVPOGFzddkpVcOZE2jdquYdVPwasjGZngvfv+qejR7UCcO/RE1LZrq2tlGUt/vm0HvnvL3Zi7YKnTimP61OPOn34HBx3gLlKMw1tG48T40aVauGKMIDFNa2nXN7uDXiaz+p6tJ3HITi0cd1CPTknhzTcNDB5kYcR58U9FG3eerbWvVPOGjOkHH0lhd6PEgKbgrZQxQ8Grs1RkglfXrPiVlp3o3bsb9yac8FpIwRs/W8WhRaVauOLQ12pqg7fb6e/z8HMtnDAweGbBShO8Tz5jOm4N48dZ6NeHgjfK8V7KeaOxETANwKzcMPlRmiKWdVPw6sxCwavjxx1eJb+kFi/lwpVUhnHs16w5JuQfcW2QrE4rV5r41283omPTjOOtNr3SBG8c7ZDUNnHeSKplw+kXBa+OIwWvjh8Fr5JfUotz4UqeZf0ptK++Ml103GkK3uSNjbB6xHkjLJLJrIeCV2dXCl4dPwpeJb+kFufClTzL3venlJOKd2B/G6NHFe/3SMGbvLERVo84b4RFMpn1UPDq7ErBq+NHwavkl9TiXLiSZdklSw1MmZoKJZQTBW+yxkaYveG8ESbN5NVFwauzKQWvjh8Fr5JfUotz4UqOZcWVYdIDKScu75hR6RZj8QbtMQVvUFLVdx/njeqzeSE9puAthFbzeyl4dfwoeJX8klqcC1dyLPvEVDcWb78+NsaPK96VwSNCwZucsRF2TzhvhE00WfVR8OrsScGr40fBq+SX1OJcuJJh2eUrTDw02c0ydt23G1HJBe63AAAgAElEQVRXp+8XBa+eYVJr4LyRVMuG0y8KXh1HCl4dPwpeJb+kFufClQzL3naH68owYriFwaeEE4OWgjcZYyOKXnDeiIJqcuqk4NXZkoJXx4+CV8kvqcW5cFW+Zf0xd6+/Vu/KQJeGyh8TUfeA80bUhCu7fgpenf0oeHX8KHiV/JJanAtXZVvWH3M3LFcGCt7KHhOlaD3njVJQrtxnUPDqbEfBq+NHwavkl9TiXLgq27J3/S6F9Rv0MXdzUaBLQ2WPjShbz3kjSrqVXzcFr86GFLw6fhS8Sn5JLc6Fq3Itu3iJgSeeDCfmLgVv5Y6DcrSc80Y5qFfOMyl4dbai4NXxo+BV8ktqcS5clWnZx58w8cbf3KgMYcTcpeCtzHFQrlZz3igX+cp4LgWvzk4UvDp+FLxKfkktzoWrMi37l6dSWLTYQLeuNq65KryDan4adGmozLFRilZz3igF5cp9BgWvznYUvDp+FLxKfkktzoWrMi27bRvw6/9JYceOcEORUfBW5ngodas5b5SaeGU9j4JXZy8KXh0/Cl4lv6QW58JVuZb1kk3U1tq4+op0KMkmKHgrdzyUsuWcN0pJu/KeRcGrsxkFr44fBa+SX1KLc+GqbMs+PDmFZSvCSydMwVvZ46FUree8USrSlfkcCl6d3Sh4dfwoeJX8klqcC1dlW3bHDkCyrIlrw4RxFvr2CSfLmlChD29lj40oW895I0q6lV83Ba/OhhS8On4UvEp+SS3OhavyLet3bZBMa7W14fSJgjccjkmshfNGEq0aXp8oeHUsKXh1/Ch4lfySWpwLVzIse9+fUmhYHa5rAwVvMsZGFL3gvBEF1eTUScGrsyUFr44fBa+SX1KLc+FKhmUlxfDd94br2kDBm4yxEUUvOG9EQTU5dVLw6mxJwavjR8Gr5JfU4ly4kmPZ+QtNPDvDROfONq65Uu/aQMGbnLERdk84b4RNNFn1UfDq7EnBq+NHwavkl9Ti1bpwNTQAf11gYv/9DZx/bjSJG8oxZjzXhlMHWTj/vMwBtpXvGnjzTRM7dwLbdwDHHmPjpBNaP+BGwVsOC1bGM6t13qgM65S/lRS8OhtQ8Or4UfAq+SW1eLUtXLYN3P9gDd77u2vRAw8Avn1NY2LMK64Nt99Z4/SnXx/bEbdyrd8AJ5KDdx1ysI2rvtG60KfgTcywCL0j1TZvhA4w4RVS8OoMTMGr40fBq+SX1OLVtnBt22bg579MOebs1dPGhHH6T/9xGxvPzjQxf4HZrFntOwBfvjCN9rU2evQAalxd3OJFwRs3y8anPdU2b8SHfGW0hIJXZycKXh0/Cl4lv6QWr8aFa/ESA0886Yre0SPTGDjATpR5/W4NfXtn3BYOOQRo0yZ4Vyl4g7Oqtjurcd6oNhtr+kvBq6EHUPDq+FHwKvkltXi1LlzeAa9UCrj2msbQ0/KWa7z4XRpuuVnnqkHBWy4rxv+51TpvxN8y8WghBa/ODhS8On4UvEp+SS1ezQuXl5ZXXBsmXpqMg2tLlhqYMjXl+O+OH6frEwVvUn/1+n5V87yhp5f8Gih4dTam4NXxo+BV8ktq8TgsXK8vMvHiHBNfG59G926lcy+QtLx33ZPC5s0GzhxqYcBxlrPTO3++iXdXAf98kVWQC0Acxogn4seMSmNAfx1LCt44WDSebYjDvBFPMmyVEKDg1Y0DCl4dPwpeJb+kFg+6cG3dauDPf0nh1EFp9D5SJ6SyWT4/K4W5LxkYPMjCCF8orVIwX9UATLo/c3prYH8bi5e60QxuurERte1K0YrwnnHzLW5frvu23k2Dgjc8uyStpqDzRtL6zf4EI0DBG4xTS3dR8Or4UfAq+SW1eNCFyzvoJQe85KBXmJfExL3v/hrU1tq4/trSR01YvsLEvAWGk5rXu446ysY//1O4/QyTWa66PHeGsFw0KHijtljl1h903qjcHrLlGgIUvBp6PLSmowdQ8KoJJrOCoAuXJ0rDElPZNL3IAmcNTWPY0HB3kINaTg58SfQGEb7durp+vbW1QUuX9z6JL/znJ0y88aaJEcMtDD6l9aQSQVpLwRuEUnXeE3TeqE467DUFr24McIdXx4+CV8kvqcWDLlybNgO33+F+Lped2G5dgfa1QOc6e+9nfxGJnTu7f/cu2wK6dMlPz9tB7n+cha98WS/W8j8x9x1+v97D621cdnHhu7wLFprYsRM443QLqebhcIttWrNy0tZNm92d6beXGVi9xt2hDsOdQeqh4A3NVImrKOi8kbiOs0OBCFDwBsLU4k0UvDp+FLxKfkktXsjCde+kGrz/fnEkRAzXdQa67TmUJv8uCRA6dYbz32fNMZ1/ZHdXdnkLuaw0YLphdUO53nrbwOTHU84BNhGPhVx+n2DZZZXdVu0lLxvr1xtYv8FwhPSqVQY2b2maOc3/jJtuaAxlZ5qCV2u55JYvZN5ILgX2rCUCFLy6sUHBq+NHwavkl9TixSxcsrP40UfAxk9EgBmQfxchtm6dgU8+MWDbNmpqMv6wItiCXnJobGD/NLbvcOv1rs1bMvXJ/1+3Dk1En0RZkH/CuDTuG+m07Rzue/Mtd2vXEcyZpjdrnoj9Dz5wE2GIQE6lbEj/PIHbGjvZSe/cCaivd3fZly13RTFdGsIYBayjNQLFzBskWj0EKHh1tqbg1fGj4FXyS2rxUi1c69Yb+HCtga1bX
XG8eZOB7TuA9Rta3qkslHnnzrI7bKnDcTWsMXDfH90t4wnjLPTt07KQFt/Z7dsNmKZEdzAdES7uGTt2tKJyC+yYszMu7iJ1rsAVP+q6znazXVwvmUZYftbc4S3QUFV0e6nmjSpCmqiuUvDqzEnBq+NHwavkl9TicVm4ZDd3VYPsUJp4/wOgTQ1Q2z5DvXMnG++8Z2DtWgOjLkyjWzdxhXBFn+zITnnSjacrlwjfyy9Oq7KnTZ8pkRtM1KSAHj1aPkS38RNAQrZlX7L72qaN4ZRv7fLv4B6wP3DooSJq3X+kj4XEJRaGt92RcsR2GG4NFLxJ/dXr+xWXeUPfE9YQBQEKXh1VCl4dPwpeJb+kFq+khcvas9FqtnAQTMJySQILT/j2PsLGIYe4YtVziRB/3y5dXKHcvavliMrsSAxyAOwvT5kF7dKKwK3vCfTqZTs7sIUIVenXm28bOO4YfXSKZ2eamL/AdNw7tC4eFLxJ/dXr+1VJ84a+t6yhUAIUvIUSa3o/Ba+OHwWvkl9Siydx4fJ2Zwux2SHdLRx3HJyd1UceS0FcFcSNYMjpNvbdp2WXhk82GZjzcgpfPN9C7yPD8SEupN3Z9/pjGo/8ooVjji5eRFPwaiyR7LJJnDeSbbHS9o6CV8ebglfHj4JXyS+pxZO6cInP8PIVGVcDEbJy7dzl+tzKobDWfIhlx/amGwqLFhGXMXLrL1y3BrmGn2vhtMHFCXEK3rhYNH7tSOq8ET/SldkiCl6d3Sh4dfwoeJX8klqcC5drWXGHWLvOwJYtrhDu29vGqUUKxXKPlWkzUliw0BW84kM88sJ0UQf5KHjLbcn4Pp/zRnxtE4eWUfDqrEDBq+NHwavkl9TiXLiSadlsf2Y5yHfBcLvViBPZJCh4kzk2wugV540wKCa3DgpenW0peHX8KHiV/JJanAtXUi2b2bn2H+STOMfDhgSLYEHBm+yxoekd5w0NveSXpeDV2ZiCV8ePglfJL6nFuXAl1bJN+yUxemfNycQHDiJ8KXirY2wU00vOG8VQq54yFLw6W1Pw6vhR8Cr5JbU4F66kWrZ5vzZtAmbPTWHx0sxhPhG+I4anc6YipuCtnrFRaE85bxRKrLrup+DV2ZuCV8ePglfJL6nFuXAl1bIt9ytb+LaUoY6CN55jw7JsNKw2cXh98SHntD3jvKElmOzyFLw6+1Lw6vhR8Cr5JbU4F66kWjZ/v0T4PvFkCg2rMxnq/KmZKXjzMyzHHS/OSWH2HAMDB9gYPbI8ofM4b5TD8pXzTApena0oeHX8KHiV/JJanAtXUi0bvF8tRXQYNLAtPt0KWKkdwSvjnZET+PgfwL33ubGWw8ioV0yDOW8UQ616ylDw6mxNwavjR8Gr5JfU4ly4kmrZwvuVLXy9GiZemnbSJfOKD4FVqw1M+lPKaVCQA4hht5zzRthEk1UfBa/OnhS8On4UvEp+SS3OhSupli2+X/5MbV4tIqrqe1n/v70zgZaiuN741z1PNiUsikA0Am4sGhAVARdwBdxQNNFgXImAYtS/JlGjcV8SNYoRxYUIioqiRgIuiAl7VHABERVQ9AGigCiLiGxvZv7ndtPv9XvMvOme6m26vz7HY+Lr6u763Zqqr6pu3VtUAoviv4QlayMw90PNcEmx2+jIIzJotpv/kxP2G2ybtRGg4FVrHxS8avwoeBX5xbU4B664Wra4ej03NoUFizTIQbYzTi7D3I+ymDu/KjWxpFxu31ZWFdNo3bq4d7CUdwRWrdLw9iy9MvKGpgGXDEyjZQt/RS/7De9sKL70U6encOQRaezezLvnhvkkCl41+hS8avwoeBX5xbU4B664WtZ9vaZOl1i9OkTUXjowjQP2rW885JPFm7B0mQZ7Agv57yKKZeX3oI4ZNGni/n01S3w4T8eEV3Wc/esM2u5fJbLVnxz/J4hoeuPNqsnKRec5Sy5SLBn2G8WSq15uxUoNo0brhj/2sUdncHSPeLR7Cl619kHBq8aPgleRX1yLc+CKq2Xd1UsSU0ycpBuFLrogjTatssgVpUGE1YcfmSuK69ZVxfNt0TyLw7tl0GqvbFHid+EiHWPGmu8//dQ0Du7s7wqlOzqlc/fwx1JYuUqD2ENWenUTqecX+w11pDLBe3m8aaAwI26o12THJ1DwqlGl4FXjR8GryC+uxTlwxdWyzuslq0yPPG76gp7YO4PuXc1VpkJhyZYsAebOk1VFGCtU1tW+bRbt2zn395WwaCO3H8AKK+qAc1rRvnPzZmD44yljMiIh5o7u6c+KIfsNtXbw/hwNE141f3MyUezTyx87qX1l8aUpeItnJyUpeNX4UfAq8otrcQ5ccbWss3rJiu2op02BVFNsFhK89jdIhIfyJVW+pNbfxOWhkL/v6GdTWPxFeCG2nJEqnbus1fru3TI40SchxX6j+PawYQNw79CyHSaYxT8xeiUpeNVsQsGrxo+CV5FfXItz4IqrZQvXy74aKKuy/c+unsTAjeC13iYCWvx953yoVya0kL+Jv2/7dll067Kjv++DD6fw3fcaLh3k/2GrwlRK/w5L8B7WJYNTTvRn5ZD9RlU7kcnaPntnIQcGnVyWr7xMBvudFk7iECffqXIPBa8KPa7wqtEDKHiVCcbzARy44mlXJ7WyR2QYMiiNevWqlypG8NqfkM/f10plbPn73nJHGTIZ4Ppr5Bvou+vEdrXdc/+D5or9GaelfQsjx37DtMA7s3RMfFPHEd0z6H2Cs8nFuPEpwwfe7j6kavOolafgVbMIV3jV+FHwKvKLa3EOXHG1bO31euNN3QhnJeIz34l+VcFr/wLL37d8KaoddpOEFitXmT7A119bgXp1k2kPr2r93XfAg8PLjImDHFpr6kH0jFzfxn7DpGL5vwvvq68wJ42yc7Jpe3JCadfio75uPbBihVbZ1qWs/PakvW/ZYi4Nr10HpFLApYMqSj48GQWv2i+agleNX6IE7xdfapgyTcevz0yjcSNFcDEvzoEr5gauUb1tFcDEN3S8P8c8HV6bG4GXgtf+Gfn8fUU0tGgOtGkt/zb/8SLcWZIsbK3aW3Xu0M5cebRzlMnHgs90zJ6tI7N9QV3Y//7SNH7W0Bkt9hsmJxG3sqJuP7RZiKC1i5GrjETVGHxx6bv2UPAWagW1/52CV41fogTvlOkpTJuuwc9DG4rmiExxDlyRMYXvHyKDs2ynSmIJuc44TSIp5N+G9UvwWhW1xyDVU0AmhzujHyJ4wis6vlujGbF+j+geHxcKmUi8PD5lrO7KxEFWFu2XfTW9ZmMTofXHqyqwy87OmmHc+435H2t48WUzikJZChj4u+oiVH5L4iv9zmxtB7Er/OvX07BlaxYNGwI/bwk0bmRO4GQyZ7kOicvP+vVAo8Ym8ybb/+3MAtG+i4JXzT4UvGr8EiV4ZQVj5GhzW0/8AnnlJxD3gYu2NwlInNvXJ1XFzj3u6Ax6Fghy76fgtUeH2LkBcMVlFcZqmcSQlX/Kl1Tf/q1pRxFvIh6kbKNGWdSt40y4rl1XlY53p52AG/9cEYsmIjwfGWGuNFq+u/Lfps1IYd58
zfCRti7pF7t3lX8yO/htO4UR935j2Vcaxr6kY8OGqkmD5XMrE4vXJ5nJIuSSA58S/s3v7HZObROF+yh41axAwavGL1GCV1DdfHsZslkYJ8+lQ+KVm0DcB66k292egUtYiFCUgdvJ4Oyn4B0xKoWvvjITJAy4YMcDc5bd5PstEbxypQbxAXazfVyb/dvtLyz8zUgWRPuTCf7rk8yEE/lO/stkQly9GuwMI6mI6pWkfsPyd8816Tq2J1Ns52pLFLxqvzAKXjV+iRK8MkgOHWbGORz8uzT22EO9g1fEH9niSRq4ImsEnz5Mwh9ZW66yqndMT3NVz+nlp+Cd+ZaOr78GTu/rfpXRWAleCfxnqimaLSHvtF5ygMieJa5m1Ainz8l3n7FNHcDZAVm1f3uWeShKLnmnTB6aNPa/v0tav2G5iwhnaS8n9c6iXVvnvyXVNlVq5Sl41SxGwavGL1GC1wr7Euc4h4rNobJ40gYur7hF+TniGztuvG6s+MklOxwS77Nm2LFCdfBT8BZ6t5O/y4rlU8+kjFVrEXpuLmu7v2bUCOkzRMhIzOBirlnv6nj9DR0nHJfBUUf4I4jGTUjhy3Lx/zTtW8xkppi62cskrd+Y9B8db71jRjWRaAy8aidAwavWQih41fjFVvCmK4AZb+nGqobE9axfvyq15lWXV/CUd4F2k7SBS/FnFFhxEWRuIxTIyue0GWa4MWsl6oy+xW+5Rl3wLlio4bkXUoagr5k0w42hZPXOTaKM2p5tJRU47NAMTjnJH8F7023m7pWIL8sX1019vbg3Sf2G/K7uua8MFenao5p4wTUuz6DgVbMkBa8av9gK3iVLNIwcbZ6mtV/iGzhkMGfihZpNkgauQiyi8vevv9bw2BMpw7+1/1mFfUxlQF64qPpBmpppgoupW9QFryUuvair8MmXKENWkA8+SCJa1L7qW75Uw7PPlWHr1iz8PBBnCd7bbgrvwF2S+o25H5oHHVUnVsX8Bku1DAWvmuUoeNX4xVbwCpaPP9Ew/2O98kBLo58BZ56RRuu9ituWVERdUsWTNHCVimG2bgMeesTMlpUrMYS4LIjfprgsLFhY/RCXm0NphXhEXfBKhivJdOVHxiorUYZkxLJf4vLQudOOq+Yy6XjokTL8sMF0IenWNWNEkfDjuuseMxrD9ddUuHZT8ep7ktRvDH/MPBBohRsThhXpLPbdB2i3f/HuL17ZIorPoeBVswoFrxq/WAteCw3DkblvJEkauNzTCa+ErDbKdr0MtI0bA507ZZAvSoEMxOKfKwLZy1WoqAvekU+lDOE/4PwKtG7tn63yuTyI+D2oo5nUwTo3UL9+Fn/+k787S0MfLDOycl15WQV23dW/etf25KT0GzKRGf64Ofms7erQPmuI30K7AOFYK/i3UvCqMafgVeOXCMFrz3rj9yCoaI7IFE/KwBUZ4C4/RLI41RxsReB2PihrZBGUFV0JMWa1/XRa8yy2bNQFr7XyVlu2OJe4a729NpcHK1KCJHC45S/+uhqMfiaFxV9qvqxsO+WVlH7DamPGTsv5VROZ9euA8qW6ES/anuBD7mvcWMNPG4E2bTJGVBS/0js7tVUY91HwqlGn4FXjF3vBu3EjcO/QssoA61aooY6/zEIGIV65CSRl4CpF+9vjf+62K3DCcelqmZpq1knCbaVSQMNdvKlt1AVvmL6scmBOwoLVdHk49JAM+p7sz2E1y6qS4WviJL0yprJMfGRlOcgrCf2G9fvL5VZkZ2350E+ZrudcCZadgKN7FPbFD9J+fr+LgleNMAWvGr/YC94tW4Bhw1MQ/8fddgOWLze3oCRyw+8u9HeLUdE0oRZPwsAVKuAiXm5PASyruWecFk7MzygLXmF01z3hZlN8730dr7xuzqbFTlZCDGuy7df29icLNIx9seqgrpxZ+MP/+buqXLMZx73fsE823ewgiPvLDz8Ce+2Rxdx5qWoTosO7ZdC1i+kCE/eLglfNwhS8avxiL3hr4qE/r7MGE/eByxmF6NxV3XdXojQ4y4rmRw2iLHit33cxMXi9YiUretOm6/jlAVn8+sw0ROzYV/m8Fr65fIlFaB/RLVswTbRXdbaeE8d+Q9rUgs90SFQGa/JyztkZpQQTVrxnaydA2kSPI7KQnYA4XxS8atal4FXjlzjBK7iicJpZ0Wy+FJ8+Q0edukDL5hn8Z3IZMllg8MXBrhD5UrESf6hEX3juBXNb1GlIMj+rHFXBu2adhqlTdcybr0HTgJv+XIGUGZo20GvDBmD2+zo6HpjB7s2qXu2l8JWVbHFhsDLmWavJVvxdt8lEvAIUF8ErfOfO0yEuKnZf3NatAImlfOAB3gjTmsL3koFp/LxlsG4oXtneyXMoeJ1Qyn8PBa8av0QK3ufGprBgkWbEMi02c5Ii9sgV37IVuPNv1dXBnj8HBlHwhmor8Qd9eby5slRsZjSvKxBVwWvfbv55iywGXFiBOnVqP0XvNRsnz1MRvrLa+M5ss/+yLlnNln5MInaEJXStbyllwfvvV1L45mugzd7Zaqu51mHQ9vtnfIv6IcJ69WoNPY7yRkg7aYdh3EPBq0adgleNXyIFrxWns3u3DE7sVb2DyWZhrA4l8Vr0uY5PP9Uq/cv23Rs4/1yu8AbdFnKtLomfX58abTXo77LeF1XB+/0aDRNeNU/Ie5V0wk/GToSvtAXjxP8yrZoIk++SCVD3rsVnzPOjbqUseB8bkcLXK6I5kfDDVmE8k4JXjToFrxq/xAheiU8p2dfWrZeg/GZw/nzX/vtlcG7/eM+0a2s2ss325DMprF2rGWGuOndMY8Uqc3vvxx81XHbJNqRSCZ0VKP7e8hW3kkbU3EKVyZckUOh2WHTaY1QFr7D1OsuaT+au9thcwldO8IvQXbmqegKRxo2yRh8mq46XDozeCf9SFbxig5fHmwf+ZCHkoI5mWD9e3hKg4FXjScGrxi+2gtcSEAs/01Be7l6cJT1e7/q1ZRg+Ati0uXoD01PAdVenUS/gcEeKzTySxa3DMNI+7ROwILZQVYBQ8KrQy1+2pvC17qzpsmC5ZEUxTXopCl6Z4D8ywsxS50d2Pn9aS2k+lYJXzW4UvGr8QhG8776nG1tH/fp6FxZMVnAl41SurT9BJCKiTSugRYuscfBH/t2k8Y7wrBUi+ylvS5i02N0M7J+ESwauGW9rGPeKZOsy6y0pmSUtath+gqXMX7KAlZVlsfzrqhPfVvsUxn76CXrFjYLXK5LVn5NJA2NfqvLPPbhzFn1OSO/we7Nn+crlluXP1zl7aqkJXjtLWVXvd5p3Y5IzYsm6i4JXzd4UvGr8QhG8DwxLYc1ac9W1pujcsjWL/fcFDj6oyjdNOqXZ7+mVySOk3KZNMFbFam75WTiKFWn2rGzNm2chX2lffROxPOh3aZSFcPpb0dSuisvANXOWbPOZYtfLyYmrD4nBzbnCGpWayLWbgYLXn0Y5eaqO6TN1Y4JZKMayuL5IimmJAvH7S6PjZ19qgvdf41JGVA8JC3b1FRS7/rTsqqdS8KoRpuBV4xeK4P10oYbnX6gKkJ6vCtI
J1atbXXDmvbdRFk2bAm33z1amVS0WjXWoraZ4lkFGQkMNGpDGnnvGe6VXBq4PPgSeGWuujovIl6xevJwRyCdyrdKHHpxB31Oi45frrFbmXVEWvOMmpIzDXTJBK7XdmNvuKkNFBeAkxqsleOXgWv+zoyPUSknwfr5Yx9NjzAlGFP2h3fwmS+VeCl41S1HwFuA3eeYcXHHjgzvcNefNEahbZ6dQBK/9Y2RF1e4nKn5UCxeZkQJEXFqXuBjIdrp1yeENWR1u1Di3a4JKs7LcGmSL64AOWey5RxYNGmSNFeavv9bwi1/EW+wKO2vgenxUxgiBFGYgfxVb+lH2iy81/PADqgmqdBr431s6fpL4nbYA9fL+mj6573+g48AO2ZL1g46y4P3Xv1OY95EGcQc4/dToCMFC7dA6NOX0d2b1UXRpKEQ2/9+tVMxRY1h8jaJfkoJXzUYUvAX4/XfmB/jzXSPw0ohbq9251x67Q9O00AVvvs+fMVPHf6fqxlbTkEE7+rGpNZvaS7MjrBK8q9dsw/0Pmgc6nKw8+WmXqDz7nvtS+HGjuQ16bM8MJFXsuPHV04VG/eCZCssoC95Ro8tQvgSoWwe44brobPUX4r10mYbJU3SccHwGv3Cwe2Q/ayCr2VFJS1tKK7ylGNGjUDuK+t8peNUsRMHrQPDeet+TmPnvYTnv/Ob7TWoW8Km0xIR99jlzu0l8q4I8KCUrdLI1mmTfVfvAJckPxowNxxY+NS+lx0q4qHETzMxnNS/ZEeh6aLRioypVNkfhKAvepExWZZdgwmt6pXVkN+roHuEL31ISvJbrWinEbPb6NxzW8yh41chT8DoQvFfeOAyn9T4CdevWwaGd2qL30V1QljJ9aKMqeOXbnnqmDF98CRx6SBp9Tw7OjSCq/nFqPxV3pWsOXBJdQFJscnCo4lgzjJRMzq6/pnS20d21iKq7oyx4LR9e+VpZgT+pdwbt2gbXdxTLtJhyNdPSyjPCFr6lJHiZcbOYVqdWhoJXjV9iBe83K7/Da5Nn5aV37pm9UL9eHcxfWI5J095Fo4Y745tV3+OFCVNxTr/jcMOV5xllN2yK7rbfmjXA7fdkjYgId9+mQa9a0FBrNQVKL/4ii4dHAJJp7LJB7mP4+meZLB4AACAASURBVPpxAT28bpkOCVGxZZt5sGr5N8B9D5rC4Q9XaJC0w7yAijQMf96//yNr+KJfNhDYd594txlD1ESw7xg5Gpj/6Y7i9t47tFhHVZF+ctLkLN79oOoXedghQO/jNOMgb1DX5s1ZTJmu4ajDNTRsGL1JxqbNWSz+AlhcDrz3ftXZkROO1XBSr6AoJfs9Vt+RbArF1z6xgnfp8lV4fvyUvOQuH9APDerX2+HvL78+AzfeMxLzJj9hrPJu+Glb8fQDKDnqWWDpsiyu/4OGOnUCeCGANWtFaANNmwA3XhPMO6P2ljo7yexCw9ZtaSz7Chg/Efiy3PzK3sdn0ee4eIs6t/YY9xow439AjyOBfie7LV1a9zdssJMpeCPSd8iBwfuGAStWAdLlDRkIY0L23hwzfGGPI0qLb7Ffawpf4N05NuF7sAhfeC58RTxKKmdZZZaJ3pp1wBfbxeTJvXQcf0y4EUi2bQMmT5e41hqWfgXM/8QUuvZL2sq++2TR43DNWNzg5T8Bq+/w/03xfENiBW+x5pw5ez4uufY+fDDpcdSrWyfSLg3F1lG1nESOuOsecxXrtpuiuwKuWs/aytu3Jq1tYtmy7941i55HZiAZ13hVEZAwZCNHlxlJTYYMjrdbQ9RcGiy/XXFhuOi88P1Yw/5deOHqIH2g+KpLGmP535LUR+KR//STBomVXtt12KHAKSeF22++9Y6GSf/ZsZOystbJv5k6OPiWSpcGNeYUvAX4jRk3GW33+QU67N8a6zf8iD/d9ih2Kkth5NBrjZJR9uFVaxpqpW+6jYIXmmas4kmUBjmgddXlFZE5Da5mXX9K33WPGc3i+msqAj1k6U9t8j81aoLX8sU847S0ETGDl0mgkPDNJ2ola2WhSya/jRuZiYPq1QckTKRcs2brxoqvCMpzzg72sLH9mydP0zF9hukDd+ABWSPyRedOmVj/LgvZLAp/p+BVswIFbwF+9z/2Ap547vXKuzp22Af33ngJ9mzZjIK3FnZDHyyDdPxXXVGRMwWxWrONfmlrhXfCxDQkfI9cMsi1bysHY+IdhaBY61invk/snUH3ruFu6RZbByfloiZ4rckpJ2S5rZdL+Bays/zWWzQHWrY0k//IzkXjxiJwa08tvn5tGUY+k8XatWbYPrcr7hJ9Qt6z7z7F/36scGNSRx6yLWTpYP9OwavGm4LXAb/NW7Zi9ffr0HDnBmjcaJdqJbjCmxvg8MdSxhbegPMr0Lq1A8gxu0UE74y3NYx7xVy5kQHPnmJZBrP27bLo1iXDVd/ttreiezhNHlCqTSZKgtdtwoZSZe7Fd4vwHfW0uVtjTWBziVqVrX7pNyRt/LDHM8Z7pJ+QyBH2S9wk5Nq8qergmKw2b94iyYbMO4uNNmEXu2ecJjGyixfOXjDnM6oToOBVaxEUvGr86NKQh5+kPpYUyId3y6BPr+R1mqtWlBmRKuSyVixlwPzwI32HLHhWAoZWe2UTKX5lsJ47T4cIXgndJpc9KYXiTzRyxYMQvOI3vmIF0LIF0KxZFnv+PJNz4mkl/KA7Q+FmIr/focNMV63/+32F5wfZ5LnWzpAkrBHbSJZGlUt+R3JuoN3+hSfW9pB0FLsq1P0rS8GrxpaCV40fBW8efl8tB0aMNAeHQQPS2NNB9iNFU0Sm+IqVGp4cnTJ88fJtCcohrbnzUihfih1SQB98kJl9LO6XMJgy3YxPbF2yFSxJUqxVtDo7Ab86Q2LBxmfSFITgHfpgCmtrJPaQsIQigJrtaqb8bt4ceORx82AS3RkK/9qsyYGsnvY7zZ+DlTXj8E6eloKmZSXCYeVl+fvK76R+PbOfqFtPM1ym5P9J+vbFi3VXfQvFbmH7R+EOCl41K1DwqvGj4K2F3x1/LcPWbf6thiiarujisiIpYmLL5qzxb9lKlP8ml2w3fvmlhvU/AE5PW8u2cvkSc+XXfsnA2qZ1cOJ3WwWQzQIiMv26ROQu+EyHZOOTA2rWJW4Mx/as8m2umZQiTr6EQQhe4SqsV6zSjQgBNSdWdvvKgaSBA/wRcH61o6Cfa1/d9XNy4HXiCdk1kUyPufoWmUSKW5Vd7F50QRptWsV/sh10+/HqfRS8aiQpeNX4UfDm4WeFmZIVJUltHMa16lsNH83XcMjBGSMmsJPLErMrV5o+cfL/RcTKgLde/u3gBLa8Z982wGWDzSgNTi8RgAsXAXM+1KutegrDNq38PewmQvfvD6QgCQSHDPL2dHgulwVhYoU4qu30t92nsH1bc2UtyDTZTm3n5r6gBG/Nb5L2Je131Srg7Vm64VMuNjj15DSa7eamBsm7N4jVXaHqteC1W0omkTX7Fuvvsjp80fkZhhqLeNOm4FUzEAWvGj8KXhs/ETbvvq/hhx80fLva9McMQ6RYonXadN3wgZOVw6N75l
+1EHH+xn9S+GaFM385K6SQCK8mTczoC3ISW66NG4WBjlNPBI7t6U7w2ptibf6+1mG3+g2Abds0NNzFmxWZBx4qgwTfF5sd3VN98MvnstD5IAlon9unNNfPUVaoXh5vrgiL+L/wvLTjCYziz9uX4mEJXntlLAF3TM8sjukZzoTUF7g+PDSo1V2/Ba+FRuqz8DMd78yWQ25mnxd2GDQfzBbLR1LwqpmVgleNHwXvdn4iMiUyg3WCOBdW6VQl7qSIlt2bAR3aZyRUbVFXRUUWny7UsXGjBHcH1q01A7vnW4GteWrZWnWcOr361ro9PmbjJqaQFZ85CSfUyAgrVPhzN/4EtNi1DFYc3sIlar9DBqhZ75mHuqwByl7iT1elXaUitSYEMiERduXl5r/tLgbW88VWEi/UCq/UplXG8BfMdxLdqcuCWyb2E/Kl7t4QtuC1dl+krV8ysLQnD27bkZv7vyzX8MYkHZu3mj7lfvruWt/l5wpvrrpLcoxxE/TKiBBuw6C54cl71QlQ8KoxpOBV40fBu52fpCe9Z2gKzZvB8AszwuRs1rBiJbByVX4x1f8sdyuJ8lzJDDXjfymk07lXNi3R2rIlIKc47P5rMmjVq5+t5kMqIcNk1XGfvTOGEPfi8mvgynXYrbaIBjIBWLI945Ply5lL2EqdhVtZSsNPm4Hdd6seRi0Xk5piWAZP+wE0Jy4LblhbCRLEXtLGfnlAFrvt5s3qtpvvUL03bMFrJUKJe7xjVTtJEojXJ5kxtGViLpEZZEfHz8uvfqO2b5bJ5HMvmGEk5WLEDj8trPZsCl41fhS8avwoeB3yM9NrmodozNXVqtVKEaEn9i7smyk+aDIAWYLNSG9ZI7B7rpXHfIHjax6UclgVR7cFMXDVPNglArTjgcCGH2GEpLLH/a350SIa27TZvnrbPIMWLUTwAjJxkcFdTvTLJREn5HCe2E1WgVes2J4qdfvgWPO5Iprduiw4AgrASpBgv19s2Ka1BPk3A/tPnKTj6280XH1lBXZu4PTJwd4XpuC14u6G6VsfLO3i3yauNGPGmj+Ek0/MoGsX/yOFBNFv5CPyxpu64dstl9RV6swrWgQoeNXsQcGrxo+CV4Gf/UBSbauUsqr5+qSqFYhihap9JcPvbfEgB66awtduEnHBEDHYooUpCuXfTtwynJjVEsPvzDbjhR5wQAZnn+nfILn4C92IIiHuLB9/omP+J7n9YUR0Xz4k45lvsxMWbu4JU/Baq7tcxavdYtK2R402J9d+9xX2Lwmy38hFQHbPZNIoV5D1dvP7SfK9FLxq1qfgVeNHwavIr2b2InmctWonvrOSkMCejOCk3lmlmKx2/0WJHuHXif+gBy5roBJm3btlDYYqGZ/cmNWauIQxQIpfs6xk290prOxUmbSGzhKho3G03B7CErzTZ+iYPE03Jj5DBvOgWr42bu+Tgk6cE3S/kYuBfWWbCSjc9IT+30vBq8aYgleNHwWvIj8pbj8FnetxsmIn2YJEUHlxjXwyhSXLzAgS/c/2Z+APcuAqX6ph1FNmAoF+fdOGS0GQV5iC115PWZV77gXzAI516SngikvTaNo0WCa18Q9L8Fqru0lN913oNzHzLR3ffS+TJ/OQWhgproPsN2rjYV/pZWzeQi0nuL9T8KqxpuBV40fBq8hPittjXIovr6zWyardggUa9tsvi6OOyHi6EvvtauCfo1LGdmWnjlmcebr3ojeogUvCiD36T7MuYaywiv1GjU4ZNjvhuDSOOiJcYSkuDzJYb9kCfPa5ZvjzSiSCn7cM97vsP5MwBK81qaTvbu4OS9rNzbebmSHlEk5ex6N20lUG1W84+RZrIssYvU5oBXMPBa8aZwpeNX4UvIr8pLi18uRnBqOanymrgVZa1Wv/mMbODbwVREENXPcOLcOGDTBWdWV1N4xr4ps63pmlI2qn/kWEf/GlxGHOQFZ6o3KFIXiDSpwQFcbFfIcxyV6oQYReWG4fQfUbTvlYWdhkAsCQZU6p+XcfBa8aWwpeNX4UvIr8rJPBYWwfzpuvGyuTp56UNjKMeXkFNXANfbDMiD186aB0YD67NTlFxaXBS/v5+awwBG8Yk0o/Gfr1bGsiLBOkW26o8Os1eZ8bVL/hpmISX10mA7lWvTOZLHS9yGDqbj6C9xoEKHjVGgIFrxo/Cl4FfvYoDQMuTKP1Xt6usip8mnLRoAYuKzZt/7PSRmzaMC4KXnfUgxa8ny/W8PSYFHbZJYtrrg5nF8AdofDutlbC27XN4hyf/Ptrq11Q/YYbwhJGcuRTpui1YpZLbO/ypWZ8dTkAKf+dl/8EKHjVGFPwqvGj4C2Sn13snnN2RinyQpGf4GuxoAauKLgTWAdcwvIh9tWQPjw8aMFr2ad714zhdsIrP4GwV8KD6jfctgEruk2ucmH5O7utQxzup+BVsyIFrxo/Ct4i+NnFblzD3gQ1cFksu3fL4MRe4YiZuR9KetJUqH7ERTTD0IoELXitSZFEJeneNW0kGdm0PRNi82bZSPk3h2YUAFFIyhFUv+GE8w8/AJ8s0I2VXfmNyyWuHpKUQtKNt2ubgewwWSu/DHXnhKraPRS8avwoeNX4UfC65PfhPB0vjzcDm0ftkJPLqtR6e1ADl2wtjhydCu2QjUCQgz6SmjTMg3Ne2s7vZwUteMe+mMInC3L7We6/bwbnnhPORMlvzm6fH4WkHEH1G4XYvD9Hx4RXt6db3H5zxwOzOOWk6rHLxd1h+OMpI4xb0DGLC9Uhjn+n4FWzKgWvGj8KXhf8rJVAKRL37e8gBy4r5e7111R4Gr7NqWnf+0DHK6/p2OPnWQy+mD6ihbgFLXi/+x6Y/7GOzVuA8nIz5J+EmpJVum6HZXBwZ/pfRmF1V9pNkP1Gbe3UEv+yK9C6dRYHtM/iZz/L3U7sEW/i6J5W6Pcc5N8peNVoU/Cq8aPgrYWfzP4lU9q69TC2xOSAg1y7Ns3iyt/HWxgFOXBZp6jP7Z/G/vsFL14sH1E/E3ko/kwjVTxowWuvvOWLGUZUlEgZocbHvPivlJGqWrhcdH4aWkiBB4LsN/LZoxjxb/UBYcfslZTjmzYBXQ6N564FBa9aL0LBq8aPgjcHP9nifuW1FH7cmB/uTTdsQ1kqpFFF0eZOigc5cFVGajg7g/Ztg+/oGaXBSYuouidMwfvxJxpe+Fe4LjDuaAVz99JlGp59Xjcm5V0OyeDUk4P/HUVlhXfKdB3Tpus475w09tvX+QTaHrNXYl8f1Ml5Wa+sfN8DKaz/QcNv+2fQdr9wbOhVXXI9h4JXjS4Frxo/Cl4bP1nRnTZDx9uzTN8vme23aQVjS0xWTlq2yBpphNeu07B3m+A7Q0VTuyoehuA9vHsGfU4IvpO3BHecfbJdGb/AzWEK3udfSuHTTzXUrQPccF3wcWa95Oj1s6zV77CSTkRF8Mp3VFQAZVWJ5xyhlsnC6DE6li83FzIkekPL5kDjJlnUqyvjgbm75+culNUX9Tgyi+OPjd8uIgWvo6aY9yYKXjV+FLzb+Ykf17jx5oleuZIufoISvAsX6Rgz1pxghCV4H
/tnykjh2+OoDI4/JnjBrfgTDrx4WILX3lZ+8+sMOrSnrWoaP2x/+KD6DT8bvbhEyCqxHGTLd3XulEXnTmm0bu3dl1jps+WJQWbt9K4GhZ9EwVuYUW13UPCq8aPgBbD6O2DYcHM5QGb1/c/KhJb1S9GcnhUPauCyDpf06ZUxTkmHcVnf0Gw34PIhXDUsZIMwBK+IgUdGpIwt+7gfGC3Ev7a/W/7wA86v8FSMOf2moPoNp99T7H3bKoC339HRbNcsVq02he+69Zqxw7dkaZUQlvFCEuZ065JBkybu32adExE3Ouu5srN4xmnZ2MV2FzoUvO7biL0EBa8av0QI3i++1CAxGSXsVK7Lyk4kHc0Vl2Wwy87xdldw0mSCGListMwyaFx9RXjbd2vWAo9STDlpFsY9QQpe2XkRIfDObM1YcePBwtrNZMUsDmtSEES/4bih+nSjiN4PP9Ixd57ZJq3LcoEQF7iWzTPGhEMii6TTwM4Nqn+MuJ8s+Eyvdhhaxp8WzasEdW1h0kQoS1p52ZFcuVJDkyZZyKJB1C8KXjULUfCq8UuE4L3rnjJIB9GpYxZNm5hiVvyx6tXNYvMWDRMn6Ya/7qUD00XN0hVNEMnifg9c364GHnrEXFWPwvadfbucoYlqb5J+C16xxXsfaJCUwvbrZz8Dfn9JOKHrIvkjzfFRVujEsCYGfvcbUbODCNe581JY9Dnw0087ukA0bAhs2ABcOiiNdeuAJcvMJBhWxB+pj5wPkVXizp0yxrhkRYywC2kJwdemtYxXgERykGfaLyknYR2jflHwqlmIgleNX+wFr8zGx7ygY9Wq6kHIa2ILa0VE0Xy+Ffd74Pp2tYZ/jjJPlUeFvT000TE9qw4q+ga5RB/sl+BdtUrDE0+ZbcK6mjQG2rXLYOcGWXQ9LGscVuOVn8DadcDQB8uMCbzsaIlQslYbg+Dmd78RRB2KfYeMNRIto3yJjvKlqNUHWA4WtmmTxdFHZVG//o47imvWAc9vzwKX73uMg9Qts2ixexadflkaGQcpeIttXWY5Cl41frEWvCJgpk43Z9Oy3XRo5yzS23d9ZKYs/33FClQGsucKb1VjCmLgKl+qYdRTKeOlv7swjVZ7he9K8vyLKXxqy+rVvHkGlw2O/lahYjfgqrhfgtdanZTfaveuVSterj4uwTeL+8dzL+Q/bCUiq2VLYPMm4Ps1wGGHZnFYF2/bdhD9RqmY2HI7GDfBnMT9Ys8s9t1HfHPNiD9OL7GrrA6L+0Imq2Gf1t4elnP6HV7cR8GrRpGCV41frASv+Eo98FAKsiokl3UIQLb3+p1WPaWkHZsVCkYGhAEX5L9PEXVJFQ9i4LIHiL/q8vCC5dsNIxOit97WsfgL03dUws9deF54/sVRbDR+CV5rhb17twxOLAF/xCjZRtxAXh5vTu6lH5NdExFI4mP61XINkq2u5iWrgmf287ZtB9FvRIl7oW/Z8CNw7/2m69ZtN0Xf5aBQfVT/TsGrRpCCV41frASvnKy9++9l2LrVhOL0tKs9n3pYvm+KZvS8eBADlxUd4YzT0qEEea8NWpS/zXNju3ygX4LXOjwq7iTH9PRWiLmsYkndbh3+lI8WN4Z+fXOzk7TMIn533TWLpk3hapXRKZAg+g2n3xKF+7ZtA27/qyl4xY/XzcpuFL7f62+g4FUjSsGrxi9WgtdCYcWidHMYyh726ITj0jjqCOdbToomiGTxIAYuy05RWvn46SfgtYlmmtawo0dEsmH4FKXBWu2XSerFF6Wxe7Oo1j463yUT9TFjU5U7WVGIHR5EvxEdCzj7EityRqeOGZx5urcuJM6+IDp3UfCq2YKCV41fZAWvuCekTPdOV5f4Oz3yeKoosWI/qX/N1WnssktyRW8QA1cUBa+4MYzc7lcsDa9rlwxOPjHZg1TNH6DXK7z2HZYorva76oACuHnLZg3/+rdmnPi3zidEJXZ4EP1GAIg9e4W4NAwbbsaPPv3UNA7unNwxRaBS8Ko1LQpeNX6RFLwSsurhR8twSOcM+p7iTGxInM6WzbN4eULK8FuTPOgyeLq9LH/eY3umcXTP5HZOQQxcURS80l6+/VbDJws0TJ1uRvb4za/S6NAhuW3Bb8FruTJwRd1ZbyWRAJ540lwNEH/d/mdFJ5xiEP2GM0rRuEv6EPlHMrPJOZKkXxS8ai2AgleNXyQF748bNTz4sHmyVTqKo3vU3qGPm5AyYhtalwycv/1NBs13dy9Sps3QMGVaylhdvvmG5B4yCGLgiqrgtdoRD1Hl7ly8XOG1dmTkTW5ckBS7vZIvbk3MxV83X0KdMCoZRL8RRr2Kfac1mYuCu0mxdfCyHAWvGk0KXjV+kRS8UqXPPtfwzHNVPg0ifPfbN4MDD6guYq0ZtPj+Wdt7F51X/IqHBA9/4CFTbCc5AUEQA1cxvtaKzd1Vcdlql6Ql0rauv4arMxY8LwXvm/9N4X9vawbjPidkEr/l67SBWpN8Cl6nxMK5T9yjxE0qrFTP4dQ6/1speNUsQsGrxi+ygleqJQfJps1IGSkc7au3bVoB3bpmjI5EsqTJJeJ07zYZY2W2GN9fO0b7AZqkxub1W/BmMsBf7y3Dli0m+fN/mzZiVEbt4oC1o0W8FLwywZz0HzNNq1wSo/Scszm5KPQ7sA5CRW3l0O9+oxCXqP3dmtRLFjTJhpb0i4JXrQVQ8Krxi7TglarZtzzr1kWlQLJX249O/7F/pvD1N+ZJfZUVY0XzhFY8iIHLHhkjaitVFnhrB+GI7hn0PsGZP3loRgvoxV4KXmtiO+pp0/f++OMy6HEEOecyZTZrpp2tXy+LufN0Y8IflSyF1vcG0W8E1MyVX8MdovyTZWW4CX0ABa+i4b/5fpPiE/wrLmJ31GjTveDwbhn06ZUxBPCHH2lYsNAMqu5Xhy9+xKOf0SuzsA24IGMcEEnKFcTApRJRIyg7WIK3yyEZnHoyhZhw91rwDn8sZfzOGAO79lb9+WIdT4+pniI9akk6gug3gvrtq75H2rS0bbmkbffpVbyrneq3RKU8V3jVLEHBq8Yvsiu8q1cDI0aZ4VzyBVPfshmo6/M2kXU4pEGDLK77Y3K2WoMYuKoSDWSMiUsUL8ulQU7Ct2+XnAlPbbbwUvBaSRNkJ2XIIGY5rI37lq3AlKk6Nm3SUL9BFu/M0o1J+JDB0emXgug3othP5PsmcY+bMr0q3fORh2fQ6/ho9nVBcKXgVaNMwavGL7KCV2Lpygpg2Ks+lihLWlgZvwcucWcYOszMQBTl0/lRjySh+PMvqrhXgteKgiEH1pLqK1+UAbYXuusec0EgSv6hfvcbKrzCKlvzLEqPozI4/phkil4KXrVWSMGrxi+ygtfaSg5zBaNURJliE8hZ3O+B6/mXUvj0UzPsXFTjU366UMPzL5hbkl0Py+DkPskcpGo2EC8Er903P8nRUFR+u9buw+HdJcJFNFZ5/e43VHiFWdY+lgy4oAKtW4X5NeG9m4JXjT0Frxq/yApeqdY/HkpBMq5dfWU4nfmLL6cw/+NoizJF8+ct7vfA9e8JKcz5UEPDhllc+fs0
6uzkV02Kf67lziJPCHunofhaeF9SVfDK4G8dUrN8873/yvg/0dp90jXglhujETPc736jVK16/4Pmocykt3cKXrUWTMGrxi/SgjddAegpQKuKSqZYW3fFX5uoY/Z7pp/cgAuS5V8YxMBlHVaKYjrZJUuAkaPLikpR7a6Vld7dxQpeObW+aTPwxqQUFizSIud/WkqWsK+Qn9s/jf33i4Z/eRD9RinZSb5VQmeK+w4zCTK1sGrbpeBVJBjlKA2KVVMuLgO0bBvKadt8B+eUXxLRBwQxcFluK35F2lBBG2UxrlIvL8q6FbxDH0xh7bodZ60ykZR42gcfFA2x5gWboJ5hrRhG7bcTRL8RFGMv3rN0GfDEk9E/q+BFXZ08gyu8Tijlv4eCV41fpFd4FavmSXH7SoqTNMeevDQCD/F74JKEA8+/aMYSjdJkQuw9a3ZVIoQoHQiKQLMwPsG14B1WZiSRkcNp9etp2Lo1i40/mQJYdnBuSXAK72JsaiXGieKKod/9RjG8wiyT1EPP+ZhT8Kq1RgpeNX4UvA74fTBXw/hXqtIciz9n+3YZHNQpvitTfg9cb72jGxm25PrFnlkMHBCOn7bd/BLY/5Y7yiD/ti4RFcf2jLetHfwEqt3iVvDmer7snpQvER9uYM894vs7csvWyf0TXtPw/gepSCbF8bvfcMInKvdIGx/+uOm7G+VINEHyouBVo03Bq8aPgtchv3xpjk8/NY292zh8SAnd5vfAVVEBPPdCCp8v1hCl4PmTp4rTOPDzFhmIy4W4s8glkxvxNeblfoWXzLwlILsjTz5dlRSn/9kZtGkVjUmD3/2GtyS9fdoHczSsWaujIp1FeblW2XfIWy4dlEbLFtGwkbe1dvc0Cl53vGreTcGrxo+C1yU/Eb5Ll1UFE99v3wzOOyd+4aqCGLii7MNrNQtr+1j+v2y9yxZ80i8vVniTztCL+tujiEh69c6dMqjncyKeQt8dRL9R6BvC+vvtf90J27ZViVpx4WnRHIbbFmNNm1ah4FVrnRS8avwoeIvkd899ZfhxI9C6VRb9+sYvZaTfA5cldgX/mf3S6PTLaK5+yAr002NShp0lUgcvrvBGpQ3Yt8ytb5KDgJIR8KCOGTRpEvyX+t1vBF8j52+UxZC3Z+uyQYQO7TJo3dosO25CCnM/1CLpguK8dt7cScGrxpGCV40fBW+R/D6Yo2P8q6YPquXn2bhRFut+0HBQx2iKNzdV9XPgGvlkCkuWma4CchDw9L7p0ELPFWKSSQNTZ+pou28We+5Z+nYtVF8nf+cKrxNK/t4jE8Z3ZmtGpjVjJXF3VP6mrDdb/VKrvbKBiV8/+w1/ifr7dCtJSNLPBFDwqrUzCl41fhS8CvxkRi9+qJafp32g6X9WpqR9tvwcuB4bkcLGTcAZN6oZrAAAE4lJREFUfdOVqyAKZmDRgAlQ8AYMfPvrJIKIbI+L0JWDUHLJAVrJVGi5MixYqGHhIh0LFsEQw9YlOxQtW2ZRry6wyy5Al0P8ccPys98Ih7o3bxVbjH5Wx/Kvq+zWp1f8dgYL0aLgLUSo9r9T8Krxo+BV5CfF58zVMG9+CtlsFuvWo3IwOql3xogzWooXB65StFow30zBGwxn+1tGjEzhq+VVAlZcF07qXfuEUfzPy5dUhdizP0/K9z/Le8HFfqP2tiE2mTJdrxwj5PyHnANJykXBq2ZpCl41fhS8ivxyFbf8U3scmcXxx5am3ycHLh8aRkweScEbvCFvus1MXiARTeS0vxu3KdmJ+malhh9+EBcIYO68qhXiPidkcHh37wQX+43CbcMe8Sdph9koeAu3j9ruoOBV40fBq8gvX/FMBtBNF9+SvDhwlaTZAvloCt5AMFd7ya13liGdBn7z6ww6tFcTqDVDLB7WJYNTTlR7pvWx7Dect40nRqWw9Ctz1V6ibHQv0d1A5zVmlAY3rHLdS8GrSJCphRUBxrQ4B66YGtaDalHwegDR5SP8COFnhdyTVcarr6jyA3b5adVuZ7/hnN6GDcB/p6SMFXe55EDbheem0bSp82eU2p1c4VWzGAWvGj+u8Cryi2txDlxxtax6vSh41Rm6fcI7s3VMnKR7noa7ZvQA1YgO7DfcWhbGIcPXJ5luJj2OzOD4Y71ZbXf/Jf6XoOBVY0zBq8aPgleRX1yLc+CKq2XV60XBq87Q7ROWLwceH1kGOWw2ZLB35wIkPOCzY1LYsrXqiw7vloVEECjmYr9RDDWzjETY2GfvDOrUqTqcWPzTolmSglfNLhS8avwoeBX5xbU4B664Wla9XhS86gzdPkHOBNz5tzJsqwCuv6bC84xq4t4w50PdCHuma8AtN1a4/UTjfvYbRWFLTCEKXjVTU/Cq8aPgVeQX1+IcuOJqWfV6UfCqMyzmCRPf1PHOLB3H9MwY//hxDX/MjCs+4PyKomJks9/wwyrxeSYFr5otKXjV+FHwKvKLa3EOXHG1rHq9KHjVGRbzBHs6bvG1/d2Fxbkd1PZuS1TLAarjjsmi0y/dCWv2G8VYNjllKHjVbE3Bq8aPgleRX1yLc+CKq2XV60XBq87Q7RM+WaBh7Ispo5hEVWjcCJ768lrfs2QJMPrZMlSkzbCKt/zFnWsD+w23lk3W/RS8avam4FXjR8GryC+uxTlwxdWy6vWi4FVn6PYJt9xZhkwale4M2Syg+Xi26dERkjkSuHSQu1Vk9htuLZus+yl41exNwavGj4JXkV9ci3Pgiqtl1etFwavO0O0TPl+sYeNG4KBOWbdFA72f/UaguEvuZRS8aiaj4FXjR8GryC+uxTlwxdWy6vWi4FVnGNcnsN+Iq2W9qRcFrxpHCl41fhS8ivziWpwDV1wtq14vCl51hnF9AvuNuFrWm3pR8KpxpOBV40fBq8gvrsU5cMXVsur1ouBVZxjXJ7DfiKtlvakXBa8aRwpeNX4UvIr84lqcA1dcLateLwpedYZxfQL7jbha1pt6UfCqcaTgVeNHwavIL67FOXDF1bLq9aLgVWcY1yew34irZb2pFwWvGkcKXjV+FLyK/OJanANXXC2rXi8KXnWGcX0C+424WtabelHwqnGk4FXjR8GryC+uxTlwxdWy6vWi4FVnGNcnsN+Iq2W9qRcFrxpHCl41fhS8ivziWpwDV1wtq14vCl51hnF9AvuNuFrWm3pR8KpxpOBV40fBq8gvrsU5cMXVsur1ouBVZxjXJ7DfiKtlvakXBa8aRwpeNX4UvIr84lqcA1dcLateLwpedYZxfQL7jbha1pt6UfCqcaTgVeNHwavIL67FOXDF1bLq9aLgVWcY1yew34irZb2pFwWvGkcKXjV+FLyK/OJanANXXC2rXi8KXnWGcX0C+424WtabelHwqnGk4FXjR8GryC+uxTlwxdWy6vWi4FVnGNcnsN+Iq2W9qRcFrxpHCl41fhS8ivziWpwDV1wtq14vCl51hnF9AvuNuFrWm3pR8KpxpOBV40fBq8gvrsU5cMXVsur1ouBVZxjXJ7DfiKtlvakXBa8aRwpeNX4UvIr84lqcA1dcLateLwpedYZxfQL7jbha1pt6UfC
qcaTgtfGrSKehazp0XduB6oYff4L8vUmjhtX+9s33m9QswNKxJMCBK5Zm9aRSFLyeYIzlQ9hvxNKsnlWKglcNJQXvdn6bNm/F2YNvwaBzT8UpJ3SvpPrTps249o7HMOWtucZ/69hhHwy74wrs1rSR8f8peNUaYFxLc+CKq2XV60XBq84wrk9gvxFXy3pTLwpeNY4UvAD+/uhYjHp+okHy7hsGVxO8/xzzGl58ZRqeHnYD6terg0uvG4o2e7XE7dcMoOBVa3uxLs2BK9bmVaocBa8SvlgXZr8Ra/MqV46CVw0hBS+Adet/xOatW3HOkNtx9aCzqgneXw28Gb2P7oKBvz3FID1p2ru4+pbh+HjqKGiaxhVetfYX29IcuGJrWuWKUfAqI4ztA9hvxNa0nlSMglcNIwWvjV/v/n/C5QPOqCZ4u5x4Ce649neG6JXr08+W4NeDbsHbrzyMRg13puBVa3+xLc2BK7amVa4YBa8ywtg+gP1GbE3rScUoeNUwxlrwvvLm21i5ek1OQh32b40juhxY7W81BW82m8WBx1yE4X+9Cj27dzLu/WLJ1+h74Q3479j70LL5rhS8au0vtqU5cMXWtMoVo+BVRhjbB7DfiK1pPakYBa8axlgL3mdf/i+Wr1idk9DBv9wPJ/Q4tFbBK3+UFd47r7sYvXqa99Zc4VXDz9IkQAIkQAIkQAIkQAJ+E4i14HULL5dLg/jw9jnmMFx8zsnG42r68Lp9B+8nARIgARIgARIgARIIlgAFL2DE181msjjl/D/jkvP74pTju2OnncoMS4x49lW89Op0I0pDg/p1ccm191eL0hCsufg2EiABEiABEiABEiABtwQoeAEj6oKs3NqvV0f/1RC2G3/ajD/e9ghmzJpn/PnAtm0w7M4rsftujY3/L36+6UwGZalUTvbfrVmPBvXrGWKZFwmQAAnkS2JDMiRAAiRgEaC28L4tUPA6ZLp+w0Zs21ZRmXDCKiYH44aOeBFTXhxa7UnLvl5lrAYvXb7K+O9nnNQDN119AXYqyy2MHX4GbytBAnc//BxGvzip2pd3PnA/PPPQDSVYG35ysQQKJbEp9rksV/oEJs+cgytufHCHisx5cwTq1tmp9CvIGrgmQG3hGlnBAhS8BRHlvkEE7cA//t04FNe8WZMdBO+gP/0du+xcH3deNxArv/0eZw2+FTdddT5O7XV4kW9ksVIl8LeHxuCrb77FNUP6V1ahbt2d0KJZ01KtEr+7CAKFktgU8UgWiQmB/878AH++awReGnFrtRrttcfuRrx3XskhQG3hn60peItkK36/4q4w5X9z8c8xr1YTvLIafPiplxkreLKSJ9ed/3gaK79dY7hD8EoWARG86374EX+7flCyKs7aViNQKIkNcSWXgAjeW+97EjP/PSy5EFhzgwC1hX8NgYJXke3EKbNx7yPPVxO8Vqzeaf96AM12NX19n37pTYyf9NYOM3jF17N4CRAQwfvm9PfQ7eAOaNKoIY498mAc0nH/EvhyfqKXBAolsfHyXXxWaREQwXvljcNwWu8jULduHRzaqa2R7Cjf2ZDSqh2/thgC1BbFUKu9DAVvDT7frPwOr02elZfauWf2Qv16dSr/nqtRzv34c5z7+zsrs7HJzS+8Mg2Pjh6/g+uD9yblE4Mi8MFHn2HO/M9yvk6E7a9O6Wn8TXyxlixfafjifbyoHOKvd/8tQ9D76MOC+lS+J2QCTpLYhPyJfH2IBOYvLDcOThvZO1d9jxcmTMU5/Y7DDVeeF+JX8dVhEqC28J4+BW8NpnLI7PnxU/KSvnxAPyPqgnXVNgub/vI/Kg+5cYXX+8Yb9hNnzv4Ib7//Sc7PaNq4IQb+9pScf7vursexbv0GPHr3H8KuAt8fIAEmsQkQdom/6uXXZ+DGe0Zi3uQnuMpb4rYs9vOpLYoll78cBa8i01yNMpcP7+1DR+Pb79bSh1eRdxyKPzDiJcjq8NPDro9DdVgHhwSYxMYhKN6GmbPn45Jr78MHkx5HvbpVO4pEkxwC1Bbe25qCt0imskVZUZHGG1PfNcKSTRpzLzRdq5yNX/zHe/GzXXY20hIzSkORkGNSbOjjL6Jvr8Ox154tsOiLZbjo/+42MvcNPu/UmNSQ1XBCgElsnFBK5j1jxk1G231+gQ77t8b6DT/iT7c9aoSwHDn02mQCSXCtqS38Mz4Fb5FsF5d/jdMuqh5HVUKOWSfxy5etMOLwStgyuU7vcyRu+cOFlRncinwti5UggbMH32r47lqXtIUbrzqfKzclaEuVTy6UxEbl2Sxb2gTuf+wFPPHc65WV6NhhH9x74yXYs2Wz0q4Yv941AWoL18gcF6DgdYyquBtXrV5rxOPduUGV329xT2KpUiYg2bXWrt+AZrs2qXbosZTrxG8vjkC+JDbFPY2l4kJg85atWP39OjTcuQEaN9olLtViPXwiQG3hHiwFr3tmLEECJEACJEACJEACJFBCBCh4S8hY/FQSIAESIAESIAESIAH3BCh43TNjCRIgARIgARIgARIggRIiQMFbQsbip5IACZAACZAACZAACbgnQMHrnhlLkAAJkAAJkAAJkAAJlBABCt4SMhY/lQRIgARIgARIgARIwD0BCl73zFiCBEiABEiABEiABEighAhQ8JaQsfipJEACJEACJEACJEAC7glQ8LpnxhIkQAIkQAIkQAIkQAIlRICCt4SMxU8lARIgARIgARIgARJwT4CC1z0zliABEiABEiABEiABEighAhS8JWQsfioJkAAJkAAJkAAJkIB7AhS87pmxBAmQAAmQAAmQAAmQQAkRoOAtIWPxU0mABEiABEiABEiABNwToOB1z4wlSIAESIAESIAESIAESogABW8JGYufSgIkQAIkQAIkQAIk4J4ABa97ZixBAiRAAiRAAiRAAiRQQgQoeEvIWPxUEiABEiABEiABEiAB9wQoeN0zYwkSIAEScEVg2MiX8e7chbjzuoux1x67G2UXffEVbh86Gmf3PQan9jrc1fN4MwmQAAmQgDsCFLzuePFuEiABEnBN4Ls169FvwF/QvFlTjHn4L9hWkcavB92MXZs0wsih12KnspTrZ7IACZAACZCAcwIUvM5Z8U4SIAESKJrAnPmf4bzL78JvzzgB6zf8iLff+xjjRt6B3Zo2KvqZLEgCJEACJOCMAAWvM068iwRIgASUCTz14iTc8/BzxnPGPnYzDmzbRvmZfAAJkAAJkEBhAhS8hRnxDhIgARLwhMDM2R/hkmvvN571+jN3o9WezT15Lh9CAiRAAiRQOwEKXrYQEiABEgiAwPIVq9FvwI3oc8xh+OCjRShLpfD8ozejQf26AbydryABEiCBZBOg4E22/Vl7EiCBAAhs2rwV5wy5DalUyji09uWyFTjz4puM6Ax/u35QAF/AV5AACZBAsglQ8Cbb/qw9CZBAAARu/vsovPTqdEx89m7stYfpxvD8+ClGWLJb/3gRfnVKzwC+gq8gARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgAR
IgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQSoOBNru1ZcxIgARIgARIgARJIBAEK3kSYmZUkARIgARIgARIggeQS+H/ldYcCoa21CAAAAABJRU5ErkJggg==" - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "#| caption: Rapidly Exploring Random Tree (RRT) from start to goal. The green points are the start and goal.\n", - "#| label: fig:rrt_example\n", - "fig.show()\n" - ] - }, - { - "cell_type": "markdown", - "id": "eoJqxxWL-_hS", - "metadata": {}, - "source": [ - "You can see that the RRT increasingly \"fills\" the entire area of interest.\n", - "The algorithm terminates when it finds a leaf vertex that is near the goal." + "fig.update_yaxes(range=[-10, 10], autorange=False,scaleratio = 1);\n", + "fig.show()" ] } ], diff --git a/S56_diffdrive_learning.ipynb b/S56_diffdrive_learning.ipynb index 280cc56e..fbe7d040 100644 --- a/S56_diffdrive_learning.ipynb +++ b/S56_diffdrive_learning.ipynb @@ -23,7 +23,7 @@ }, { "cell_type": "code", - "execution_count": 1, + "execution_count": null, "id": "-z7-iMHZamMh", "metadata": { "tags": [ @@ -40,7 +40,7 @@ } ], "source": [ - "%pip install -q -U gtbook\n" + "%pip install -q -U gtbook" ] }, { @@ -114,12 +114,12 @@ "id": "eVunmcqzSi4j", "metadata": {}, "source": [ - "```{index} supervised learning, classification, regression\n", - "```\n", "## Supervised Learning Setup\n", "\n", "> From data, learn concept.\n", "\n", + "```{index} supervised learning, classification, regression, training dataset\n", + "```\n", "In the **supervised learning** setup, we have a large number of examples of inputs $x$ and corresponding labels $y$.\n", "We will often refer to the *training dataset* as $D$, consisting of pairs $(x,y)$. The nature of the output labels $y$ determine the type of learning problem we are dealing with:\n", "\n", @@ -133,6 +133,8 @@ "id": "PiBqLmehLzBj", "metadata": {}, "source": [ + "```{index} training datasets, validation dataset, test dataset, overfitting\n", + "```\n", "Whether we are talking about classification or regression, the supervised leaning process normally follows these steps:\n", "\n", "1. Define a model $f$ and its parameters $\\theta$ that allow you to output a prediction $\\hat{y}$ from the input features $x$:\n", @@ -143,9 +145,9 @@ "\n", "2. Train the model using the training data $D_{\\text{train}}$, while monitoring for \"overfitting\" on the validation dataset $D_{\\text{val}}$. We train by adjusting the parameters $\\theta$ to minimize a training loss, both of which we look at in more detail below.\n", "\n", - "3. 
After we decided to stop the training process, we typically test the model on the held-out dataset $D_{\\text{test}}$ that the training process has never seen, to get an independent assessment on how well the model will generalize towards new, unseen data.\n", + "3. After we decide to stop the training process, we typically test the model on the held-out dataset $D_{\\text{test}}$ that the training process has never seen, to get an independent assessment of how well the model will generalize towards new, unseen data.\n", "\n", - "Supervised learning is the staple of machine learning and its use has exploded in recent years to encompass almost any human economic activity, ranging from finance to healthcare and everything in between. Most recently the success of large language models is also based on supervised learning, where a \"transformer\"-based model is trained to predict the next word (or token) in a sequence, from very large textual datasets, a paradigm which is rapidly finding its way to different modalities like vision as well." + "Supervised learning is the staple of machine learning and its use has exploded in recent years to encompass almost any human economic activity, ranging from finance to healthcare and everything in between. Most recently the success of large language models is also based on supervised learning, where a *transformer*-based model is trained to predict the next word (or token) in a sequence, from very large textual datasets, a paradigm which is rapidly finding its way to different modalities like vision as well." ] }, { @@ -155,7 +157,7 @@ "source": [ "## Example: Interpolation in 1D\n", "\n", - "As an example, e formulate a simple regression problem that asks for interpolating functions in 1D. We will create a *differentiable* interpolation scheme that can be trained using samples from any function we want to interpolate, even functions with multi-dimensional outputs.\n", + "As an example, we formulate a simple regression problem that asks for interpolating functions in 1D. We will create a *differentiable* interpolation scheme that can be trained using samples from any function we want to interpolate, even functions with multi-dimensional outputs.\n", "\n", "The `LineGrid` class below is designed for this purpose, and divides up the 1D interval over which the function is defined in a number of *cells*, arranged in a 1D grid. It is initialized with two parameters:\n", "\n", @@ -251,12 +253,12 @@ "id": "E5lXyKlfEG3I", "metadata": {}, "source": [ - "```{index} mean squared error\n", - "```\n", "## Loss Functions\n", "\n", "> A loss function for every occasion.\n", "\n", + "```{index} mean squared error, loss function\n", + "```\n", "Different tasks require different loss functions, and a lot of creativity and research goes into crafting loss functions for complex tasks. For \"vanilla\" regression tasks, we typically use a **mean squared error** loss function as we already encountered before:\n", "\\begin{equation}\n", "\\mathcal{L}_{\\text{MSE}}(\\theta; D) \\doteq \\frac{1}{|D|} \\sum_{(x,y)\\in D}|f(x;\\theta)-y|^2\n", @@ -288,7 +290,7 @@ "id": "zaJW_r0PV78Z", "metadata": {}, "source": [ - "We used the vectorized versions of subtraction and power above, and then used the `mean` method of tensors. As you can see, the MSE loss in this case is 14.8715. Even though in this case the calculation is simple, many other loss functions exists and might not be that straightforward to implement. 
Luckily, PyTorch has many loss functions built-in:" + "We used the vectorized versions of subtraction and power above, and then used the `mean` method of tensors. As you can see, the MSE loss in this case is 13.045638. Even though in this case the calculation is simple, many other loss functions exist and might not be as straightforward to implement. Luckily, PyTorch has many loss functions built-in:" ] }, { @@ -318,12 +320,14 @@ "source": [ "```{index} cross entropy\n", "```\n", - "For classification, the **cross entropy** loss function is very popular: it measures the average disagreement of the predicted labels with the ground truth labels:\n", + "For classification, the **cross entropy** loss function is very popular.\n", + "It measures the average disagreement of the predicted labels with the ground truth labels:\n", "\\begin{equation}\n", "\\mathcal{L}_{\\text{CE}}(\\theta; D) \\doteq \\sum_c \\sum_{(x,y=c)\\in D}\\log\\frac{1}{p_c(x;\\theta)}\n", "\\end{equation}\n", "\n", - "This formula seems perhaps unintuitive and rather complicated. However, it is actually quite intuitive once you understand a few concepts.\n", + "This formula seems perhaps unintuitive and rather complicated;\n", + "however, it is actually quite intuitive once you understand a few concepts.\n", "In particular, in the multi-class classification problem we assume that the model outputs a probability $p_c(x;\\theta)$ for every class $c\\in[N]$, where $N$ is the number of classes. The quantity \n", "\\begin{equation}\n", "\\log\\frac{1}{p_c(x;\\theta)}\n", @@ -333,7 +337,9 @@ "However, if the probability is only $0.01$, our surprise is $\\log\\frac{1}{0.01}=\\log 100 = 2$.\n", "The lower the probability, the higher the surprise. Hence, the cross-entropy above measures the *average surprise* for seeing the labeled examples in the training data. After training, the model is the least surprised possible, hopefully, which is why it is an intuitive loss function to minimize.\n", "\n", - "Note that training with cross-entropy does not guarantee that the outputs can be *truly* interpreted as probabilities: the recent field of \"model calibration\" has shown that especially neural networks can severely over-estimate those probability values in attempting to minimize the loss. If this interpretation is important for the application at hand, several techniques now exist to \"calibrate\" the models to be more interpretable that way." + "```{index} model calibration\n", + "```\n", + "Note that training with cross-entropy does not guarantee that the outputs can be *truly* interpreted as probabilities: the recent field of *model calibration* has shown that especially neural networks can severely over-estimate those probability values in attempting to minimize the loss. If this interpretation is important for the application at hand, several techniques now exist to *calibrate* the models to be more interpretable that way." ] }, { @@ -341,15 +347,14 @@ "id": "pb2oEJG4Z8Dt", "metadata": {}, "source": [ - "```{index} gradient descent\n", - "```\n", "## Gradient Descent\n", "\n", "> Calculate gradient, reduce loss.\n", "\n", "A neural network output, and in particular a CNN, depends on the large set of continuous weights $W$ that make up its convolutional layers, pooling layers, and fully connected layers. 
In other words, the neural network is the model $f(x;\\theta)$ in the learning setup discussed above, and the weights $W$ are its parameters $\\theta$.\n", "\n", - "\n", + "```{index} gradient descent\n", + "```\n", "When we train a neural networks, we adjust its weights $W$ to perform better on the task at hand, be it classification or regression. To measure whether the model performs \"better\", we can use one of the loss functions defined above. To adjust the weights, we could calculate the gradient of the loss function with respect to each of the weights, and adjust the weights accordingly. That procedure is called **gradient descent**." ] }, @@ -414,18 +419,20 @@ "id": "jyI361D_6bRA", "metadata": {}, "source": [ - "We can then use the PyTorch training code below, which is a standard way of training any differentiable function, including our LineGrid class. That is because all the operations inside the LineGrid class are differentiable, so gradient descent will just work.\n", + "We can then use the PyTorch training code below, which is a standard way of training any differentiable function, including our `LineGrid` class. That is because all the operations inside the `LineGrid` class are differentiable, so gradient descent will just work.\n", "\n", - "Inside the training loop below, you'll find the typical sequence of operations: zeroing gradients, performing a forward pass to get predictions, computing the loss, and doing a backward pass to update the model's parameters. Try to understand the code, as this same training loop is at the core of most deep learning architectures. Now, let's take a closer look at the code itself, which is extensively documented for clarity:" + "Inside the training loop below, you'll find the typical sequence of operations: zeroing gradients, performing a forward pass to get predictions, computing the loss, and doing a backward pass to update the model's parameters. Try to understand the code, as this same training loop is at the core of most deep learning architectures. Now, let's take a closer look at the code itself, which is extensively documented for clarity, and listed in Figure [2](#train_gd)." ] }, { "cell_type": "code", - "execution_count": 9, + "execution_count": null, "id": "pFZvb4Mz458C", "metadata": {}, "outputs": [], "source": [ + "#| caption: Code to train a model using gradient descent.\n", + "#| label: code:train_gd\n", "def train_gd(model, dataset, loss_fn, callback=None, learning_rate=0.5, num_iterations=301):\n", " # Initialize optimizer\n", " optimizer = optim.SGD(model.parameters(), lr=learning_rate)\n", @@ -1982,30 +1989,17 @@ "id": "J1c0z_y-2s6E", "metadata": {}, "source": [ - "Note that gradient descent converges rather slow. You could try experimenting with the learning rate to speed this up. \n", + "The resulting loss function is shown in Figure [2](#fig:loss_training).\n", + "Note that gradient descent converges rather slowly.\n", + "You could try experimenting with the learning rate to speed this up. 
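As an aside on the learning-rate suggestion above, the sketch below runs plain full-batch gradient descent at a few different learning rates and reports the final loss. It is deliberately self-contained: it uses a small stand-in linear model and synthetic data rather than the `LineGrid` interpolator and the `train_gd` helper defined in this section, so the exact numbers are illustrative only.

```python
import torch
import torch.nn as nn
import torch.optim as optim

torch.manual_seed(0)

# Synthetic 1D regression problem (a stand-in for the sin/cos data above).
x = torch.linspace(-1.0, 1.0, 250).unsqueeze(1)
y = 3.0 * x + 1.0 + 0.1 * torch.randn_like(x)

def run_gd(learning_rate, num_iterations=300):
    """Full-batch gradient descent, mirroring the loop in train_gd."""
    model = nn.Linear(1, 1)                    # tiny stand-in model
    loss_fn = nn.MSELoss()
    optimizer = optim.SGD(model.parameters(), lr=learning_rate)
    for _ in range(num_iterations):
        optimizer.zero_grad()                  # reset gradients from the last step
        loss = loss_fn(model(x), y)            # forward pass on the whole dataset
        loss.backward()                        # back-propagate to get gradients
        optimizer.step()                       # one gradient-descent update
    return loss.item()

for lr in [0.05, 0.2, 0.8]:
    print(f"learning rate {lr}: final MSE = {run_gd(lr):.4f}")
```

Within the stable range, a larger learning rate should reach a low loss in fewer iterations, which is the kind of speed-up the text hints at; push it too far and the iterates oscillate or diverge.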
\n", "\n", - "After the training has converged, we can evaluate the resulting functions and plot the result against the training data, and we see that we get decent approximations of sin and cos, even with noisy training data:" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "id": "lMSjElNmAaMt", - "metadata": {}, - "outputs": [], - "source": [ - "x_sorted = torch.sort(x_samples).values\n", - "y_pred = model(x_sorted).detach().numpy()\n", - "fig = plotly.graph_objects.Figure()\n", - "fig.add_scatter(x=x_samples, y=y_samples[:, 0], mode='markers', name='sin')\n", - "fig.add_scatter(x=x_samples, y=y_samples[:, 1], mode='markers', name='cos')\n", - "fig.add_scatter(x=x_sorted, y=y_pred[:, 0], mode='lines', name='predicted sin')\n", - "fig.add_scatter(x=x_sorted, y=y_pred[:, 1], mode='lines', name='predicted cos');\n" + "After the training has converged, we can evaluate the resulting functions and plot the result against the training data,\n", + "and Figure [3](#fig:sin_cos_approx) that we get decent approximations of sin and cos, even with noisy training data." ] }, { "cell_type": "code", - "execution_count": 13, + "execution_count": null, "id": "bo8SalWnHCFO", "metadata": {}, "outputs": [ @@ -4880,7 +4874,14 @@ "source": [ "#| caption: Learned approximation of the sine and cosine functions. The model has learned to fit the data.\n", "#| label: fig:sin_cos_approx\n", - "fig.show()\n" + "x_sorted = torch.sort(x_samples).values\n", + "y_pred = model(x_sorted).detach().numpy()\n", + "fig = plotly.graph_objects.Figure()\n", + "fig.add_scatter(x=x_samples, y=y_samples[:, 0], mode='markers', name='sin')\n", + "fig.add_scatter(x=x_samples, y=y_samples[:, 1], mode='markers', name='cos')\n", + "fig.add_scatter(x=x_sorted, y=y_pred[:, 0], mode='lines', name='predicted sin')\n", + "fig.add_scatter(x=x_sorted, y=y_pred[:, 1], mode='lines', name='predicted cos');\n", + "fig.show()" ] }, { @@ -4888,10 +4889,10 @@ "id": "Cc3kfkWGei-x", "metadata": {}, "source": [ - "```{index} pair: stochastic gradient descent; SGD\n", - "```\n", "## Stochastic Gradient Descent\n", "\n", + "```{index} pair: stochastic gradient descent; SGD\n", + "```\n", "**Stochastic gradient descent** or **SGD** is an approximate gradient descent procedure, to cope with the very large data sets typically thrown at supervised problems. It is typically impossible to calculate the *exact* gradient, which requires looping over all the examples, which can run in the millions. An easy approximation scheme is to *randomly sample* a small subset of the examples, and calculate the gradient of the weights using only those examples. The upside is that this is much faster, but the downside is that this is only approximate. Hence, if we adjust weights with this approximate gradient, we might or might not make progress on the task. This procedure is called stochastic gradient descent, and it works amazingly well in practice.\n", "\n", "The `DataLoader` class in PyTorch makes implementing SGD very easy: it can wrap any `Dataset` instance, and then retrieves training samples one \"mini-batch\" at a time. The code below uses a mini-batch size of 25, but feel free to experiment with different values for both this parameter and the learning rate to get a feel for what happens. Note that by convention we refer to one execution of the inner loop below, over a mini-batch, as an \"iteration\". One full cycle through the dataset by randomly selecting mini-batches is referred to as an \"epoch\"." 
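To make the mini-batch bookkeeping concrete, here is a small, self-contained sketch of the `DataLoader`-driven loop described above. It uses synthetic data (250 samples, mini-batches of 25) and a stand-in model rather than the `LineGrid` model and training code from this section, so treat it as an illustration of the iteration/epoch pattern, not the book's implementation.

```python
import math
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import TensorDataset, DataLoader

torch.manual_seed(0)

# 250 noisy samples of y = sin(x), wrapped in a Dataset.
x = torch.rand(250, 1) * 2 * math.pi
y = torch.sin(x) + 0.1 * torch.randn_like(x)
dataset = TensorDataset(x, y)

# batch_size=25 means 250 / 25 = 10 iterations (mini-batches) per epoch.
loader = DataLoader(dataset, batch_size=25, shuffle=True)

model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
loss_fn = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(3):                              # one epoch = one full pass over the data
    for iteration, (x_batch, y_batch) in enumerate(loader):
        optimizer.zero_grad()                       # reset gradients
        loss = loss_fn(model(x_batch), y_batch)     # loss on this mini-batch only
        loss.backward()                             # approximate gradient from 25 samples
        optimizer.step()                            # one SGD update
    print(f"epoch {epoch}: {iteration + 1} iterations, last mini-batch MSE {loss.item():.3f}")
```

Shuffling re-draws the mini-batches every epoch, so each update is computed from a different random subset of the data, which is exactly the stochastic part of SGD.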
@@ -5905,7 +5906,8 @@ "id": "E61thb3Ae0LD", "metadata": {}, "source": [ - "Note that we converged *much* faster in this case: in just 30 iterations we reached the same low loss as with 300 iterations before. The answer is because with 250 training samples and mini-batches of size 25, each epoch adjusts the model's parameters 10 times. This effectively boosts the learning rate by a factor of 10. However, note that because each mini-batch looks at only one 10th of the dataset, each mini-batch's adjustment could *adversely* affect the performance on the other training samples." + "The training loss is shown in Figure [4](#fig:loss_training_sgd).\n", + "Note that we converge *much* faster in this case: in just 30 iterations we reached the same low loss as with 300 iterations before. The answer is because with 250 training samples and mini-batches of size 25, each epoch adjusts the model's parameters 10 times. This effectively boosts the learning rate by a factor of 10. However, note that because each mini-batch looks at only one 10th of the dataset, each mini-batch's adjustment could *adversely* affect the performance on the other training samples." ] }, {