docs/120-miscellaneous/3d_model_processing/10-cloudcompare.md (1 addition, 1 deletion)

@@ -46,7 +46,7 @@ Used for data processing before meshing and texturing.
 [CC command line mode](https://www.cloudcompare.org/doc/wiki/index.php?title=Command_line_mode) opens a way for scripting of most of the functions available within CC GUI.
 **Scripting is faster than GUI and provides repeatability: prefer scripting over GUI.**
-Example script is available [here](https://mrs.felk.cvut.cz/gitlab/NAKI/naki_postprocessing/blob/master/scripts/pointclouds/processPtxFiles.sh).
+Example script is available [here](https://mrs.fel.cvut.cz/gitlab/NAKI/naki_postprocessing/blob/master/scripts/pointclouds/processPtxFiles.sh).
 Example command merging two `.ptx` files, exporting them to `.ply`, sampling the data to resolution of 2 cm and exporting the sampled data to `.ply`:
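The merge-and-subsample command the context line above introduces can be sketched as follows. This is a hedged sketch, not the guide's exact command: the flag names follow the CloudCompare command-line wiki, and the input file names are placeholders.

```shell
# Sketch of a CloudCompare CLI call: merge two .ptx scans, export the merged
# cloud to .ply, spatially subsample to 2 cm, and export the result as well.
# The command is printed rather than executed here; run it manually where
# CloudCompare is installed.
CMD="CloudCompare -SILENT -AUTO_SAVE OFF \
 -O scan1.ptx -O scan2.ptx -MERGE_CLOUDS \
 -C_EXPORT_FMT PLY -SAVE_CLOUDS \
 -SS SPATIAL 0.02 -SAVE_CLOUDS"
echo "$CMD"
```

`-SS SPATIAL 0.02` is the 2 cm spatial subsampling step; the second `-SAVE_CLOUDS` exports the subsampled cloud in the previously selected `.ply` format.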
docs/120-miscellaneous/3d_model_processing/30-meshlab.md (4 additions, 4 deletions)

@@ -137,8 +137,8 @@ The pointcloud vertex color might not be detailed enough for the whole model or
 #### Create whole model
 
-***This method is not automatized, only proof of concept.** The method assume having precise position of the images w.r.t. the model. The files for this project are available [here](https://nasmrs.felk.cvut.cz/index.php/apps/files/?dir=/shared/3D_models/raster_texture_example).
-* Use either [CloudCompare](https://ctu-mrs.github.io/docs/software/3d_model_processing/cloudcompare.html#extracting-images) or [VoxelizeE57Files](https://mrs.felk.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package to extract images.
+***This method is not automated; it is only a proof of concept.** The method assumes precise positions of the images w.r.t. the model. The files for this project are available [here](https://nasmrs.fel.cvut.cz/index.php/apps/files/?dir=/shared/3D_models/raster_texture_example).
+* Use either [CloudCompare](https://ctu-mrs.github.io/docs/software/3d_model_processing/cloudcompare.html#extracting-images) or the [VoxelizeE57Files](https://mrs.fel.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package to extract images.
 * Open the MeshLab and import the mesh model, you would like to texture with `File->Import Mesh...`. The model do not need parametrization.
 * Import all raster images `File->Import Raster...`
 ***Recommend to save the MeshLab project `.mlp` as much as you can. MeshLab likes to crash.**

@@ -191,7 +191,7 @@ ViewportPx="2048 2048"
 FocalMm="6.141"
 ```
 
-* Values are obtained from both CloudCompare `Camera Sensor` output and [VoxelizeE57Files](https://mrs.felk.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package running with `--only-export-images` option.
+* Values are obtained from both the CloudCompare `Camera Sensor` output and the [VoxelizeE57Files](https://mrs.fel.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package running with the `--only-export-images` option.
 * Copy the whole `.xml` snippet and select `Windows->Paste clipboard to camera setting`.
 * You should be able to see the change in the camera perspective.
 * If you copy again the current value of the camera with `Windows->Copy camera settings to clipboard`, the values are different since MeshLab recalculates the imported values. Values are correct. Here is the example of such a snippet

@@ -230,7 +230,7 @@ FocalMm="6.141"
 ```
 
 * These values represent the Leica BLK 360 camera sensor as described in the previous chapter even the values do not match. MeshLab recalculates the values for its own projection.
-* To correct the camera sensor position and orientation, check the [extract sensor position](https://ctu-mrs.github.io/docs/software/3d_model_processing/cloudcompare.html#extracting-sensor-positions) guide or [VoxelizeE57Files](https://mrs.felk.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package. *Note: Not sure if VoxelizeE57Files gives the same transformation.*
+* To correct the camera sensor position and orientation, check the [extract sensor position](https://ctu-mrs.github.io/docs/software/3d_model_processing/cloudcompare.html#extracting-sensor-positions) guide or the [VoxelizeE57Files](https://mrs.fel.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package. *Note: it is not certain that VoxelizeE57Files gives the same transformation.*
 * Correct the values in `cameras.xml` configuration file for the `cameras` tag for each image as below:
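For orientation, the snippet exchanged via `Windows->Copy camera settings to clipboard` / `Paste clipboard to camera setting` has roughly the following shape. This is an illustrative sketch only: the structure follows MeshLab's VCGCamera clipboard format, `ViewportPx` and `FocalMm` are the values quoted in the diff above, and every other attribute value is a placeholder to be filled from the CloudCompare `Camera Sensor` output or VoxelizeE57Files.

```xml
<!DOCTYPE ViewState>
<project>
  <!-- placeholder pose: identity rotation, camera at the origin -->
  <VCGCamera TranslationVector="0 0 0 1"
             RotationMatrix="1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1"
             ViewportPx="2048 2048"
             CenterPx="1024 1024"
             PixelSizeMm="0.00345 0.00345"
             FocalMm="6.141"
             LensDistortion="0 0"/>
</project>
```

As the diff notes, MeshLab recalculates these values into its own projection after pasting, so copying the camera back out yields different (but equivalent) numbers.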
docs/120-miscellaneous/3d_model_processing/99-export.md (2 additions, 2 deletions)

@@ -26,8 +26,8 @@ Most of the times, the models are saved in following formats
 ## Web
 
-* The basic guide will be described here. Please follow the [original guide](https://mrs.felk.cvut.cz/gitlab/bednaj14/meshlab/blob/master/modely_report.pdf) and the scripts inside for detailed information.
-* Download the [obj2optimizedGlb.sh](https://mrs.felk.cvut.cz/gitlab/bednaj14/meshlab/blob/master/obj2optimizedGlb.sh) script.
+* Only the basic guide is described here. Please follow the [original guide](https://mrs.fel.cvut.cz/gitlab/bednaj14/meshlab/blob/master/modely_report.pdf) and the scripts inside for detailed information.
+* Download the [obj2optimizedGlb.sh](https://mrs.fel.cvut.cz/gitlab/bednaj14/meshlab/blob/master/obj2optimizedGlb.sh) script.
 * Recommend to install `NodeJS` with [snapcraft](https://snapcraft.io/node) tool. The `apt` version for `Ubuntu` does not contain the up-to-date version.
 * The convert tool needs to have max *16384x16384* texture. Otherwise it will not work.
 **Note: In general, it is better to have several lower size texture files than a single large one.*
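The conversion these bullets describe can be sketched like this. A hedged sketch only: `obj2optimizedGlb.sh` remains the authoritative script, and the `obj2gltf`/`gltf-pipeline` NodeJS tools named here are assumptions about the toolchain, not confirmed by the guide.

```shell
# Compose (and print, rather than run) a plausible obj -> optimized .glb
# pipeline: convert the OBJ plus its MTL/textures to binary glTF, then
# Draco-compress the meshes for web delivery. File names are placeholders.
CONVERT="npx obj2gltf -i model.obj -o model.glb"
OPTIMIZE="npx gltf-pipeline -i model.glb -o model_opt.glb -d"
echo "$CONVERT"
echo "$OPTIMIZE"
```

Both tools run under the up-to-date `NodeJS` the bullets recommend installing via snap; keep each texture within the *16384x16384* limit or the conversion fails.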
docs/120-miscellaneous/latex.md (1 addition, 1 deletion)

@@ -11,7 +11,7 @@ description: Syncing overleaf with a git repository
 ***Note:** you must login with an email and password and not through a Google account. It is needed to push to the overleaf afterward since it doesn't support SSH keys by default.
 2. Get git link:
 * Once you've created a project, go to the menu, in the section called ```Sync``` you can find ```Git```. Go there.
-* If you have created a new account, this feature will be paid only. For this purpose you have to use our shared credentials, see on this page "Overleaf credentials" [here](http://mrs.felk.cvut.cz/internal), use them to log in and get GitLab link.
+* If you have created a new account, this feature is paid-only. In that case, use our shared credentials (see "Overleaf credentials" [here](http://mrs.fel.cvut.cz/internal)), log in with them, and get the Git link.
 * This will open a pop-up window with ``git clone https://git.overleaf.com/<your overleaf project link>``. Copy the ``https...`` link.
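The sync workflow these steps set up can be sketched as follows; the URL keeps the guide's placeholder, which you replace with the link copied from the pop-up window.

```shell
# Print the Overleaf git commands implied by the steps above; the project
# URL placeholder must be substituted before running them for real.
URL="https://git.overleaf.com/<your overleaf project link>"
echo "git clone $URL"        # local copy of the Overleaf project
echo "git pull && git push"  # sync local edits with Overleaf afterwards
```

Since Overleaf's git bridge uses HTTPS (no SSH keys by default), git prompts for the email/password login noted above on every push unless a credential helper is configured.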
docs/50-features/20-trackers/index.md (1 addition, 1 deletion)

@@ -26,7 +26,7 @@ This page is describing the upcoming ROS2 version of the MRS UAV System (however
 * "MPC tracker"
 * the main *workhorse* of the [MRS UAV System](https://github.com/ctu-mrs/mrs_uav_system), it is used for most of the regular flying
 * based on a unique *realtime simulated Model Predictive Control* approach
-* originally published in: `Baca, et al., "Model Predictive Trajectory Tracking and Collision Avoidance for Reliable Outdoor Deployment of Unmanned Aerial Vehicles", IROS 2018`, [link](http://mrs.felk.cvut.cz/data/papers/baca-mpc-tracker.pdf)
+* originally published in: `Baca, et al., "Model Predictive Trajectory Tracking and Collision Avoidance for Reliable Outdoor Deployment of Unmanned Aerial Vehicles", IROS 2018`, [link](http://mrs.fel.cvut.cz/data/papers/baca-mpc-tracker.pdf)
 * produces feasible reference which is smooth up to snap and satisfies given state constraints.
 * can smoothly track trajectories
 * can efficiently stop a UAV from any previous motion
docs/60-simulations/30-FlightForge/01-installation.md (1 addition, 1 deletion)

@@ -19,7 +19,7 @@ The FlightForge simulator is composed of two main parts - the simulator itself e
 ## FlightForge Simulator
 
-The prebuild binaries of the simulator can be downloaded from the [here](https://nasmrs.felk.cvut.cz/index.php/s/MnGARsSwnpeVy5z).
+The prebuilt binaries of the simulator can be downloaded from [here](https://nasmrs.fel.cvut.cz/index.php/s/MnGARsSwnpeVy5z).
 The simulator is available for Linux and Windows, the simulator can be run by running the binary file `mrs_flight_forge.sh` in the downloaded archive.
 The binary provides the simulator as a standalone application that can be run without the need of Unreal Engine.
 However, if you wish to create a custom environment or modify the simulator you can download our Unreal Engine plugin from [flight_forge repository](https://github.com/ctu-mrs/flight_forge/), and place it in the `Plugins` folder of your Unreal Engine project.
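The download-and-run steps can be sketched as follows. A hedged sketch: the archive name and the `/download` suffix are assumptions about the share link; only the share URL and the `mrs_flight_forge.sh` binary name come from the text.

```shell
# Print the commands for fetching and starting the prebuilt simulator;
# run them manually and adjust the archive name to what the share provides.
SHARE="https://nasmrs.fel.cvut.cz/index.php/s/MnGARsSwnpeVy5z"
echo "wget -O flight_forge.zip '$SHARE/download'"  # assumed direct-download form of the share link
echo "unzip flight_forge.zip -d flight_forge"
echo "./flight_forge/mrs_flight_forge.sh"          # standalone; no Unreal Engine needed
```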
versioned_docs/version-1.5.0/120-miscellaneous/3d_model_processing/10-cloudcompare.md (1 addition, 1 deletion)

@@ -46,7 +46,7 @@ Used for data processing before meshing and texturing.
 [CC command line mode](https://www.cloudcompare.org/doc/wiki/index.php?title=Command_line_mode) opens a way for scripting of most of the functions available within CC GUI.
 **Scripting is faster than GUI and provides repeatability: prefer scripting over GUI.**
-Example script is available [here](https://mrs.felk.cvut.cz/gitlab/NAKI/naki_postprocessing/blob/master/scripts/pointclouds/processPtxFiles.sh).
+Example script is available [here](https://mrs.fel.cvut.cz/gitlab/NAKI/naki_postprocessing/blob/master/scripts/pointclouds/processPtxFiles.sh).
 Example command merging two `.ptx` files, exporting them to `.ply`, sampling the data to resolution of 2 cm and exporting the sampled data to `.ply`:
versioned_docs/version-1.5.0/120-miscellaneous/3d_model_processing/30-meshlab.md (4 additions, 4 deletions)

@@ -137,8 +137,8 @@ The pointcloud vertex color might not be detailed enough for the whole model or
 #### Create whole model
 
-***This method is not automatized, only proof of concept.** The method assume having precise position of the images w.r.t. the model. The files for this project are available [here](https://nasmrs.felk.cvut.cz/index.php/apps/files/?dir=/shared/3D_models/raster_texture_example).
-* Use either [CloudCompare](https://ctu-mrs.github.io/docs/software/3d_model_processing/cloudcompare.html#extracting-images) or [VoxelizeE57Files](https://mrs.felk.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package to extract images.
+***This method is not automated; it is only a proof of concept.** The method assumes precise positions of the images w.r.t. the model. The files for this project are available [here](https://nasmrs.fel.cvut.cz/index.php/apps/files/?dir=/shared/3D_models/raster_texture_example).
+* Use either [CloudCompare](https://ctu-mrs.github.io/docs/software/3d_model_processing/cloudcompare.html#extracting-images) or the [VoxelizeE57Files](https://mrs.fel.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package to extract images.
 * Open the MeshLab and import the mesh model, you would like to texture with `File->Import Mesh...`. The model do not need parametrization.
 * Import all raster images `File->Import Raster...`
 ***Recommend to save the MeshLab project `.mlp` as much as you can. MeshLab likes to crash.**

@@ -191,7 +191,7 @@ ViewportPx="2048 2048"
 FocalMm="6.141"
 ```
 
-* Values are obtained from both CloudCompare `Camera Sensor` output and [VoxelizeE57Files](https://mrs.felk.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package running with `--only-export-images` option.
+* Values are obtained from both the CloudCompare `Camera Sensor` output and the [VoxelizeE57Files](https://mrs.fel.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package running with the `--only-export-images` option.
 * Copy the whole `.xml` snippet and select `Windows->Paste clipboard to camera setting`.
 * You should be able to see the change in the camera perspective.
 * If you copy again the current value of the camera with `Windows->Copy camera settings to clipboard`, the values are different since MeshLab recalculates the imported values. Values are correct. Here is the example of such a snippet

@@ -230,7 +230,7 @@ FocalMm="6.141"
 ```
 
 * These values represent the Leica BLK 360 camera sensor as described in the previous chapter even the values do not match. MeshLab recalculates the values for its own projection.
-* To correct the camera sensor position and orientation, check the [extract sensor position](https://ctu-mrs.github.io/docs/software/3d_model_processing/cloudcompare.html#extracting-sensor-positions) guide or [VoxelizeE57Files](https://mrs.felk.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package. *Note: Not sure if VoxelizeE57Files gives the same transformation.*
+* To correct the camera sensor position and orientation, check the [extract sensor position](https://ctu-mrs.github.io/docs/software/3d_model_processing/cloudcompare.html#extracting-sensor-positions) guide or the [VoxelizeE57Files](https://mrs.fel.cvut.cz/gitlab/NAKI/naki_postprocessing/tree/master) package. *Note: it is not certain that VoxelizeE57Files gives the same transformation.*
 * Correct the values in `cameras.xml` configuration file for the `cameras` tag for each image as below: