mirror of https://gitlab.com/orbital-debris-research/directed-study/final-report.git
synced 2025-06-16 15:17:18 +00:00

commit 6904e7e894 (parent 1b9d9e5352): finish content i think

.vscode/ltex.dictionary.en-US.txt (vendored): 2 additions

```diff
@@ -1 +1,3 @@
 DebriSat
+GrabCAD
+dataframe
```

report.qmd: 73 changes

@@ -58,7 +58,7 @@
only real way to hit the order of magnitude required is to have the only part of the process that
isn't fully automated by code be taking the scans. My semester plan was to do most of the
processing of the mesh in CATIA, which proved to take far too long and isn't scalable.

## stlProcess Algorithm Breakdown

The algorithm for processing the 3D meshes is implemented in the Julia programming language.
Syntactically, the language is very similar to Python and Matlab, but it was chosen because it is nearly
@@ -126,12 +126,12 @@
The algorithm begins by taking the eigenvectors $\lambda$, and making vectors from the center of
gravity to every point that makes up the mesh, $\bar{P_i}$. Then the angle, $\theta$, between the
eigenvectors and all the point vectors needs to be calculated as follows:

$$\theta_i = \arccos\left(\frac{\lambda \cdot P_i}{\| \lambda \| \| P_i \|}\right)$$

For almost every mesh there won't be a point that has a direct intersection with the eigenvector,
i.e. $\theta \approx 0, \pi$. To deal with this, the 3 points with the smallest angle can be turned
into a plane, then the intersection of the eigenvector and the plane can be calculated^[I won't go
into depth on this math, but you can find a great explanation here [@plane_line]]. This process then
needs to be repeated for the 3 largest angles to get a second intersection point, so that the
distance between the two points can be used in the characteristic length. The characteristic length,
$L_c$, can then be calculated simply by taking the average of the distances found for each
eigenvector.
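
As a concrete illustration of the angle step, here is a minimal sketch (not the actual `stlProcess`
internals), assuming the mesh vertices have already been shifted so the center of gravity sits at
the origin:

```julia
using LinearAlgebra

# Angle between one eigenvector λ and every centered mesh point Pᵢ.
# clamp guards acos against floating-point results just outside [-1, 1].
point_angles(λ, points) = [acos(clamp(dot(λ, p) / (norm(λ) * norm(p)), -1, 1)) for p in points]

λ = [0.0, 0.0, 1.0]                 # example eigenvector
points = [randn(3) for _ in 1:1000] # stand-in for centered mesh vertices
θ = point_angles(λ, points)

nearest = partialsortperm(θ, 1:3)            # 3 smallest angles, one end of the mesh
farthest = partialsortperm(θ, 1:3; rev=true) # 3 largest angles, the opposite end
```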
@@ -141,52 +141,69 @@ $$L_c = \frac{\mathbf{A} + \mathbf{B} + \mathbf{C}}{3}$$

### Solid Body Values

The current algorithm just takes all the points that make up the mesh, finds the `minmax` for each
axis, then averages them as in the figure below. No consideration is taken for the orientation of
the model, but I believe that shouldn't matter. This is definitely an algorithm that deserves a
revisit.

![Calculation of Solid Body Values, @hillMeasurementSatelliteImpact [slide 9]](Figures/solid_body.png)
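
A minimal sketch of my reading of that figure (an assumption, not necessarily DebriSat's exact
definition), with the mesh points as an $N \times 3$ matrix:

```julia
# Average of the axis-aligned extents (per-axis max minus min) of the points.
solid_body_dim(P) = sum(maximum(P; dims=1) .- minimum(P; dims=1)) / 3

solid_body_dim(rand(1000, 3))  # ≈ 1.0 for points filling a unit cube
```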

### `find_scale` and `fast_volume` Functions

These functions are used together to scale the models for machine learning. When performing machine
learning on a dataset with properties of varying magnitudes, it is important to do some sort of
scaling. Scaling data can be more of an art than a science, but it was determined that scaling by
volume was optimal for many reasons, mainly because the algorithm to calculate the volume is one of
the fastest. The `fast_volume` function is just a modified version of the main volume function with
all the extra calculations required for center of gravity and inertia removed. The `find_scale`
function uses the `fast_volume` function and a derivative-based root-finding algorithm to find a
scaling factor that produces a volume of 1 for a model. This process, while iterative, works
incredibly fast, and with the current dataset it only adds about 20% to the computation time needed
to calculate all the properties for a mesh.
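
The real implementations live in `stlProcess`, so the following is only a sketch of the idea:
Newton's method stands in for the derivative root-finding step, and `volume` is a placeholder for
`fast_volume`. (Since volume scales with the cube of a linear scale factor, the closed form
$s = V_0^{-1/3}$ would also work.)

```julia
# Hypothetical sketch: find s such that volume(s .* verts) ≈ 1 using Newton's
# method with a forward-difference derivative. `verts` is assumed to be a
# vector of 3-element vertex coordinates.
function find_scale_sketch(volume, verts; tol=1e-9, h=1e-6, maxiter=50)
    s = 1.0
    for _ in 1:maxiter
        v = volume(s .* verts)                   # volume at the current scale
        abs(v - 1.0) < tol && break
        dv = (volume((s + h) .* verts) - v) / h  # numeric derivative dV/ds
        s -= (v - 1.0) / dv                      # Newton step on f(s) = V(s) - 1
    end
    return s
end
```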

## stlProcess Workflow

Using Julia to process the meshes and make a `csv` to import into Matlab is a relatively
straightforward process. The hardest part is getting up and running in Julia, for which I'd
recommend the following resources:

- [Julia Getting Started Docs](https://docs.julialang.org/en/v1/manual/getting-started/)
- [Syntax Cheat Sheet](https://cheatsheets.quantecon.org/)
- [Julia for Matlabbers](https://projects.ansonbiggs.com/posts/2021-08-24-julia-for-matlabbers/)

### Imports

Unlike Matlab, which makes all installed toolboxes available by default, in Julia packages have to
be brought into scope.

```{julia}
# For reading stl files
using FileIO
using MeshIO

# The custom made library discussed above
using stlProcess

# For making a csv of the data to be imported into matlab
using CSV
using DataFrames

# For calculating eigenvectors
using LinearAlgebra
```

### `loop` Setup

To set up for the loop that processes all of the `stl` files, a dataframe has to be initialized and
the path to the folder containing all the `stl` files needs to be defined.

```{julia}
#| output: false

# local path to https://gitlab.com/orbital-debris-research/fake-satellite-dataset
dataset_path = raw"C:\Coding\fake-satellite-dataset"

folders = ["1_5U", "assembly1", "cubesat"]

# Columns for every piece of data you want to export
df = DataFrame(;
    surface_area=Float64[],
    characteristic_length=Float64[],
```

@@ -201,6 +218,10 @@ df = DataFrame(;

### The `loop`

The actual loop is quite simple: it first iterates through all the subfolders in the main dataset
folder, then iterates through all the files in each subfolder. The properties returned for each
file, as shown above, then need to be processed and added to the dataframe.

```{julia}
#| output: false
```
@@ -225,6 +246,8 @@ end
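
The loop body itself falls outside the diff context above; a hypothetical sketch of the described
iteration might look like the following, where `mesh_properties` is a stand-in name rather than the
actual `stlProcess` call:

```julia
for folder in folders
    for file in readdir(joinpath(dataset_path, folder); join=true)
        endswith(file, ".stl") || continue
        mesh = load(file)             # FileIO + MeshIO handle the stl parsing
        props = mesh_properties(mesh) # hypothetical stand-in for the properties call
        push!(df, props)              # one row per part
    end
end
```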

### Output

A summary of the dataframe can then be generated after the dataset is created.

:::{.column-body-outset}

```{julia}
describe(df)
```

:::

Finally, the dataset can be saved to a `csv` file for Matlab to import.

```julia
CSV.write("scaled_dataset.csv", df)
```

## Gathering Data

To get started on the project before any scans of the actual debris are made available, I opted to
@@ -262,8 +291,8 @@
and surface body dimensions in a few milliseconds per part. The library can be found
[here.](https://gitlab.com/MisterBiggs/stl-process) The characteristic length is a value that is
heavily used by the NASA DebriSat project [@DebriSat2019], which is doing very similar work to this
project. The characteristic length takes the maximum orthogonal dimensions of a body, sums them,
then divides by 3 to produce a single scalar value that can be used to get an idea of the size of a
3D object.

![Current `stl-process` workflow](Figures/stl-process.svg)