finish content i think

Anson 2022-05-01 23:12:49 -07:00
parent 1b9d9e5352
commit 6904e7e894
2 changed files with 53 additions and 22 deletions


@@ -1 +1,3 @@
DebriSat
GrabCAD
dataframe


@@ -58,7 +58,7 @@ only real way to hit the order of magnitude required is to have the only part of
isn't fully automated by code to be taking the scans. The plan for my semester was to do most of the
processing of the mesh in CATIA, which proved to take far too long and isn't scalable.
## stlProcess Algorithm Breakdown
The algorithm for processing the 3D meshes is implemented in the Julia programming language.
Syntactically the language is very similar to Python and Matlab, but was chosen because it is nearly
@@ -126,12 +126,12 @@ The algorithm begins by taking the eigenvectors $\lambda$, and making vectors from the center of
gravity to every point that makes up the mesh, $\bar{P_i}$. Then the angle, $\theta$, between the
eigenvectors and all the point vectors needs to be calculated as follows:
$$\theta_i = \arccos\left(\frac{\lambda \cdot \bar{P_i}}{\| \lambda \| \| \bar{P_i} \|}\right)$$
For almost every mesh there won't be a point that has a direct intersection with the eigenvector,
i.e. $\theta \approx 0$ or $\pi$. To deal with this, the 3 points with the smallest angles can be
turned into a plane, and the intersection of the eigenvector and the plane can be calculated^[I
won't go in depth on this math, but you can find a great explanation here [@plane_line]]. This
process then needs to be repeated for the 3 largest angles so that two intersection points are
found, and the distance between them can be used for the characteristic length. The characteristic
length, $L_c$, can then be calculated simply by taking the average of the distances found for each
eigenvector.
@@ -141,52 +141,69 @@ $$L_c = \frac{\mathbf{A} + \mathbf{B} + \mathbf{C}}{3}$$
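To make the angle step concrete, here is a rough sketch of the calculation described above. The eigenvector and mesh points are made-up placeholders, and this is not the actual stlProcess implementation, just the formula written out:
```julia
using LinearAlgebra

# Placeholder data: one eigenvector of the inertia tensor and mesh points
# already shifted so the center of gravity sits at the origin.
λ = [0.0, 0.0, 1.0]
points = [randn(3) for _ in 1:1_000]

# Angle between the eigenvector and every point vector; clamp guards against
# arccos arguments drifting just outside [-1, 1] from floating-point roundoff.
θ = [acos(clamp(dot(λ, P) / (norm(λ) * norm(P)), -1.0, 1.0)) for P in points]

# The 3 points with the smallest angles define the plane intersected on one
# side of the mesh, and the 3 largest angles do the same on the far side.
near_points = points[partialsortperm(θ, 1:3)]
far_points  = points[partialsortperm(θ, 1:3; rev=true)]
```
The eigenvector is then intersected with the plane through each set of three points, and the distance between those two intersection points is what gets averaged into $L_c$.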
### Solid Body Values
The current algorithm just takes all the points that make up the mesh, then finds the `minmax` for
each axis, then averages them as in the figure below. No consideration is taken for the orientation
of the model, but I believe that shouldn't matter. This is definitely an algorithm that deserves a
revisit.
![Calculation of Solid Body Values, @hillMeasurementSatelliteImpact [slide 9]](Figures/solid_body.png)
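The figure leaves some room for interpretation, but one way to read the calculation is as a per-axis extent that gets averaged into a single value. The points below are placeholders rather than a loaded mesh, and this is only a sketch of that reading:
```julia
# Placeholder mesh points; in practice these come from the loaded stl mesh.
points = [randn(3) for _ in 1:1_000]

# Extent (max - min) of the point cloud along each axis...
extents = map(1:3) do axis
    lo, hi = extrema(p[axis] for p in points)
    hi - lo
end

# ...then the average of the three extents as the single solid body value.
solid_body = sum(extents) / 3
```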
### `find_scale` and `fast_volume` Functions
These functions are used together to scale the models for machine learning. When performing machine
learning on a dataset with properties of varying magnitudes, it is important to do some sort of
scaling. Scaling data can be more of an art than a science, but scaling by volume was determined to
be optimal, mainly because the algorithm to calculate the volume is one of the fastest. The
`fast_volume` function is just a modified version of the main volume function with all the extra
calculations required for the center of gravity and inertia removed. The `find_scale` function uses
the `fast_volume` function and a derivative-based root-finding algorithm to find a scaling factor
that scales a model to a volume of 1. This process, while iterative, is incredibly fast and with the
current dataset only adds about 20% to the time needed to compute all the properties for a mesh.
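A minimal sketch of the idea, assuming a stand-in `mesh_volume` function in place of `fast_volume` (the usual signed tetrahedron volume of a closed mesh) and a plain Newton iteration for the root finding; the actual `find_scale` implementation may differ:
```julia
using LinearAlgebra

# Stand-in for `fast_volume`: signed tetrahedron volume summed over the faces
# of a closed mesh, with no center of gravity or inertia bookkeeping.
# Each face is a tuple of three vertex coordinate vectors.
mesh_volume(faces) = sum(dot(a, cross(b, c)) for (a, b, c) in faces) / 6

# Newton iteration for a scale factor s that gives the scaled mesh unit volume.
# Volume scales with s^3, which provides the analytic derivative below.
function find_scale_sketch(faces; tol=1e-10, maxiter=50)
    V0 = mesh_volume(faces)
    s = 1.0
    for _ in 1:maxiter
        f = V0 * s^3 - 1.0          # residual: how far from unit volume
        abs(f) < tol && return s
        s -= f / (3 * V0 * s^2)     # Newton step
    end
    return s
end

# A tetrahedron with one corner at the origin: the three faces touching the
# origin contribute zero volume, so a single triangle suffices here.
faces = [([1.0, 0, 0], [0.0, 1, 0], [0.0, 0, 1])]
mesh_volume(faces)        # 1/6
find_scale_sketch(faces)  # ≈ 6^(1/3) ≈ 1.82, since 1.82^3 * 1/6 ≈ 1
```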
## stlProcess Workflow
Using Julia to process the meshes and make a `csv` to import into Matlab is a relatively
straightforward process. The hardest part is getting up and running in Julia, for which I'd
recommend the following resources:
- [Julia Getting Started Docs](https://docs.julialang.org/en/v1/manual/getting-started/)
- [Syntax Cheat sheet](https://cheatsheets.quantecon.org/)
- [Julia for Matlabbers](https://projects.ansonbiggs.com/posts/2021-08-24-julia-for-matlabbers/)
### Imports
Unlike Matlab, which makes all installed toolboxes available by default, in Julia packages have to
be brought into scope.
```{julia}
# For reading stl files
using FileIO
using MeshIO
# The custom made library discussed above
using stlProcess
# For making a csv of the data to be imported into matlab
using CSV
using DataFrames
# For calculating eigenvectors
using LinearAlgebra
using Statistics
```
### `loop` Setup
To set up for the loop that processes all of the `stl` files, a dataframe has to be initialized and
the path to the folder containing all the `stl` files needs to be defined.
```{julia}
#| output: false
# local path to https://gitlab.com/orbital-debris-research/fake-satellite-dataset
dataset_path = raw"C:\Coding\fake-satellite-dataset"
folders = ["1_5U", "assembly1", "cubesat"]
# Columns for every piece of data you want to export
df = DataFrame(;
    surface_area=Float64[],
    characteristic_length=Float64[],
@@ -201,6 +218,10 @@ df = DataFrame(;
### The `loop`
The actual loop is quite simple: it first iterates through all the subfolders in the main dataset
folder, then iterates through all the files in each subfolder. The properties returned for each
file, shown above, then need to be processed and added to the dataframe.
```{julia}
#| output: false
@@ -225,6 +246,8 @@ end
### Output
A summary of the dataframe can then be generated after the dataset is created.
:::{.column-body-outset}
```{julia}
@@ -235,6 +258,12 @@ describe(df)
:::
Finally, the dataset can be saved to a `csv` file for Matlab to import.
```julia
CSV.write("scaled_dataset.csv", df)
```
## Gathering Data
To get started on the project before any scans of the actual debris are made available, I opted to
@@ -262,8 +291,8 @@ and surface body dimensions in a few milliseconds per part. The library can be found
[here.](https://gitlab.com/MisterBiggs/stl-process) The characteristic length is a value that is
heavily used by the NASA DebriSat project [@DebriSat2019] that is doing very similar work to this
project. The characteristic length takes the maximum orthogonal dimensions of a body, sums the
dimensions then divides by 3 to produce a single scalar value that can be used to get an idea of the
size of a 3D object.
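For example, a body whose maximum orthogonal dimensions are 2, 1, and 0.5 (in any consistent unit) has a characteristic length of $(2 + 1 + 0.5)/3 \approx 1.17$.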
![Current mesh processing pipeline](Figures/current_process.png)