mirror of https://gitlab.com/orbital-debris-research/directed-study/final-report.git
synced 2025-06-16 15:17:18 +00:00

finished

This commit is contained in:
parent 5ab0b14f16
commit e08d48e8e2
1 .vscode/ltex.dictionary.en-US.txt (vendored)

@@ -2,3 +2,4 @@ DebriSat
 GrabCAD
 dataframe
 CATIA
+stlProcess

2 .vscode/ltex.hiddenFalsePositives.en-US.txt (vendored)

@@ -3,3 +3,5 @@
 {"rule":"MORFOLOGIK_RULE_EN_US","sentence":"^\\QCalculation of Characteristic Length, @hillMeasurementSatelliteImpact slide 9\\E$"}
 {"rule":"MORFOLOGIK_RULE_EN_US","sentence":"^\\QCurrently, algorithms have been made that are capable of getting many key features from solid ^A mesh with a surface that is fully closed and has no holes in its geometry.\\E$"}
 {"rule":"MORFOLOGIK_RULE_EN_US","sentence":"^\\QThe summary is that using PCA determined that by far the most variance out of the current list of properties is captured by the principle moments of inertia.^Eigen Values of the inertia tensor.\\E$"}
+{"rule":"MORFOLOGIK_RULE_EN_US","sentence":"^\\QCurrently, algorithms have been made that are capable of getting many key features from solid^A mesh with a surface that is fully closed and has no holes in its geometry.\\E$"}
+{"rule":"MORFOLOGIK_RULE_EN_US","sentence":"^\\QProcessing the 3D scans produced by the scanner in the RPL^Rapid Prototyping Lab, STEM Building is no easy task and something I chose to avoid for now.\\E$"}

120 report.qmd

@@ -29,40 +29,40 @@ filters:

## Introduction

Orbital debris is a form of pollution that is growing at an exponential pace and puts future space
infrastructure at risk. Satellites are critical to military, commercial, and civil operations.
Unfortunately, the space that debris occupies is becoming increasingly crowded and dangerous,
potentially leading to a cascade event that could turn the orbit around Earth into an unusable
wasteland for decades unless proper mitigation is introduced. Existing models employed by NASA rely
on a dataset created from 2D images and are missing many crucial features required for correctly
modeling the space debris environment. This approach aims to use high-resolution 3D scanning to
fully capture the geometry of a piece of debris and allow a more advanced analysis of each piece.
Coupled with machine learning methods, the scans will allow advances to the current cutting edge.
Physical and photograph-based measurements are time-consuming, hard to replicate, and lack
precision. 3D scanning allows much more advanced and accurate analysis of each debris sample. Once
the characteristics of space debris are more thoroughly understood, we can begin mitigating the
creation and danger of future space debris by implementing improved satellite construction methods
and more advanced debris avoidance measures.

## Current Progress

This project aims to solve very difficult problems, and although great progress has been made there
is still plenty of work to be done. Currently, algorithms have been written that are capable of
extracting many key features from solid^[A mesh with a surface that is fully closed and has no
holes in its geometry.] models that are in the `stl` format. Processing the 3D scans produced by
the scanner in the RPL^[Rapid Prototyping Lab, STEM Building] is no easy task and something I chose
to avoid for now. I suspect that the best method to work with data from the scanner will be to
export a point cloud from the scanner software, then use those points to create a triangulated
mesh. The number of scans needed for machine learning to capture all possible debris geometry means
that the only realistic way to reach the required order of magnitude is for taking the scans to be
the only part of the workflow that is not fully automated by code. My plan this semester was to do
most of the mesh processing in CATIA, which proved to take far too long and was unable to scale.

## stlProcess Algorithm Breakdown

The algorithm for processing the 3D meshes is implemented in the Julia programming language.
Syntactically the language is very similar to Python and Matlab, but it was chosen because it is
nearly as performant as compiled languages like C, while still having tooling geared towards
engineers and scientists. The current code is quite robust and well tested, and for a given `stl`
file it produces the following data:

@@ -70,7 +70,7 @@ following data:

struct Properties
    # Volume of the mesh
    volume::Float64
    # Center of gravity, meshes are not always centered at [0,0,0]
    center_of_gravity::Vector{Float64}
    # Moment of inertia tensor
    inertia::Matrix{Float64}
@@ -100,11 +100,11 @@ algorithm is one line of code:

```{julia}
surface_area = sum(norm.(eachrow([x0 y0 z0] - [x1 y1 z1]) .× eachrow([x1 y1 z1] - [x2 y2 z2])) / 2)
```

The algorithm finds the area of the triangle made up by the 3 points, $\mathbf{ABC}$, of each face
in 3D space using Calculus 3 vector math. The area of a triangle is
$S=\dfrac{|\mathbf{AB}\times\mathbf{AC}|}2$, so the sum over all the triangles that make up the
mesh produces the surface area. This can be inaccurate if a mesh is missing triangles, has
triangles that overlap, or has internal geometry, none of which should be an issue for 3D scans.

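The same formula can be checked on a tiny hand-built example; `triangle_area` and the
two-triangle square below are illustrative, not part of stlProcess:

```julia
using LinearAlgebra

# Area of one triangle from its vertices: S = |AB × AC| / 2
triangle_area(a, b, c) = norm(cross(b - a, c - a)) / 2

# Two triangles tiling the unit square in the z = 0 plane
tris = [
    ([0.0, 0, 0], [1.0, 0, 0], [1.0, 1, 0]),
    ([0.0, 0, 0], [1.0, 1, 0], [0.0, 1, 0]),
]

surface_area = sum(triangle_area(t...) for t in tris)
# surface_area == 1.0, the area of the unit square
```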
### Characteristic Length

@@ -118,7 +118,7 @@ in various papers with slightly different definitions.

The current implementation is quite complicated and is the most involved algorithm used. The key
fact that makes this algorithm work is that the eigenvectors are orthonormal and oriented so that
they point in the directions of most mass. Assuming the mesh has uniform density, this means that
the eigenvectors point in the direction of the points furthest from the center of gravity.

@@ -129,15 +129,21 @@ eigenvectors and all the point vectors needs to be calculated as follows:

$$\theta_i = \arccos\left(\frac{\lambda \cdot P_i}{\| \lambda \| \| P_i \|}\right)$$

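A minimal sketch of this angle computation (the name `vec_angle` and the sample points are
illustrative, not the stlProcess API):

```julia
using LinearAlgebra

# Angle between an eigenvector λ and a point vector P; the ratio is clamped
# to [-1, 1] so floating-point round-off can't push acos out of its domain.
vec_angle(λ, p) = acos(clamp(dot(λ, p) / (norm(λ) * norm(p)), -1, 1))

λ = [0.0, 0, 1]                                  # eigenvector along z
points = [[0.0, 0, 2], [1.0, 0, 1], [1.0, 0, 0]] # points relative to the center of gravity

θ = [vec_angle(λ, p) for p in points]
# θ ≈ [0, π/4, π/2]
```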
For almost every mesh there won't be a point that has a direct intersection with the eigenvector,
i.e. where $\theta \approx 0, \pi$. To deal with this, the 3 points with the smallest angles can be
turned into a plane, then the intersection of the eigenvector and the plane can be calculated^[I
won't go into the depth of this math, but you can find a great explanation here [@plane_line]].
This process then needs to be repeated for the 3 largest angles to get two points between which a
distance can be calculated for use in the characteristic length. The characteristic length, $L_C$,
can then be calculated simply by taking the average of the distances found for each eigenvector.

$$L_c = \frac{d_1 + d_2 + d_3}{3}$$

where $d_i$ is the distance between the two intersection points found for eigenvector $i$.

This algorithm will run into an error if a mesh is flat or 2D, but it has error handling, so it
fails gracefully. Scans should never have this problem, so it likely isn't worth worrying about.
There is likely a much faster analytical method to find the intersection, but for now it is
performant enough.

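A sketch of the intersection step under the assumptions above (a ray from the center of gravity
along an eigenvector, and a plane through the 3 selected mesh points; the function name is
illustrative, and [@plane_line] covers the derivation):

```julia
using LinearAlgebra

# Intersection of the line p = p0 + t*dir with the plane through points a, b, c.
# Returns the intersection point, or `nothing` if the line is parallel to the plane.
function line_plane_intersection(p0, dir, a, b, c)
    n = cross(b - a, c - a)        # plane normal
    denom = dot(n, dir)
    abs(denom) < eps() && return nothing
    t = dot(n, a - p0) / denom     # solve n ⋅ (p0 + t*dir - a) = 0
    return p0 + t * dir
end

# Ray along the z-axis from the origin, plane z = 2
pt = line_plane_intersection([0.0, 0, 0], [0.0, 0, 1],
                             [0.0, 0, 2], [1.0, 0, 2], [0.0, 1, 2])
# pt == [0.0, 0.0, 2.0]
```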
### Solid Body Values
The current algorithm just takes all the points that make up the mesh, then finds the `minmax` for

@@ -154,15 +160,15 @@ scaling. Scaling data can be more of an art than a science, but it was determine

volume was optimal for many reasons, but mainly because the algorithm to calculate the volume is
one of the fastest. The `fast_volume` function is just a modified version of the main volume
function with all the extra calculations required for center of gravity and inertia removed. The
`find_scale` function uses the `fast_volume` function and a derivative-free root finding algorithm
to find a scaling factor that can be used to get a volume of 1 from a model. This process, while
iterative, works incredibly fast and with the current dataset only adds about 20% to the
computation time to calculate all the properties for a mesh.

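The real `fast_volume` and `find_scale` implementations are not shown in this diff, but the idea
can be sketched with a toy volume function and bisection, a simple derivative-free root finder.
(Since volume scales with the cube of the scale factor, $s = V^{-1/3}$ is also available
analytically as a sanity check.)

```julia
# Toy stand-in for fast_volume: a mesh of volume 8 scaled by s has volume 8s^3.
fast_volume(scale) = 8.0 * scale^3

# Derivative-free bisection: find s such that fast_volume(s) == target.
function find_scale(target=1.0; lo=1e-6, hi=1e3, tol=1e-12)
    f(s) = fast_volume(s) - target
    for _ in 1:200
        mid = (lo + hi) / 2
        f(mid) > 0 ? (hi = mid) : (lo = mid)  # volume grows with s, so keep the sign change
        hi - lo < tol && break
    end
    return (lo + hi) / 2
end

s = find_scale()
# s ≈ 8.0^(-1/3) == 0.5, so the scaled volume is 1
```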
## stlProcess Workflow

Using Julia to process the meshes and make a `csv` to import into Matlab is a relatively
straightforward process. The hardest part is getting up and running in Julia, but I'd recommend the
following resources:

- [Julia Getting Started Docs](https://docs.julialang.org/en/v1/manual/getting-started/)

@@ -171,7 +177,7 @@ the following resources:

### Imports

Unlike Matlab, which makes all installed toolboxes available by default, Julia packages have to be
brought into scope.

```{julia}
# contents elided between diff hunks (the imports include `using LinearAlgebra`)
```

@@ -193,7 +199,7 @@ using LinearAlgebra

### `loop` Setup

To set up for the loop that processes all the `stl` files, a dataframe has to be initialized, and
the path to the subfolders containing all the `stl` files needs to be defined.

```{julia}
#| output: false

@@ -205,40 +211,43 @@ folders = ["1_5U", "assembly1", "cubesat"]

# Columns for every piece of data you want to export
df = DataFrame(;
    volume=Float64[],
    surface_area=Float64[],
    characteristic_length=Float64[],
    bounding_box=Float64[],
    I1=Float64[],
    I2=Float64[],
    I3=Float64[],
)
```

### The `loop`

The actual loop is quite simple: it first iterates through all the subfolders in the main dataset
folder, then iterates through all the files in each subfolder. The `Properties` struct returned by
`get_mass_properties`, shown above, then needs to be processed and added to the dataframe.

```{julia}
#| output: false

for path in dataset_path * "\\" .* folders
    Threads.@threads for file in readdir(path)

        # load the stl
        stl = load(path * "\\" * file)

        # normalize the stl
        scale = find_scale(stl)
        props = get_mass_properties(stl; scale=scale)

        # principal moments of inertia
        I1, I2, I3 = eigvals(props.inertia)

        # solid body dimensions and bounding box volume
        sbx, sby, sbz = props.sb_values
        bounding_box = sbx * sby * sbz

        push!(
            df,
            [props.volume, props.surface_area, props.characteristic_length, bounding_box, I3, I2, I1],
        )
    end
end
```
@@ -246,7 +255,8 @@ end

### Output

A summary of the dataframe can then be generated after the dataset is created. Note that the
volumes are all `1.0`.

:::{.column-body-outset}

@@ -264,6 +274,16 @@ Finally, the dataset can be saved to a `csv` file for Matlab to import.

```{julia}
CSV.write("scaled_dataset.csv", df)
```

## Testing

The last key part of the Julia library is that it has tests written in the `test` directory. These
tests take simple shapes for which all the above properties can be calculated easily, and they
automatically make sure that new versions of the code still compute the values correctly.
Test-driven design is a powerful way to approach programming and is worth exploring, since it
increases productivity and, if done correctly, proves that the code is running correctly.

[Julia Testing Documentation](https://docs.julialang.org/en/v1/stdlib/Test/)
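For a flavor of what such a test looks like (an illustrative example, not the library's actual
test file), one can check the divergence-theorem volume of a unit tetrahedron, the same style of
computation the library's volume code relies on:

```julia
using Test, LinearAlgebra

# Signed volume contribution of one triangular face (divergence theorem);
# summing over a closed, consistently oriented mesh gives the enclosed volume.
signed_volume(a, b, c) = dot(a, cross(b, c)) / 6

# Faces of the tetrahedron with vertices O, X, Y, Z (outward orientation)
o, x, y, z = [0.0, 0, 0], [1.0, 0, 0], [0.0, 1, 0], [0.0, 0, 1]
faces = [(o, z, y), (o, x, z), (o, y, x), (x, y, z)]

@testset "tetrahedron volume" begin
    # The unit right tetrahedron has volume 1/6
    @test sum(signed_volume(f...) for f in faces) ≈ 1 / 6
end
```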
---
## Machine Learning