mirror of
https://gitlab.com/orbital-debris-research/directed-study/final-report.git
synced 2025-06-15 14:46:45 +00:00
lotta text added
This commit is contained in:
parent
684e980455
commit
1b9d9e5352
1
.vscode/ltex.dictionary.en-US.txt
vendored
Normal file
@@ -0,0 +1 @@
DebriSat
2
.vscode/ltex.hiddenFalsePositives.en-US.txt
vendored
@@ -1 +1,3 @@
{"rule":"UPPERCASE_SENTENCE_START","sentence":"^\\Qmodels that are in the \\E(?:Dummy|Ina|Jimmy-)[0-9]+\\Q format.\\E$"}
{"rule":"MORFOLOGIK_RULE_EN_US","sentence":"^\\QWikipedia gives a different definition^[Wikipedia: Characteristic Length] that defines characteristic length as the volume divided by the surface area.\\E$"}
{"rule":"MORFOLOGIK_RULE_EN_US","sentence":"^\\QCalculation of Characteristic Length, @hillMeasurementSatelliteImpact slide 9\\E$"}
BIN
Figures/solid_body.png
Normal file
Binary file not shown. (112 KiB)
@@ -24,3 +24,19 @@
  url = {https://grabcad.com/library/6u-cubesat-model-1},
  urldate = {2022-02-15}
}

@article{hillMeasurementSatelliteImpact,
  title = {Measurement of Satellite Impact Test Fragments for Modeling Orbital Debris},
  author = {Hill, Nicole M.},
  url = {https://ntrs.nasa.gov/api/citations/20090019123/downloads/20090019123.pdf},
  urldate = {2009}
}

@misc{plane_line,
  title = {determine where a vector will intersect a plane},
  author = {David Mitra (https://math.stackexchange.com/users/18986/david-mitra)},
  howpublished = {Mathematics Stack Exchange},
  note = {URL:https://math.stackexchange.com/q/100447 (version: 2012-01-19)},
  eprint = {https://math.stackexchange.com/q/100447},
  url = {https://math.stackexchange.com/q/100447}
}
89
report.qmd
@@ -1,8 +1,13 @@
---
title: "Characterization of Space Debris using Machine Learning Methods"
subtitle: "Advanced processing of 3D meshes using Julia, and data science in Matlab."
author: Anson Biggs
author:
  - name: Anson Biggs
    url: https://ansonbiggs.com

date: "4/30/2022"
thanks: Mehran Andalibi

latex-auto-install: true
format:
  html:
@@ -19,6 +24,7 @@ execute:

filters:
  - filter.lua
# theme: litera
---

## Introduction
@@ -98,17 +104,66 @@ The algorithm finds the area of a triangle made up of 3 points in 3D space using
vector math. The area of a triangle is $S=\dfrac{|\mathbf{AB}\times\mathbf{AC}|}{2}$, so the sum of
all the triangles that make up the mesh produces the surface area. This can be inaccurate if a mesh
is missing triangles, has overlapping triangles, or has internal geometry, none of which should be
an issue for 3D scans.
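As a sketch, the triangle-area sum can be written in a few lines of Julia. This is illustrative rather than the stl-process implementation; `triangles` is assumed to be a collection of vertex triples, which is roughly what a mesh loaded with MeshIO provides.

```julia
using LinearAlgebra

# Sum of triangle areas: S = |AB × AC| / 2 for each face of the mesh.
function surface_area(triangles)
    area = 0.0
    for (A, B, C) in triangles
        area += norm(cross(B - A, C - A)) / 2
    end
    return area
end
```

A single right triangle with unit legs gives an area of `0.5`, which is a quick sanity check for the formula.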

### Characteristic Length

This property is heavily used by the DebriSat^[@DebriSat2019] program, so it is worth implementing,
especially since DebriSat is a promising partner for this project. It is unclear whether the
current implementation produces the same value that DebriSat uses. Wikipedia gives a different
definition^[[Wikipedia: Characteristic Length](https://en.wikipedia.org/wiki/Characteristic_length)]
that defines characteristic length as the volume divided by the surface area, and DebriSat's papers
mention the quantity with slightly different definitions.

The current implementation is quite complicated and is the most involved algorithm used. The key
fact that makes this algorithm work is that the eigenvectors are orthonormal and oriented so that
they point in the directions of most mass. Assuming the mesh has uniform density, this means the
eigenvectors point toward the points furthest from the center of gravity.

The algorithm begins by taking the eigenvectors $\lambda$ and making vectors from the center of
gravity to every point that makes up the mesh, $\bar{P_i}$. Then the angle, $\theta$, between the
eigenvectors and all the point vectors is calculated as follows:

$$\theta_i = \arccos\left(\frac{\lambda \cdot \bar{P_i}}{\| \lambda \| \| \bar{P_i} \|}\right)$$
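For a single eigenvector this angle computation could look like the following sketch, where `v`, `points`, and `cg` stand for the eigenvector, the mesh points, and the center of gravity. The names are illustrative, not the stl-process API.

```julia
using LinearAlgebra

# Angle between eigenvector v and each vector from the center of gravity cg
# to a mesh point. clamp guards against floating-point values just outside
# the [-1, 1] domain of acos.
function point_angles(v, points, cg)
    return [acos(clamp(dot(v, p - cg) / (norm(v) * norm(p - cg)), -1.0, 1.0))
            for p in points]
end
```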

For almost every mesh there won't be a point that intersects the eigenvector directly,
i.e. $\theta \approx 0$. To deal with this, the 3 points with the smallest angles are turned into a
plane, and the intersection of the eigenvector with that plane is calculated^[I won't go into
the details of this math, but you can find a great explanation here [@plane_line]]. This process is
then repeated for the 3 largest angles, giving two points on opposite sides of the mesh whose
separation can be measured. With one such distance per eigenvector, call them $A$, $B$, and $C$,
the characteristic length, $L_c$, is simply their average:

$$L_c = \frac{A + B + C}{3}$$
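The two geometric steps can be sketched as follows, assuming `cg` is the center of gravity, `dir` an eigenvector, and `p1` through `p3` the three chosen points. These names are illustrative; the line-plane intersection follows the construction in [@plane_line].

```julia
using LinearAlgebra

# Intersection of the ray x = cg + t*dir with the plane through p1, p2, p3.
function plane_intersection(cg, dir, p1, p2, p3)
    n = cross(p2 - p1, p3 - p1)        # plane normal
    t = dot(n, p1 - cg) / dot(n, dir)  # ray parameter at the intersection
    return cg + t * dir
end

# One distance per eigenvector; L_c is their mean.
characteristic_length(distances) = sum(distances) / length(distances)
```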

### Solid Body Values

The current algorithm takes all the points that make up the mesh, finds the `minmax` for each axis,
then averages the three extents as in the figure below. No consideration is taken for the
orientation of the model, but I believe that shouldn't matter. This is definitely an algorithm that
deserves a revisit, especially if DebriSat could be contacted for a rigorous definition of the
quantity.
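A sketch of that calculation, with illustrative names rather than the package's API, where `points` is any collection of 3D coordinates:

```julia
# Per-axis extent (max minus min) over all mesh points, then the mean of the
# three extents, as described above.
function solid_body_value(points)
    extents = [maximum(p[i] for p in points) - minimum(p[i] for p in points)
               for i in 1:3]
    return sum(extents) / 3
end
```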

![Calculation of Solid Body Values, @hillMeasurementSatelliteImpact [slide 9]](Figures/solid_body.png)

### `find_scale` and `fast_volume` Functions

```{julia}
#| echo: false
#| output: false
```

These functions are used together to scale the models for machine learning. When performing machine
learning on a dataset with properties of varying magnitudes, it is important to do some sort of
scaling. It was determined that scaling by volume was optimal for many reasons, but mainly because
the volume algorithm is one of the fastest. The `fast_volume` function is just a modified version
of the main volume function with all the extra calculations required for the center of gravity and
inertia removed. The `find_scale` function uses the `fast_volume` function and a derivative-based
root-finding algorithm to find a scaling factor that gives a model a volume of 1. This process,
while iterative, is very fast, and with the current dataset it only adds about 20% to the time
needed to compute all the properties of a mesh.
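A hedged sketch of how such a root finder could work. The real implementation lives in the stl-process package; here the argument `volume` stands in for `fast_volume` applied to the unscaled mesh, using the fact that volume grows as the cube of the scale factor `s`.

```julia
# Newton's method on f(s) = V*s^3 - 1, whose root is the scale factor that
# brings the model to unit volume. f'(s) = 3*V*s^2.
function find_scale(volume; s = 1.0, tol = 1e-12, maxiter = 50)
    for _ in 1:maxiter
        f = volume * s^3 - 1.0         # residual: scaled volume minus 1
        abs(f) < tol && break
        s -= f / (3.0 * volume * s^2)  # Newton step
    end
    return s
end
```

For a mesh of volume 8 this converges to a scale factor of 0.5 in a handful of iterations, consistent with the analytic answer $V^{-1/3}$.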

## stl-process Workflow

### Imports

```{julia}
using FileIO
using MeshIO

@@ -116,15 +171,14 @@ using stlProcess

using CSV
using DataFrames
using Plots
theme(:ggplot2)

using LinearAlgebra
using Statistics
```

### `loop` Setup

```{julia}
#| code-fold: true
#| output: false

# local path to https://gitlab.com/orbital-debris-research/fake-satellite-dataset
@@ -145,6 +199,8 @@ df = DataFrame(;
)
```

### The `loop`

```{julia}
#| output: false

@@ -167,6 +223,8 @@ for path in dataset_path * "\\" .* folders
end
```

### Output

:::{.column-body-outset}

```{julia}
@@ -177,23 +235,6 @@ describe(df)
```

:::

```{julia}
S = cov(Matrix(df))

eig_vals = eigvals(S);

# sorting eigenvalues from largest to smallest
sort_index = sortperm(eig_vals; rev=true)

lambda = eig_vals[sort_index]
names_sorted = names(df)[sort_index]

lambda_ratio = cumsum(lambda) ./ sum(lambda)

plot(lambda_ratio, marker=:x)
xticks!(sort_index, names(df), xrotation = 15)
```

## Gathering Data

To get started on the project before any scans of the actual debris are made available, I opted to