mirror of
https://gitlab.com/orbital-debris-research/directed-study/report-1.git
synced 2025-06-15 14:36:46 +00:00
started report
This commit is contained in:
parent
7003f4acc6
commit
31725d6b49
3
.vscode/settings.json
vendored
Normal file
@@ -0,0 +1,3 @@
{
    "julia.environmentPath": ".\\prep",
}
38
README.md
@@ -1 +1,37 @@
# Fusion Properties
---
title: "Machine Learning Methods for Orbital Debris Characterization"
description: |
  A short description of the post.
author:
  - name: Anson Biggs
    url: https://ansonbiggs.com
date: 2022-02-13
output:
  distill::distill_article:
    self_contained: false
draft: true
---

To get started on the project before any scans of real debris are made available, I opted to find similar 3D models online and process them as if they were data collected by my team. GrabCAD is an excellent source of high-quality 3D models, and all of its models carry at worst a non-commercial license, making them suitable for this study. To start, I downloaded a high-quality model of a 6U CubeSat, which coincidentally enough was designed for the detection of orbital debris. The model consists of 48 individual parts, most of which are unique.


## Data Preparation

To begin analysis, key properties of each piece of the satellite needed to be gathered. The most accessible CAD software for me at the moment is Fusion 360, but almost any CAD software can report properties for a model. The only way to export the properties is to the clipboard, which required clicking on each individual part of the CubeSat's assembly. This task was easily automated with an AutoHotkey script that automatically pushes my computer's clipboard to a file. Below is an example of what each file looks like, truncated for brevity. The text file is generated in a way that makes it difficult to parse, so a separate piece of code was used to collect the data from all of the part files and turn it into a `.csv` file for easy import into MATLAB.

```txt
...
Physical
Mass 108.079 g
Volume 13768.029 mm^3
Density 0.008 g / mm^3
Area 19215.837 mm^2
World X,Y,Z 149.00 mm, 103.80 mm, 41.30 mm
Center of Mass 101.414 mm, 102.908 mm, -0.712 mm
...
```
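As an illustration of the conversion step, a dump like the one above can be pulled apart with a few regular expressions. This is only a sketch in Julia, not the project's actual conversion script; the field names come from the excerpt above, while the parsing rules are assumptions.

```julia
# Sketch: extract scalar fields from a Fusion 360 "Physical" properties dump.
# The field names match the excerpt above; the parsing logic is an assumption,
# not the exact script used for the project.
function parse_properties(txt::AbstractString)
    props = Dict{String,Float64}()
    for line in split(txt, '\n')
        # Keep only the scalar fields; skip section headers and vector rows
        m = match(r"^(Mass|Volume|Density|Area)\s+([-\d.]+)", strip(line))
        m === nothing && continue
        props[m.captures[1]] = parse(Float64, m.captures[2])
    end
    return props
end

sample = """
Physical
Mass 108.079 g
Volume 13768.029 mm^3
Density 0.008 g / mm^3
Area 19215.837 mm^2
"""

props = parse_properties(sample)
# props["Mass"] == 108.079, props["Area"] == 19215.837
```

From here, one `Dict` per part file can be written out as a row of the `.csv`.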
The full file of compiled part properties from Fusion 360 can be seen [here.](https://gitlab.com/orbital-debris-research/fusion-properties/-/blob/main/compiled.csv) This method gave 22 columns of data, but most of the columns are unsuitable for characterizing 3D geometry. It's important that the only properties considered are scalars that are independent of a model's orientation or position in space. Part of the data provided was a moment of inertia tensor, which was reduced to $I_x$, $I_y$, and $I_z$ and then used to compute $\bar{I}$. The bounding-box length, width, and height were used to compute the total volume that each object takes up. In the end, the only properties used in the analysis of the parts were mass, volume, density, area, bounding-box volume, and $\bar{I}$. Some parts also had to be removed as outliers, so the final dataset is 44 rows and 6 columns.

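For concreteness, the two derived quantities described above can be sketched in a few lines of Julia. The inertia tensor values here are invented for illustration, and treating $\bar{I}$ as the mean of the three principal moments is my reading of the text, not a confirmed definition; the bounding-box extents are the sample part's World X,Y,Z values.

```julia
using LinearAlgebra
using Statistics

# Example inertia tensor in g*mm^2 (values invented for illustration)
I_tensor = [120.0  -5.0   2.0;
             -5.0 150.0  -3.0;
              2.0  -3.0 100.0]

# Principal moments: eigenvalues of the symmetric inertia tensor
Ix, Iy, Iz = eigvals(Symmetric(I_tensor))

# Ibar taken as the mean of the principal moments (assumed definition)
Ibar = mean([Ix, Iy, Iz])

# Bounding-box volume from the overall extents (sample part's World X,Y,Z)
bb_length, bb_width, bb_height = 149.00, 103.80, 41.30  # mm
bb_volume = bb_length * bb_width * bb_height
```

Because the eigenvalue sum equals the trace, $\bar{I}$ defined this way is invariant under rotation, which is exactly the orientation-independence the analysis requires.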

BIN
assembly.jpg
Normal file
Binary file not shown.
After Width: | Height: | Size: 876 KiB |
File diff suppressed because it is too large
@@ -2,5 +2,4 @@
CSV = "336ed68f-0bac-5ca0-87d4-7b16caf5d00b"
DataFrames = "a93c6f00-e57d-5684-b7b6-d8193f3e46c0"
DataFramesMeta = "1313f7d8-7da2-5740-9ea0-a2ca25f37964"
Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
StatsPlots = "f3b207a7-027a-5e70-b257-86293d7955fd"
PlotlyJS = "f0f68f2c-4968-5e81-91da-67840de0976a"
31
prep/prep.jl
@@ -2,13 +2,14 @@ using DataFrames
using LinearAlgebra
using CSV
using DataFramesMeta
using Plots
using StatsPlots
# using Plots
# using StatsPlots
using PlotlyJS

begin
    df = CSV.read("compiled.csv", DataFrame)

    df.bb_volume = df.bb_length .* df.bb_width .* df.bb_height
    df.box = df.bb_length .* df.bb_width .* df.bb_height

    @eachrow! df begin
@@ -33,7 +34,7 @@ begin
    # Convert material to scalar
    begin
        kv = Dict(reverse.(enumerate(Set(df.material_name))))
        mats = []
        mats = [] |> Vector{Int}
        for material in df.material_name
            push!(mats, kv[material])
        end
@@ -41,20 +42,28 @@ begin
    end

    # Remove columns not needed for analysis
    df = df[!, [:mass, :volume, :density, :area, :bb_volume, :Ibar]]
    # df = df[!, [:mass, :volume, :density, :area, :bb_volume, :Ibar, :material_index]]

    # Remove outliers
    df = df[df.bb_volume.<1e6, :]
    df = df[df.box.<1e6, :]
    df = df[df.mass.<1000, :]
end

@df df cornerplot(cols(1:4), compact = true)
# @df df cornerplot(cols(1:7), compact = true)

# plot(df.mass)
# histogram(df.mass)
scatter(df.mass, df.volume)
# scatter(df.mass, df.Ibar)

features = [:mass, :volume, :density, :area, :box, :Ibar]

plot(df, dimensions = features, kind = "splom", Layout(title = "Raw Data"))

corner(df)

CSV.write("prepped.csv", df)

CSV.write("prepped.csv", df)
df = dataset(DataFrame, "iris")
features = [:sepal_width, :sepal_length, :petal_width, :petal_length]
plot(df, dimensions = features, color = :species, kind = "splom")
BIN
prepped.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 74 KiB |
BIN
process.mlx
Normal file
Binary file not shown.