Piv
e5b07fb766
Fix loss function
2021-06-16 21:52:31 +09:30
Michael Pivato
0547509689
Merge branch 'kitti_depth_dataset' into 'main'
Kitti depth dataset
See merge request vato007/fast-depth-tf!2
2021-04-22 12:13:48 +00:00
Michael Pivato
070aec6eed
Add Kitti depth dataset
Warning: Using this requires >175 GB of disk space (TensorFlow will also generate examples that take up additional space)
2021-04-22 12:13:48 +00:00
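For reference, KITTI depth maps are distributed as 16-bit PNGs where depth in metres is the stored value divided by 256. A minimal tf.data sketch of decoding an image/depth pair under that assumption follows; the path arguments and the commented pipeline are placeholders, not this repository's actual loader.

import tensorflow as tf

def load_kitti_pair(image_path, depth_path):
    # RGB input image, scaled to [0, 1].
    image = tf.io.decode_png(tf.io.read_file(image_path), channels=3)
    image = tf.image.convert_image_dtype(image, tf.float32)

    # KITTI depth PNGs are 16-bit; depth in metres = raw value / 256.
    depth = tf.io.decode_png(tf.io.read_file(depth_path), channels=1, dtype=tf.uint16)
    depth = tf.cast(depth, tf.float32) / 256.0
    return image, depth

# Hypothetical usage with matching lists of file paths:
# dataset = (tf.data.Dataset.from_tensor_slices((image_paths, depth_paths))
#            .map(load_kitti_pair, num_parallel_calls=tf.data.AUTOTUNE)
#            .batch(8))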
Piv
02d8cd5810
Remove half-features from dense_depth
2021-04-22 12:30:30 +09:30
Piv
acdb58396c
Remove usage of keras
2021-04-14 12:45:01 +09:30
Piv
cf7d2561ec
Implement details of the DenseDepth paper
2021-04-14 12:38:51 +09:30
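The DenseDepth paper defines its training loss as a weighted point-wise L1 term plus an image-gradient (edge) term and an SSIM term. A minimal TensorFlow sketch of that formulation is below; the lambda weight of 0.1 follows the paper, while the function name and the SSIM dynamic range (which depends on how depth is scaled) are assumptions, not necessarily this repository's implementation.

import tensorflow as tf

def dense_depth_loss(y_true, y_pred, max_depth_val=10.0, lam=0.1):
    # Point-wise L1 difference between predicted and ground-truth depth.
    l_depth = tf.reduce_mean(tf.abs(y_pred - y_true))

    # Edge term: L1 difference of image gradients penalises blurry boundaries.
    dy_true, dx_true = tf.image.image_gradients(y_true)
    dy_pred, dx_pred = tf.image.image_gradients(y_pred)
    l_edges = tf.reduce_mean(tf.abs(dy_pred - dy_true) + tf.abs(dx_pred - dx_true))

    # Structural similarity term, rescaled to [0, 1].
    l_ssim = tf.reduce_mean(
        tf.clip_by_value((1.0 - tf.image.ssim(y_true, y_pred, max_depth_val)) * 0.5, 0.0, 1.0))

    return lam * l_depth + l_edges + l_ssim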
Piv
f598005b73
Add basic coreml and mlkit conversion scripts
2021-03-29 19:08:55 +10:30
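For context, Core ML conversion usually goes through coremltools' unified converter, while ML Kit consumes a TensorFlow Lite model. A minimal sketch under those assumptions follows; the model path and output file names are placeholders, not the repository's actual scripts.

import tensorflow as tf
import coremltools as ct

# Load the trained Keras model (placeholder path).
model = tf.keras.models.load_model("saved_models/fast_depth")

# Core ML: coremltools can convert the Keras model directly.
mlmodel = ct.convert(model)
mlmodel.save("FastDepth.mlmodel")

# ML Kit runs TensorFlow Lite models, so also export a .tflite file.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("fast_depth.tflite", "wb") as f:
    f.write(converter.convert())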
Piv
f3fc0f8fbb
Add coreml conversion
2021-03-29 18:58:16 +10:30
Michael Pivato
f2a42cca4c
Merge branch 'dense-depth' into 'main'
Dense depth
See merge request vato007/fast-depth-tf!1
2021-03-29 07:31:42 +00:00
Piv
d88e9d3f12
Add dense-depth and experimental dense-net-nnconv5 models
Since dense-depth uses half-resolution labels by default, the NYU train/eval datasets can be loaded from here with labels at half resolution (see the sketch after this entry)
2021-03-29 17:59:18 +10:30
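A minimal sketch of loading NYU Depth v2 with labels at half the input resolution, assuming the tensorflow_datasets "nyu_depth_v2" build with "image" and "depth" features; the preprocessing function, sizes, and batch size are illustrative, not the repository's actual loader.

import tensorflow as tf
import tensorflow_datasets as tfds

def to_half_resolution_labels(sample, size=(480, 640)):
    # Full-resolution RGB input in [0, 1].
    image = tf.image.convert_image_dtype(sample["image"], tf.float32)
    image = tf.image.resize(image, size)

    # Depth label resized to half the input resolution.
    depth = tf.expand_dims(tf.cast(sample["depth"], tf.float32), axis=-1)
    depth = tf.image.resize(depth, (size[0] // 2, size[1] // 2))
    return image, depth

train = (tfds.load("nyu_depth_v2", split="train")
         .map(to_half_resolution_labels, num_parallel_calls=tf.data.AUTOTUNE)
         .batch(8))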
Piv
870429c3ef
Refactor fast-depth
Addresses the following:
- Rename nnconv5 block to nnconv5
- Add skip connections directly to nnconv5 block
- Allow custom metrics, loss, and optimizer to be passed to train, keeping defaults that reflect the original paper (see the sketch after this entry)
- Correctly use the NYU evaluation dataset only when no dataset is provided
2021-03-29 17:57:12 +10:30
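A minimal sketch of what a train entry point with overridable loss, optimizer, and metrics could look like; the defaults shown (an L1 loss and SGD with momentum, roughly matching the original FastDepth setup) and every name here are illustrative, not the repository's actual API.

import tensorflow as tf

def train(model, train_ds, epochs=20, loss=None, optimizer=None, metrics=None):
    # Fall back to paper-style defaults when nothing is passed in.
    loss = loss or tf.keras.losses.MeanAbsoluteError()  # L1 depth loss
    optimizer = optimizer or tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)
    metrics = metrics or [tf.keras.metrics.RootMeanSquaredError()]

    model.compile(optimizer=optimizer, loss=loss, metrics=metrics)
    return model.fit(train_ds, epochs=epochs)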
Piv
3325ea0c0c
Format to PEP 8, pass input shapes to MobileNet
2021-03-25 21:56:50 +10:30
Piv
9449ddef01
Add notebook, gitignore
2021-03-25 21:50:19 +10:30
Piv
78d5aace15
Remove Experimental model
It didn't perform any better than the regular model
Removing batch normalisation significantly harmed training performance
2021-03-25 21:28:07 +10:30
Piv
ab7da5acd4
Add documentation and README, use UpSampling2D rather than the image Resizing layer
2021-03-24 21:35:25 +10:30
Piv
ac3ab27ddd
Add model with no batch normalisation, use the actual MobileNet model rather than extracting layers
Following DenseDepth's approach: each layer in the encoder can be set to trainable, and the model's outputs plus the layers required for skip connections can be used directly (see the sketch after this entry). Ends up being much cleaner.
2021-03-24 20:00:26 +10:30
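A minimal sketch of the approach described in the entry above: instantiate tf.keras.applications.MobileNet directly, mark it trainable, and expose the intermediate activations needed for skip connections as additional outputs. The specific layer names are assumptions about Keras MobileNet's internal naming, not values taken from this repository.

import tensorflow as tf

def build_encoder(input_shape=(224, 224, 3), trainable=True):
    base = tf.keras.applications.MobileNet(
        input_shape=input_shape, include_top=False, weights="imagenet")
    base.trainable = trainable  # every encoder layer can be set trainable

    # Assumed names of the activations reused as skip connections.
    skip_names = ["conv_pw_1_relu", "conv_pw_3_relu", "conv_pw_5_relu", "conv_pw_11_relu"]
    outputs = [base.get_layer(name).output for name in skip_names] + [base.output]

    # The encoder's outputs (final features plus skips) can be consumed directly by the decoder.
    return tf.keras.Model(inputs=base.input, outputs=outputs)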
Piv
39074f22a7
Add working train and eval functions for nyu_v2
2021-03-21 09:51:45 +10:30
Piv
fea08521bb
Add metrics, prepare for training
2021-03-17 21:15:06 +10:30
Piv
00762f3e86
Build Functional Model, remove subclassed model
Functional models are way easier to work with,
and I don't need any advanced features that would
require model subclassing
2021-03-17 18:24:46 +10:30
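For context, the Functional API builds the model by wiring layer calls together into a graph instead of writing a Model subclass with a custom call(). A tiny illustrative example, not this repository's actual architecture:

import tensorflow as tf

# Functional API: declare the graph by calling layers on tensors.
inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(inputs)
x = tf.keras.layers.Conv2D(1, 3, padding="same")(x)
outputs = tf.keras.layers.UpSampling2D(size=2, interpolation="bilinear")(x)

model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.summary()  # the resulting graph can be inspected like any other Keras model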
Piv
b25b9be4eb
Initial Commit
2021-03-16 21:06:27 +10:30