Piv
28b11aaa44
Use image type for CoreML input, reshape last layer to work with Metal shader
2021-09-10 22:13:41 +09:30
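A hedged sketch of what this conversion step might look like with coremltools — the model path, input shape, and scale below are assumptions, not taken from the repo:

```python
import coremltools as ct
import tensorflow as tf

# Hypothetical path to the trained Keras model.
model = tf.keras.models.load_model("fastdepth.h5")

# Declaring the input as an image lets the Core ML model consume
# CVPixelBuffers directly instead of MLMultiArrays, which plays well
# with a Metal pipeline on the Swift side.
mlmodel = ct.convert(
    model,
    inputs=[ct.ImageType(shape=(1, 224, 224, 3), scale=1.0 / 255.0)],
)
mlmodel.save("FastDepth.mlmodel")
```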
Piv
3254eef4bf
Add compilation of packnet model, refactor modules to avoid duplicating loaders and trainers
2021-07-23 22:41:46 +09:30
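Compiling a Keras model is a one-time call that a shared trainer module can then drive for every architecture; a minimal sketch, assuming a stand-in builder (the builder, loss, and optimizer here are illustrative, not the repo's):

```python
import tensorflow as tf

def build_packnet(input_shape=(192, 640, 3)):
    """Stand-in for the repo's packnet builder (hypothetical signature)."""
    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    depth = tf.keras.layers.Conv2D(1, 3, padding="same")(x)
    return tf.keras.Model(inputs, depth)

model = build_packnet()
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss=tf.keras.losses.MeanSquaredError(),
)
```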
Piv
38e7ad069e
Refactor load/util, start fixing packnet to support NHWC format
2021-07-19 12:32:56 +09:30
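The PackNet reference implementation is channels-first (NCHW), while TensorFlow defaults to channels-last (NHWC); moving between the two layouts is a single transpose. A generic illustration, not repo code:

```python
import tensorflow as tf

x_nchw = tf.random.normal((1, 3, 192, 640))       # batch, channels, height, width
x_nhwc = tf.transpose(x_nchw, perm=[0, 2, 3, 1])  # batch, height, width, channels
print(x_nhwc.shape)                               # (1, 192, 640, 3)
```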
Michael Pivato
070aec6eed
Add KITTI depth dataset
Warning: using this requires >175 GB of disk space (TensorFlow will also generate examples that take up additional space)
2021-04-22 12:13:48 +00:00
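The disk-space warning reads like a TFDS-style builder, which downloads the raw archives and then writes prepared examples as a second copy. A hedged sketch — the builder name and feature keys below are hypothetical, chosen only to show the shape of the API:

```python
import tensorflow_datasets as tfds

# "kitti_depth" is a hypothetical builder name standing in for the repo's
# dataset. Both the raw download and the generated examples persist on
# disk, which is where the >175 GB footprint comes from.
ds = tfds.load("kitti_depth", split="train", shuffle_files=True)
for example in ds.take(1):
    image, depth = example["image"], example["depth"]  # hypothetical keys
```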
Piv
870429c3ef
Refactor fast-depth
Addresses the following:
- Rename nnconv5 block to nnconv5
- Add skip connections directly to nnconv5 block
- Allow passing custom metrics, loss, and optimizer to train, keeping defaults that reflect the original paper (see the sketch below)
- Correctly use the NYU evaluation dataset only when no dataset is provided
2021-03-29 17:57:12 +10:30
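A sketch of what such a train function might look like, with None arguments falling back to paper-style defaults — the exact defaults and signature are assumptions, not the repo's code:

```python
import tensorflow as tf

def train(model, dataset, epochs=20, loss=None, optimizer=None, metrics=None):
    """Use caller-supplied loss/optimizer/metrics, defaulting to choices
    in the spirit of the fast-depth paper (L1 loss, SGD with momentum)."""
    loss = loss or tf.keras.losses.MeanAbsoluteError()
    optimizer = optimizer or tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)
    metrics = metrics or [tf.keras.metrics.RootMeanSquaredError()]
    model.compile(optimizer=optimizer, loss=loss, metrics=metrics)
    return model.fit(dataset, epochs=epochs)
```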
Piv
3325ea0c0c
Format to PEP 8, pass input shape to MobileNet
2021-03-25 21:56:50 +10:30
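For context, Keras takes the expected input shape at construction time, which is presumably what passing the shape to MobileNet amounts to — a generic illustration:

```python
import tensorflow as tf

# include_top=False drops the classifier head so the network can act
# as an encoder; the input shape is fixed up front.
encoder = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3),
    include_top=False,
    weights="imagenet",
)
```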
Piv
78d5aace15
Remove Experimental model
It didn't perform any better than the regular model, and removing batch normalisation significantly harmed training performance.
2021-03-25 21:28:07 +10:30
Piv
ab7da5acd4
Add documentation and README, use UpSampling2D rather than the image Resizing layer
2021-03-24 21:35:25 +10:30
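The swap in generic terms: UpSampling2D scales by a fixed integer factor, whereas the Resizing layer (under tf.keras.layers.experimental.preprocessing in TF of this vintage; plain tf.keras.layers.Resizing in newer releases) targets an explicit output size and is aimed at input preprocessing. Shapes below are assumptions:

```python
import tensorflow as tf

x = tf.keras.Input(shape=(7, 7, 1024))

# Before: resize to an explicit target size.
resized = tf.keras.layers.experimental.preprocessing.Resizing(14, 14)(x)

# After: upsample by a fixed factor with bilinear interpolation.
upsampled = tf.keras.layers.UpSampling2D(size=2, interpolation="bilinear")(x)
```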
Piv
ac3ab27ddd
Add model with no batch normalisation, use the actual MobileNet model rather than extracting layers
Found from DenseDepth: each layer in the encoder can be set to trainable, then the outputs of the model and the layers required for skip connections can be used directly. Ends up being much cleaner.
2021-03-24 20:00:26 +10:30
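A sketch of that approach — keep the stock MobileNet intact, toggle trainability per layer, and tap intermediate activations for the decoder's skip connections. The layer names below are illustrative picks; inspect base.summary() for the real ones:

```python
import tensorflow as tf

base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet"
)

# Each layer of the intact model can be toggled trainable individually.
for layer in base.layers:
    layer.trainable = True

# Tap intermediate activations by name for the decoder's skip connections.
skip_names = ["conv_pw_1_relu", "conv_pw_3_relu", "conv_pw_5_relu"]
skips = [base.get_layer(name).output for name in skip_names]

encoder = tf.keras.Model(inputs=base.input, outputs=[base.output, *skips])
```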
Piv
39074f22a7
Add working train and eval functions for nyu_v2
2021-03-21 09:51:45 +10:30
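TFDS ships an nyu_depth_v2 builder, so the wiring may look roughly like this — the split names are real TFDS splits, but the batch size and preprocessing are assumptions:

```python
import tensorflow as tf
import tensorflow_datasets as tfds

def to_pair(ex):
    image = tf.image.convert_image_dtype(ex["image"], tf.float32)
    depth = tf.expand_dims(tf.cast(ex["depth"], tf.float32), axis=-1)
    return image, depth

train_ds = tfds.load("nyu_depth_v2", split="train").map(to_pair).batch(8)
val_ds = tfds.load("nyu_depth_v2", split="validation").map(to_pair).batch(8)

# With a compiled model (see the earlier sketches):
# model.fit(train_ds, validation_data=val_ds, epochs=20)
# model.evaluate(val_ds)
```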
Piv
fea08521bb
Add metrics, prepare for training
2021-03-17 21:15:06 +10:30
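Monocular-depth work typically reports RMSE and the δ threshold accuracies; a generic implementation of those metrics (not the repo's exact code, and it assumes valid nonzero depths):

```python
import tensorflow as tf

def delta_accuracy(y_true, y_pred, threshold=1.25):
    """Fraction of pixels where max(pred/true, true/pred) < threshold."""
    ratio = tf.maximum(y_true / y_pred, y_pred / y_true)
    return tf.reduce_mean(tf.cast(ratio < threshold, tf.float32))

def rmse(y_true, y_pred):
    return tf.sqrt(tf.reduce_mean(tf.square(y_true - y_pred)))
```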
Piv
00762f3e86
Build Functional Model, remove subclassed model
Functional models are way easier to work with,
and I don't need any advanced features that would
require model subclassing
2021-03-17 18:24:46 +10:30
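The difference in a nutshell, as a generic example — the functional API wires layers by calling them on tensors, with no class boilerplate, and the resulting model is inspectable and serializable out of the box:

```python
import tensorflow as tf

# Functional style: layers are applied to symbolic tensors,
# so the graph is explicit and easy to slice or plot.
inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
outputs = tf.keras.layers.Conv2D(1, 3, padding="same")(x)
model = tf.keras.Model(inputs, outputs)

model.summary()
```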
Piv
b25b9be4eb
Initial Commit
2021-03-16 21:06:27 +10:30