192:~/Sandbox/dd% docker run -p 8080:8080 --name dd -v /Users/gavin/Sandbox/dd:/opt/models beniz/deepdetect_cpu
DeepDetect [ commit 978401f3d1f23a327d0ebfef24cb0a0d7c543c6e ]
INFO - 09:30:47 - Running DeepDetect HTTP server on 0.0.0.0:8080
INFO - 09:30:53 - Initializing net from parameters:
INFO - 09:30:53 - Creating layer / name=vgg_face / type=MemoryData
INFO - 09:30:53 - Creating Layer vgg_face
INFO - 09:30:53 - vgg_face -> data
INFO - 09:30:53 - vgg_face -> label
INFO - 09:30:53 - Setting up vgg_face
INFO - 09:30:53 - Top shape: 10 3 224 224 (1505280)
INFO - 09:30:53 - Top shape: 10 (10)
INFO - 09:30:53 - Memory required for data: 6021160
INFO - 09:30:53 - Creating layer / name=conv1_1 / type=Convolution
INFO - 09:30:53 - Creating Layer conv1_1
INFO - 09:30:53 - conv1_1 <- data
INFO - 09:30:53 - conv1_1 -> conv1_1
INFO - 09:30:53 - Setting up conv1_1
INFO - 09:30:53 - Top shape: 10 64 224 224 (32112640)
INFO - 09:30:53 - Memory required for data: 134471720
INFO - 09:30:53 - Creating layer / name=relu1_1 / type=ReLU
INFO - 09:30:53 - Creating Layer relu1_1
INFO - 09:30:53 - relu1_1 <- conv1_1
INFO - 09:30:53 - relu1_1 -> conv1_1 (in-place)
INFO - 09:30:53 - Setting up relu1_1
INFO - 09:30:53 - Top shape: 10 64 224 224 (32112640)
INFO - 09:30:53 - Memory required for data: 262922280
INFO - 09:30:53 - Creating layer / name=conv1_2 / type=Convolution
INFO - 09:30:53 - Creating Layer conv1_2
INFO - 09:30:53 - conv1_2 <- conv1_1
INFO - 09:30:53 - conv1_2 -> conv1_2
INFO - 09:30:53 - Setting up conv1_2
INFO - 09:30:53 - Top shape: 10 64 224 224 (32112640)
INFO - 09:30:53 - Memory required for data: 391372840
INFO - 09:30:53 - Creating layer / name=relu1_2 / type=ReLU
INFO - 09:30:53 - Creating Layer relu1_2
INFO - 09:30:53 - relu1_2 <- conv1_2
INFO - 09:30:53 - relu1_2 -> conv1_2 (in-place)
INFO - 09:30:53 - Setting up relu1_2
INFO - 09:30:53 - Top shape: 10 64 224 224 (32112640)
INFO - 09:30:53 - Memory required for data: 519823400
INFO - 09:30:53 - Creating layer / name=pool1 / type=Pooling
INFO - 09:30:53 - Creating Layer pool1
INFO - 09:30:53 - pool1 <- conv1_2
INFO - 09:30:53 - pool1 -> pool1
INFO - 09:30:53 - Setting up pool1
INFO - 09:30:53 - Top shape: 10 64 112 112 (8028160)
INFO - 09:30:53 - Memory required for data: 551936040
INFO - 09:30:53 - Creating layer / name=conv2_1 / type=Convolution
INFO - 09:30:53 - Creating Layer conv2_1
INFO - 09:30:53 - conv2_1 <- pool1
INFO - 09:30:53 - conv2_1 -> conv2_1
INFO - 09:30:53 - Setting up conv2_1
INFO - 09:30:53 - Top shape: 10 128 112 112 (16056320)
INFO - 09:30:53 - Memory required for data: 616161320
INFO - 09:30:53 - Creating layer / name=relu2_1 / type=ReLU
INFO - 09:30:53 - Creating Layer relu2_1
INFO - 09:30:53 - relu2_1 <- conv2_1
INFO - 09:30:53 - relu2_1 -> conv2_1 (in-place)
INFO - 09:30:53 - Setting up relu2_1
INFO - 09:30:53 - Top shape: 10 128 112 112 (16056320)
INFO - 09:30:53 - Memory required for data: 680386600
INFO - 09:30:53 - Creating layer / name=conv2_2 / type=Convolution
INFO - 09:30:53 - Creating Layer conv2_2
INFO - 09:30:53 - conv2_2 <- conv2_1
INFO - 09:30:53 - conv2_2 -> conv2_2
INFO - 09:30:53 - Setting up conv2_2
INFO - 09:30:53 - Top shape: 10 128 112 112 (16056320)
INFO - 09:30:53 - Memory required for data: 744611880
INFO - 09:30:53 - Creating layer / name=relu2_2 / type=ReLU
INFO - 09:30:53 - Creating Layer relu2_2
INFO - 09:30:53 - relu2_2 <- conv2_2
INFO - 09:30:53 - relu2_2 -> conv2_2 (in-place)
INFO - 09:30:53 - Setting up relu2_2
INFO - 09:30:53 - Top shape: 10 128 112 112 (16056320)
INFO - 09:30:53 - Memory required for data: 808837160
INFO - 09:30:53 - Creating layer / name=pool2 / type=Pooling
INFO - 09:30:53 - Creating Layer pool2
INFO - 09:30:53 - pool2 <- conv2_2
INFO - 09:30:53 - pool2 -> pool2
INFO - 09:30:53 - Setting up pool2
INFO - 09:30:53 - Top shape: 10 128 56 56 (4014080)
INFO - 09:30:53 - Memory required for data: 824893480
INFO - 09:30:53 - Creating layer / name=conv3_1 / type=Convolution
INFO - 09:30:53 - Creating Layer conv3_1
INFO - 09:30:53 - conv3_1 <- pool2
INFO - 09:30:53 - conv3_1 -> conv3_1
INFO - 09:30:53 - Setting up conv3_1
INFO - 09:30:53 - Top shape: 10 256 56 56 (8028160)
INFO - 09:30:53 - Memory required for data: 857006120
INFO - 09:30:53 - Creating layer / name=relu3_1 / type=ReLU
INFO - 09:30:53 - Creating Layer relu3_1
INFO - 09:30:53 - relu3_1 <- conv3_1
INFO - 09:30:53 - relu3_1 -> conv3_1 (in-place)
INFO - 09:30:53 - Setting up relu3_1
INFO - 09:30:53 - Top shape: 10 256 56 56 (8028160)
INFO - 09:30:53 - Memory required for data: 889118760
INFO - 09:30:53 - Creating layer / name=conv3_2 / type=Convolution
INFO - 09:30:53 - Creating Layer conv3_2
INFO - 09:30:53 - conv3_2 <- conv3_1
INFO - 09:30:53 - conv3_2 -> conv3_2
INFO - 09:30:53 - Setting up conv3_2
INFO - 09:30:53 - Top shape: 10 256 56 56 (8028160)
INFO - 09:30:53 - Memory required for data: 921231400
INFO - 09:30:53 - Creating layer / name=relu3_2 / type=ReLU
INFO - 09:30:53 - Creating Layer relu3_2
INFO - 09:30:53 - relu3_2 <- conv3_2
INFO - 09:30:53 - relu3_2 -> conv3_2 (in-place)
INFO - 09:30:53 - Setting up relu3_2
INFO - 09:30:53 - Top shape: 10 256 56 56 (8028160)
INFO - 09:30:53 - Memory required for data: 953344040
INFO - 09:30:53 - Creating layer / name=conv3_3 / type=Convolution
INFO - 09:30:53 - Creating Layer conv3_3
INFO - 09:30:53 - conv3_3 <- conv3_2
INFO - 09:30:53 - conv3_3 -> conv3_3
INFO - 09:30:53 - Setting up conv3_3
INFO - 09:30:53 - Top shape: 10 256 56 56 (8028160)
INFO - 09:30:53 - Memory required for data: 985456680
INFO - 09:30:53 - Creating layer / name=relu3_3 / type=ReLU
INFO - 09:30:53 - Creating Layer relu3_3
INFO - 09:30:53 - relu3_3 <- conv3_3
INFO - 09:30:53 - relu3_3 -> conv3_3 (in-place)
INFO - 09:30:53 - Setting up relu3_3
INFO - 09:30:53 - Top shape: 10 256 56 56 (8028160)
INFO - 09:30:53 - Memory required for data: 1017569320
INFO - 09:30:53 - Creating layer / name=pool3 / type=Pooling
INFO - 09:30:53 - Creating Layer pool3
INFO - 09:30:53 - pool3 <- conv3_3
INFO - 09:30:53 - pool3 -> pool3
INFO - 09:30:53 - Setting up pool3
INFO - 09:30:53 - Top shape: 10 256 28 28 (2007040)
INFO - 09:30:53 - Memory required for data: 1025597480
INFO - 09:30:53 - Creating layer / name=conv4_1 / type=Convolution
INFO - 09:30:53 - Creating Layer conv4_1
INFO - 09:30:53 - conv4_1 <- pool3
INFO - 09:30:53 - conv4_1 -> conv4_1
INFO - 09:30:53 - Setting up conv4_1
INFO - 09:30:53 - Top shape: 10 512 28 28 (4014080)
INFO - 09:30:53 - Memory required for data: 1041653800
INFO - 09:30:53 - Creating layer / name=relu4_1 / type=ReLU
INFO - 09:30:53 - Creating Layer relu4_1
INFO - 09:30:53 - relu4_1 <- conv4_1
INFO - 09:30:53 - relu4_1 -> conv4_1 (in-place)
INFO - 09:30:53 - Setting up relu4_1
INFO - 09:30:53 - Top shape: 10 512 28 28 (4014080)
INFO - 09:30:53 - Memory required for data: 1057710120
INFO - 09:30:53 - Creating layer / name=conv4_2 / type=Convolution
INFO - 09:30:53 - Creating Layer conv4_2
INFO - 09:30:53 - conv4_2 <- conv4_1
INFO - 09:30:53 - conv4_2 -> conv4_2
INFO - 09:30:53 - Setting up conv4_2
INFO - 09:30:53 - Top shape: 10 512 28 28 (4014080)
INFO - 09:30:53 - Memory required for data: 1073766440
INFO - 09:30:53 - Creating layer / name=relu4_2 / type=ReLU
INFO - 09:30:53 - Creating Layer relu4_2
INFO - 09:30:53 - relu4_2 <- conv4_2
INFO - 09:30:53 - relu4_2 -> conv4_2 (in-place)
INFO - 09:30:53 - Setting up relu4_2
INFO - 09:30:53 - Top shape: 10 512 28 28 (4014080)
INFO - 09:30:53 - Memory required for data: 1089822760
INFO - 09:30:53 - Creating layer / name=conv4_3 / type=Convolution
INFO - 09:30:53 - Creating Layer conv4_3
INFO - 09:30:53 - conv4_3 <- conv4_2
INFO - 09:30:53 - conv4_3 -> conv4_3
INFO - 09:30:53 - Setting up conv4_3
INFO - 09:30:53 - Top shape: 10 512 28 28 (4014080)
INFO - 09:30:53 - Memory required for data: 1105879080
INFO - 09:30:53 - Creating layer / name=relu4_3 / type=ReLU
INFO - 09:30:53 - Creating Layer relu4_3
INFO - 09:30:53 - relu4_3 <- conv4_3
INFO - 09:30:53 - relu4_3 -> conv4_3 (in-place)
INFO - 09:30:53 - Setting up relu4_3
INFO - 09:30:53 - Top shape: 10 512 28 28 (4014080)
INFO - 09:30:53 - Memory required for data: 1121935400
INFO - 09:30:53 - Creating layer / name=pool4 / type=Pooling
INFO - 09:30:53 - Creating Layer pool4
INFO - 09:30:53 - pool4 <- conv4_3
INFO - 09:30:53 - pool4 -> pool4
INFO - 09:30:53 - Setting up pool4
INFO - 09:30:53 - Top shape: 10 512 14 14 (1003520)
INFO - 09:30:53 - Memory required for data: 1125949480
INFO - 09:30:53 - Creating layer / name=conv5_1 / type=Convolution
INFO - 09:30:53 - Creating Layer conv5_1
INFO - 09:30:53 - conv5_1 <- pool4
INFO - 09:30:53 - conv5_1 -> conv5_1
INFO - 09:30:53 - Setting up conv5_1
INFO - 09:30:53 - Top shape: 10 512 14 14 (1003520)
INFO - 09:30:53 - Memory required for data: 1129963560
INFO - 09:30:53 - Creating layer / name=relu5_1 / type=ReLU
INFO - 09:30:53 - Creating Layer relu5_1
INFO - 09:30:53 - relu5_1 <- conv5_1
INFO - 09:30:53 - relu5_1 -> conv5_1 (in-place)
INFO - 09:30:53 - Setting up relu5_1
INFO - 09:30:53 - Top shape: 10 512 14 14 (1003520)
INFO - 09:30:53 - Memory required for data: 1133977640
INFO - 09:30:53 - Creating layer / name=conv5_2 / type=Convolution
INFO - 09:30:53 - Creating Layer conv5_2
INFO - 09:30:53 - conv5_2 <- conv5_1
INFO - 09:30:53 - conv5_2 -> conv5_2
INFO - 09:30:53 - Setting up conv5_2
INFO - 09:30:53 - Top shape: 10 512 14 14 (1003520)
INFO - 09:30:53 - Memory required for data: 1137991720
INFO - 09:30:53 - Creating layer / name=relu5_2 / type=ReLU
INFO - 09:30:53 - Creating Layer relu5_2
INFO - 09:30:53 - relu5_2 <- conv5_2
INFO - 09:30:53 - relu5_2 -> conv5_2 (in-place)
INFO - 09:30:53 - Setting up relu5_2
INFO - 09:30:53 - Top shape: 10 512 14 14 (1003520)
INFO - 09:30:53 - Memory required for data: 1142005800
INFO - 09:30:53 - Creating layer / name=conv5_3 / type=Convolution
INFO - 09:30:53 - Creating Layer conv5_3
INFO - 09:30:53 - conv5_3 <- conv5_2
INFO - 09:30:53 - conv5_3 -> conv5_3
INFO - 09:30:53 - Setting up conv5_3
INFO - 09:30:53 - Top shape: 10 512 14 14 (1003520)
INFO - 09:30:53 - Memory required for data: 1146019880
INFO - 09:30:53 - Creating layer / name=relu5_3 / type=ReLU
INFO - 09:30:53 - Creating Layer relu5_3
INFO - 09:30:53 - relu5_3 <- conv5_3
INFO - 09:30:53 - relu5_3 -> conv5_3 (in-place)
INFO - 09:30:53 - Setting up relu5_3
INFO - 09:30:53 - Top shape: 10 512 14 14 (1003520)
INFO - 09:30:53 - Memory required for data: 1150033960
INFO - 09:30:53 - Creating layer / name=pool5 / type=Pooling
INFO - 09:30:53 - Creating Layer pool5
INFO - 09:30:53 - pool5 <- conv5_3
INFO - 09:30:53 - pool5 -> pool5
INFO - 09:30:53 - Setting up pool5
INFO - 09:30:53 - Top shape: 10 512 7 7 (250880)
INFO - 09:30:53 - Memory required for data: 1151037480
INFO - 09:30:53 - Creating layer / name=fc6 / type=InnerProduct
INFO - 09:30:53 - Creating Layer fc6
INFO - 09:30:53 - fc6 <- pool5
INFO - 09:30:53 - fc6 -> fc6
INFO - 09:30:53 - Setting up fc6
INFO - 09:30:53 - Top shape: 10 4096 (40960)
INFO - 09:30:53 - Memory required for data: 1151201320
INFO - 09:30:53 - Creating layer / name=relu6 / type=ReLU
INFO - 09:30:53 - Creating Layer relu6
INFO - 09:30:53 - relu6 <- fc6
INFO - 09:30:53 - relu6 -> fc6 (in-place)
INFO - 09:30:53 - Setting up relu6
INFO - 09:30:53 - Top shape: 10 4096 (40960)
INFO - 09:30:53 - Memory required for data: 1151365160
INFO - 09:30:53 - Creating layer / name=drop6 / type=Dropout
INFO - 09:30:53 - Creating Layer drop6
INFO - 09:30:53 - drop6 <- fc6
INFO - 09:30:53 - drop6 -> fc6 (in-place)
INFO - 09:30:53 - Setting up drop6
INFO - 09:30:53 - Top shape: 10 4096 (40960)
INFO - 09:30:53 - Memory required for data: 1151529000
INFO - 09:30:53 - Creating layer / name=fc7 / type=InnerProduct
INFO - 09:30:53 - Creating Layer fc7
INFO - 09:30:53 - fc7 <- fc6
INFO - 09:30:53 - fc7 -> fc7
INFO - 09:30:53 - Setting up fc7
INFO - 09:30:53 - Top shape: 10 4096 (40960)
INFO - 09:30:53 - Memory required for data: 1151692840
INFO - 09:30:53 - Creating layer / name=relu7 / type=ReLU
INFO - 09:30:53 - Creating Layer relu7
INFO - 09:30:53 - relu7 <- fc7
INFO - 09:30:53 - relu7 -> fc7 (in-place)
INFO - 09:30:53 - Setting up relu7
INFO - 09:30:53 - Top shape: 10 4096 (40960)
INFO - 09:30:53 - Memory required for data: 1151856680
INFO - 09:30:53 - Creating layer / name=drop7 / type=Dropout
INFO - 09:30:53 - Creating Layer drop7
INFO - 09:30:53 - drop7 <- fc7
INFO - 09:30:53 - drop7 -> fc7 (in-place)
INFO - 09:30:53 - Setting up drop7
INFO - 09:30:53 - Top shape: 10 4096 (40960)
INFO - 09:30:53 - Memory required for data: 1152020520
INFO - 09:30:53 - Creating layer / name=fc8 / type=InnerProduct
INFO - 09:30:53 - Creating Layer fc8
INFO - 09:30:53 - fc8 <- fc7
INFO - 09:30:53 - fc8 -> fc8
INFO - 09:30:53 - Setting up fc8
INFO - 09:30:53 - Top shape: 10 2622 (26220)
INFO - 09:30:53 - Memory required for data: 1152125400
INFO - 09:30:53 - Creating layer / name=prob / type=Softmax
INFO - 09:30:53 - Creating Layer prob
INFO - 09:30:53 - prob <- fc8
INFO - 09:30:53 - prob -> prob
INFO - 09:30:53 - Setting up prob
INFO - 09:30:53 - Top shape: 10 2622 (26220)
INFO - 09:30:53 - Memory required for data: 1152230280
INFO - 09:30:53 - prob does not need backward computation.
INFO - 09:30:53 - fc8 does not need backward computation.
INFO - 09:30:53 - drop7 does not need backward computation.
INFO - 09:30:53 - relu7 does not need backward computation.
INFO - 09:30:53 - fc7 does not need backward computation.
INFO - 09:30:53 - drop6 does not need backward computation.
INFO - 09:30:53 - relu6 does not need backward computation.
INFO - 09:30:53 - fc6 does not need backward computation.
INFO - 09:30:53 - pool5 does not need backward computation.
INFO - 09:30:53 - relu5_3 does not need backward computation.
INFO - 09:30:53 - conv5_3 does not need backward computation.
INFO - 09:30:53 - relu5_2 does not need backward computation.
INFO - 09:30:53 - conv5_2 does not need backward computation.
INFO - 09:30:53 - relu5_1 does not need backward computation.
INFO - 09:30:53 - conv5_1 does not need backward computation.
INFO - 09:30:53 - pool4 does not need backward computation.
INFO - 09:30:53 - relu4_3 does not need backward computation.
INFO - 09:30:53 - conv4_3 does not need backward computation.
INFO - 09:30:53 - relu4_2 does not need backward computation.
INFO - 09:30:53 - conv4_2 does not need backward computation.
INFO - 09:30:53 - relu4_1 does not need backward computation.
INFO - 09:30:53 - conv4_1 does not need backward computation.
INFO - 09:30:53 - pool3 does not need backward computation.
INFO - 09:30:53 - relu3_3 does not need backward computation.
INFO - 09:30:53 - conv3_3 does not need backward computation.
INFO - 09:30:53 - relu3_2 does not need backward computation.
INFO - 09:30:53 - conv3_2 does not need backward computation.
INFO - 09:30:53 - relu3_1 does not need backward computation.
INFO - 09:30:53 - conv3_1 does not need backward computation.
INFO - 09:30:53 - pool2 does not need backward computation.
INFO - 09:30:53 - relu2_2 does not need backward computation.
INFO - 09:30:53 - conv2_2 does not need backward computation.
INFO - 09:30:53 - relu2_1 does not need backward computation.
INFO - 09:30:53 - conv2_1 does not need backward computation.
INFO - 09:30:53 - pool1 does not need backward computation.
INFO - 09:30:53 - relu1_2 does not need backward computation.
INFO - 09:30:53 - conv1_2 does not need backward computation.
INFO - 09:30:53 - relu1_1 does not need backward computation.
INFO - 09:30:53 - conv1_1 does not need backward computation.
INFO - 09:30:53 - vgg_face does not need backward computation.
INFO - 09:30:53 - This network produces output label
INFO - 09:30:53 - This network produces output prob
INFO - 09:30:53 - Network initialization done.[09:30:53] /opt/deepdetect/src/caffelib.cc:408: Using pre-trained weights from /opt/models/vgg_face/VGG_FACE.caffemodel
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 580013788
[09:31:02] /opt/deepdetect/src/caffelib.cc:1932: Net total flops=15476908032 / total params=144987840
INFO - 09:31:02 - Sun Nov 5 09:31:02 2017 UTC - 192.168.99.1 "PUT /services/face" 201 8613
INFO - 09:31:16 - A total of 3 images.
ERROR - 09:31:16 - service face training call failed
ERROR - 09:31:16 - Sun Nov 5 09:31:16 2017 UTC - 192.168.99.1 "POST /train" 500 14
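The client-side calls behind the "PUT /services/face" and "POST /train" entries above are reproduced below. A quick way to confirm the server is reachable before replaying them (a standard DeepDetect endpoint, not part of the original capture) would be:

curl -X GET "http://192.168.99.100:8080/info"   # lists the server version and any registered services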
mbp:~/Sandbox/dd/vgg_face% curl -X PUT "http://192.168.99.100:8080/services/face" -d '
{
  "mllib":"caffe",
  "description":"face recognition service",
  "type":"supervised",
  "parameters":{
    "input":{
      "connector":"image",
      "width":224,
      "height":224
    },
    "mllib":{
      "nclasses":2
    }
  },
  "model":{
    "templates":"../templates/caffe/",
    "repository":"/opt/models/vgg_face"
  }
}'
{"status":{"code":201,"msg":"Created"}}
mbp:~/Sandbox/dd/vgg_face% curl -X POST "http://192.168.99.100:8080/train" -d '{
  "service":"face",
  "async":false,
  "parameters":{
    "mllib":{
      "gpu":false,
      "net":{
        "batch_size":32
      },
      "solver":{
        "test_interval":500,
        "iterations":30000,
        "base_lr":0.001,
        "stepsize":1000,
        "gamma":0.9
      }
    },
    "input":{
      "connector":"image",
      "test_split":0.1,
      "shuffle":true,
      "width":224,
      "height":224
    },
    "output":{
      "measure":["acc","mcll","f1"]
    }
  },
  "data":["/opt/models/train"]
}'
{"status":{"code":500,"msg":"InternalError","dd_code":1007,"dd_msg":"./include/caffe/util/db_lmdb.hpp:15 / Check failed (custom): (mdb_status) == (0)"}}%
mbp:~/Sandbox/dd/vgg_face% lsa
total 554M
-rw-r--r-- 1 gavin 554M Nov 5 09:24 VGG_FACE.caffemodel
-rw-r--r-- 1 gavin 4.8K Nov 5 09:25 VGG_FACE_deploy.prototxt
-rw-r--r-- 1 gavin 295 Nov 5 09:25 VGG_FACE_solver.prototxt
-rw-r--r-- 1 gavin 6.5K Nov 5 09:25 train_val.prototxt
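Since training writes its train.lmdb/test.lmdb and solver snapshots next to these files, it is also worth confirming that the repository is writable from inside the container. A hypothetical check, not part of the original session (container name "dd" comes from the docker run above):

docker exec dd touch /opt/models/vgg_face/.write_test && echo writable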