C:\\AppData\Local\JetBrains\PyCharm Community Edition 2020.3\plugins\python-ce\helpers\pydev\pydevconsole.py" --mode=client --port=51450
import sys; print('Python %s on %s' % (sys.version, sys.platform))
sys.path.extend(['............PycharmProjects/cvnn'])
Python 3.8.5 (default, Sep 3 2020, 21:29:08) [MSC v.1916 64 bit (AMD64)]
Type 'copyright', 'credits' or 'license' for more information
IPython 7.19.0 -- An enhanced Interactive Python. Type '?' for help.
PyDev console: using IPython 7.19.0
Python 3.8.5 (default, Sep 3 2020, 21:29:08) [MSC v.1916 64 bit (AMD64)] on win32
In[2]: runfile('/PycharmProjects/cvnn/examples/MyTestComplex.py', wdir='C:armProjects/cvnn/examples')
2021-02-15 12:31:07.998340: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'cudart64_101.dll'; dlerror: cudart64_101.dll not found
2021-02-15 12:31:07.998928: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
2021-02-15 12:31:18.991559: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'nvcuda.dll'; dlerror: nvcuda.dll not found
2021-02-15 12:31:18.992101: W tensorflow/stream_executor/cuda/cuda_driver.cc:312] failed call to cuInit: UNKNOWN ERROR (303)
2021-02-15 12:31:18.997768: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:169] retrieving CUDA diagnostic information for host: ip2979
2021-02-15 12:31:18.998568: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:176] hostname: ip2979
2021-02-15 12:31:18.999505: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2021-02-15 12:31:19.010674: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1b6d6ed38c0 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2021-02-15 12:31:19.011428: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
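
The warnings above only mean that no CUDA runtime or driver was found on this machine, so TensorFlow falls back to the CPU (using the oneDNN/AVX2 path noted in the log). As a quick sanity check, not part of the original script, the devices TensorFlow can see could be listed like this:

import tensorflow as tf

# On this machine the GPU list would be empty, which is consistent with the
# cudart64_101.dll / nvcuda.dll warnings above, so training runs on the CPU.
print("GPUs:", tf.config.list_physical_devices('GPU'))
print("CPUs:", tf.config.list_physical_devices('CPU'))
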
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
complex_conv2d (ComplexConv2 (None, 30, 30, 32)        1792
_________________________________________________________________
complex_avg_pooling2d (Compl (None, 15, 15, 32)        0
_________________________________________________________________
complex_conv2d_1 (ComplexCon (None, 13, 13, 64)        36992
_________________________________________________________________
complex_max_pooling2d (Compl (None, 6, 6, 64)          0
_________________________________________________________________
complex_conv2d_2 (ComplexCon (None, 4, 4, 64)          73856
_________________________________________________________________
complex_flatten (ComplexFlat (None, 1024)              0
_________________________________________________________________
complex_dense (ComplexDense) (None, 64)                131200
_________________________________________________________________
complex_dense_1 (ComplexDens (None, 10)                1300
=================================================================
Total params: 245,140
Trainable params: 245,140
Non-trainable params: 0
_________________________________________________________________
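
The summary is a complex-valued counterpart of the classic CIFAR-10 CNN: every layer stores a real and an imaginary kernel, so each parameter count is exactly twice that of the real-valued version (first conv: 2 x (3*3*3*32 + 32) = 1792; first dense layer: 2 x (1024*64 + 64) = 131200). A minimal sketch that reproduces this summary with the cvnn package's Keras-style layers follows; it is an assumed reconstruction, not the gist author's MyTestComplex.py, and the activation choices are taken from the cvnn documentation since the log does not show them.

# Assumed reconstruction of a model matching the summary above, using cvnn.
import tensorflow as tf
from cvnn import layers as complex_layers

model = tf.keras.models.Sequential()
model.add(complex_layers.ComplexInput(input_shape=(32, 32, 3)))                    # CIFAR-10-sized images
model.add(complex_layers.ComplexConv2D(32, (3, 3), activation='cart_relu'))        # 2*(3*3*3*32+32) = 1792 params
model.add(complex_layers.ComplexAvgPooling2D((2, 2)))
model.add(complex_layers.ComplexConv2D(64, (3, 3), activation='cart_relu'))        # 36992 params
model.add(complex_layers.ComplexMaxPooling2D((2, 2)))
model.add(complex_layers.ComplexConv2D(64, (3, 3), activation='cart_relu'))        # 73856 params
model.add(complex_layers.ComplexFlatten())
model.add(complex_layers.ComplexDense(64, activation='cart_relu'))                 # 131200 params
model.add(complex_layers.ComplexDense(10, activation='convert_to_real_with_abs'))  # real-valued 10-class output
model.summary()  # prints a "sequential" summary like the one above
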
Epoch 1/50
1563/1563 [==============================] - 143s 91ms/step - loss: 1.5110 - accuracy: 0.4488 - val_loss: 1.2889 - val_accuracy: 0.5366
Epoch 2/50
1563/1563 [==============================] - 140s 89ms/step - loss: 1.1109 - accuracy: 0.6088 - val_loss: 1.0252 - val_accuracy: 0.6405
Epoch 3/50
1563/1563 [==============================] - 140s 89ms/step - loss: 0.9422 - accuracy: 0.6695 - val_loss: 0.9887 - val_accuracy: 0.6490
Epoch 4/50
1563/1563 [==============================] - 140s 90ms/step - loss: 0.8216 - accuracy: 0.7118 - val_loss: 0.9612 - val_accuracy: 0.6705
Epoch 5/50
1563/1563 [==============================] - 140s 90ms/step - loss: 0.7277 - accuracy: 0.7441 - val_loss: 0.8866 - val_accuracy: 0.6936
Epoch 6/50
1563/1563 [==============================] - 140s 90ms/step - loss: 0.6409 - accuracy: 0.7767 - val_loss: 0.9238 - val_accuracy: 0.6909
Epoch 7/50
1563/1563 [==============================] - 140s 90ms/step - loss: 0.5603 - accuracy: 0.8029 - val_loss: 0.9498 - val_accuracy: 0.6981
Epoch 8/50
1563/1563 [==============================] - 144s 92ms/step - loss: 0.4865 - accuracy: 0.8275 - val_loss: 1.0241 - val_accuracy: 0.6928
Epoch 9/50
1563/1563 [==============================] - 141s 90ms/step - loss: 0.4191 - accuracy: 0.8506 - val_loss: 1.0685 - val_accuracy: 0.6888
Epoch 10/50
1563/1563 [==============================] - 141s 91ms/step - loss: 0.3565 - accuracy: 0.8714 - val_loss: 1.2009 - val_accuracy: 0.6870
Epoch 11/50
1563/1563 [==============================] - 142s 91ms/step - loss: 0.3047 - accuracy: 0.8907 - val_loss: 1.2294 - val_accuracy: 0.6857
Epoch 12/50
1563/1563 [==============================] - 141s 90ms/step - loss: 0.2645 - accuracy: 0.9049 - val_loss: 1.3471 - val_accuracy: 0.6866
Epoch 13/50
1563/1563 [==============================] - 142s 91ms/step - loss: 0.2339 - accuracy: 0.9165 - val_loss: 1.4194 - val_accuracy: 0.6832
Epoch 14/50
1563/1563 [==============================] - 142s 91ms/step - loss: 0.2004 - accuracy: 0.9287 - val_loss: 1.5005 - val_accuracy: 0.6914
Epoch 15/50
1563/1563 [==============================] - 142s 91ms/step - loss: 0.1870 - accuracy: 0.9336 - val_loss: 1.7081 - val_accuracy: 0.6856
Epoch 16/50
1563/1563 [==============================] - 143s 92ms/step - loss: 0.1687 - accuracy: 0.9410 - val_loss: 1.7442 - val_accuracy: 0.6844
Epoch 17/50
1563/1563 [==============================] - 136s 87ms/step - loss: 0.1544 - accuracy: 0.9462 - val_loss: 1.7740 - val_accuracy: 0.6841
Epoch 18/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1510 - accuracy: 0.9479 - val_loss: 1.8155 - val_accuracy: 0.6783
Epoch 19/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1490 - accuracy: 0.9475 - val_loss: 1.9241 - val_accuracy: 0.6802
Epoch 20/50
1563/1563 [==============================] - 130s 83ms/step - loss: 0.1335 - accuracy: 0.9541 - val_loss: 2.0245 - val_accuracy: 0.6829
Epoch 21/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1371 - accuracy: 0.9532 - val_loss: 2.1338 - val_accuracy: 0.6723
Epoch 22/50
1563/1563 [==============================] - 133s 85ms/step - loss: 0.1235 - accuracy: 0.9585 - val_loss: 2.0892 - val_accuracy: 0.6801
Epoch 23/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1229 - accuracy: 0.9590 - val_loss: 2.2426 - val_accuracy: 0.6836
Epoch 24/50
1563/1563 [==============================] - 132s 84ms/step - loss: 0.1241 - accuracy: 0.9588 - val_loss: 2.2278 - val_accuracy: 0.6710
Epoch 25/50
1563/1563 [==============================] - 132s 85ms/step - loss: 0.1174 - accuracy: 0.9611 - val_loss: 2.3759 - val_accuracy: 0.6754
Epoch 26/50
1563/1563 [==============================] - 132s 84ms/step - loss: 0.1142 - accuracy: 0.9624 - val_loss: 2.3872 - val_accuracy: 0.6822
Epoch 27/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1127 - accuracy: 0.9626 - val_loss: 2.5138 - val_accuracy: 0.6815
Epoch 28/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1135 - accuracy: 0.9631 - val_loss: 2.4889 - val_accuracy: 0.6842
Epoch 29/50
1563/1563 [==============================] - 132s 84ms/step - loss: 0.1098 - accuracy: 0.9634 - val_loss: 2.5979 - val_accuracy: 0.6754
Epoch 30/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1056 - accuracy: 0.9663 - val_loss: 2.7030 - val_accuracy: 0.6693
Epoch 31/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1082 - accuracy: 0.9654 - val_loss: 2.7626 - val_accuracy: 0.6807
Epoch 32/50
1563/1563 [==============================] - 133s 85ms/step - loss: 0.1049 - accuracy: 0.9673 - val_loss: 2.7668 - val_accuracy: 0.6727
Epoch 33/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1079 - accuracy: 0.9659 - val_loss: 2.7229 - val_accuracy: 0.6784
Epoch 34/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1044 - accuracy: 0.9675 - val_loss: 2.8055 - val_accuracy: 0.6718
Epoch 35/50
1563/1563 [==============================] - 130s 83ms/step - loss: 0.0977 - accuracy: 0.9691 - val_loss: 3.0105 - val_accuracy: 0.6742
Epoch 36/50
1563/1563 [==============================] - 130s 83ms/step - loss: 0.0946 - accuracy: 0.9699 - val_loss: 3.0860 - val_accuracy: 0.6650
Epoch 37/50
1563/1563 [==============================] - 130s 83ms/step - loss: 0.0982 - accuracy: 0.9687 - val_loss: 3.0617 - val_accuracy: 0.6721
Epoch 38/50
1563/1563 [==============================] - 130s 83ms/step - loss: 0.1032 - accuracy: 0.9684 - val_loss: 3.0260 - val_accuracy: 0.6762
Epoch 39/50
1563/1563 [==============================] - 130s 83ms/step - loss: 0.0969 - accuracy: 0.9706 - val_loss: 3.0526 - val_accuracy: 0.6770
Epoch 40/50
1563/1563 [==============================] - 130s 83ms/step - loss: 0.0867 - accuracy: 0.9726 - val_loss: 3.1313 - val_accuracy: 0.6656
Epoch 41/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.0986 - accuracy: 0.9693 - val_loss: 3.1024 - val_accuracy: 0.6779
Epoch 42/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.0853 - accuracy: 0.9742 - val_loss: 3.2359 - val_accuracy: 0.6696
Epoch 43/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.0948 - accuracy: 0.9717 - val_loss: 3.2210 - val_accuracy: 0.6707
Epoch 44/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.1017 - accuracy: 0.9692 - val_loss: 3.2260 - val_accuracy: 0.6797
Epoch 45/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.0848 - accuracy: 0.9747 - val_loss: 3.3114 - val_accuracy: 0.6718
Epoch 46/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.0831 - accuracy: 0.9748 - val_loss: 3.2722 - val_accuracy: 0.6844
Epoch 47/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.0887 - accuracy: 0.9733 - val_loss: 3.2204 - val_accuracy: 0.6780
Epoch 48/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.0868 - accuracy: 0.9740 - val_loss: 3.2955 - val_accuracy: 0.6651
Epoch 49/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.0838 - accuracy: 0.9751 - val_loss: 3.2689 - val_accuracy: 0.6721
Epoch 50/50
1563/1563 [==============================] - 131s 84ms/step - loss: 0.0873 - accuracy: 0.9741 - val_loss: 3.3897 - val_accuracy: 0.6672
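
Note the gap at the end of training: training accuracy reaches about 0.974 while validation accuracy stays near 0.67 and validation loss grows from its minimum of 0.8866 at epoch 5 to 3.39, a clear sign of overfitting. A small sketch for visualizing that trend from the History object that model.fit returns follows; the variable name history is hypothetical, since the log does not show how (or whether) the fit result was stored.

# Assumed usage, not part of the original log:
#   history = model.fit(train_images, train_labels, epochs=50,
#                       validation_data=(test_images, test_labels))
import matplotlib.pyplot as plt

def plot_history(history):
    """Plot train vs. validation accuracy and loss per epoch."""
    fig, (ax_acc, ax_loss) = plt.subplots(1, 2, figsize=(10, 4))
    ax_acc.plot(history.history['accuracy'], label='train accuracy')
    ax_acc.plot(history.history['val_accuracy'], label='val accuracy')
    ax_acc.set_xlabel('epoch')
    ax_acc.legend()
    ax_loss.plot(history.history['loss'], label='train loss')
    ax_loss.plot(history.history['val_loss'], label='val loss')
    ax_loss.set_xlabel('epoch')
    ax_loss.legend()
    fig.tight_layout()
    plt.show()
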