HypE Train Log
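
For context, below is a minimal, self-contained sketch of the kind of loop that emits the per-iteration lines in this log. This is not the actual HypE training code: the toy model, the optimizer choice, the fake data, and all names here are illustrative stand-ins; the real HypE scoring function (positional convolutions over entity embeddings) is more involved.

```python
import torch

# Toy stand-in for the HypE model, only to illustrate the loop shape.
class ToyModel(torch.nn.Module):
    def __init__(self, n_ent=100, dim=32):
        super().__init__()
        self.emb = torch.nn.Embedding(n_ent, dim)

    def forward(self, batch):
        # score each tuple by summing its entity embeddings
        return self.emb(batch).sum(dim=-1).sum(dim=-1)

model = ToyModel()
optimizer = torch.optim.Adagrad(model.parameters(), lr=0.1)
train = torch.randint(0, 100, (640, 3))   # fake training tuples
targets = torch.zeros(64)                 # fake per-batch labels

print("Starting training...")
for it in range(1, 4):                    # the logged runs do 500
    epoch_loss = 0.0
    for batch in train.split(64):
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(batch), targets)
        loss.backward()
        optimizer.step()
        # the logged value appears to be loss accumulated over the epoch
        epoch_loss += loss.item()
    print(f"iteration#: {it}, loss: {epoch_loss}")
```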
Training the HypE model...
Number of training data points: 61911
Starting training...
iteration#: 1, loss: 1128.5969721078873
iteration#: 2, loss: 589.3206909894943
iteration#: 3, loss: 352.0712777376175
iteration#: 4, loss: 246.19432146847248
iteration#: 5, loss: 202.48553057014942
iteration#: 6, loss: 165.68770626187325
iteration#: 7, loss: 145.24890618771315
iteration#: 8, loss: 132.29567590355873
iteration#: 9, loss: 122.3706817701459
iteration#: 10, loss: 110.28342608362436
iteration#: 11, loss: 103.30189225077629
iteration#: 12, loss: 98.75038635358214
iteration#: 13, loss: 93.63530520349741
iteration#: 14, loss: 92.06604382395744
iteration#: 15, loss: 83.64566880837083
iteration#: 16, loss: 79.78967126086354
iteration#: 17, loss: 77.55824789777398
iteration#: 18, loss: 78.40112666413188
iteration#: 19, loss: 76.36364080756903
iteration#: 20, loss: 73.5182331353426
iteration#: 21, loss: 72.82811096310616
iteration#: 22, loss: 68.37139315903187
iteration#: 23, loss: 67.71940031647682
iteration#: 24, loss: 66.86391191929579
iteration#: 25, loss: 66.00029823556542
iteration#: 26, loss: 63.77029738947749
iteration#: 27, loss: 62.39784202352166
iteration#: 28, loss: 60.96891175955534
iteration#: 29, loss: 61.461283415555954
iteration#: 30, loss: 57.59357615932822
iteration#: 31, loss: 56.80059587955475
iteration#: 32, loss: 55.279016226530075
iteration#: 33, loss: 53.79471708461642
iteration#: 34, loss: 52.90129294246435
iteration#: 35, loss: 55.00731261074543
iteration#: 36, loss: 53.071109656244516
iteration#: 37, loss: 53.67683732137084
iteration#: 38, loss: 54.17810820043087
iteration#: 39, loss: 49.99514580518007
iteration#: 40, loss: 49.96600095927715
iteration#: 41, loss: 49.71589883044362
iteration#: 42, loss: 50.49418694153428
iteration#: 43, loss: 47.33111986704171
iteration#: 44, loss: 47.55682394653559
iteration#: 45, loss: 47.433820594102144
iteration#: 46, loss: 45.79797434806824
iteration#: 47, loss: 45.78519377857447
iteration#: 48, loss: 46.44497795775533
iteration#: 49, loss: 43.37494556233287
iteration#: 50, loss: 44.099109657108784
iteration#: 51, loss: 45.47817849740386
iteration#: 52, loss: 44.225241988897324
iteration#: 53, loss: 42.102620895951986
iteration#: 54, loss: 44.138226084411144
iteration#: 55, loss: 41.6612194776535
iteration#: 56, loss: 42.043484246358275
iteration#: 57, loss: 41.97770299948752
iteration#: 58, loss: 42.43217799440026
iteration#: 59, loss: 40.212553314864635
iteration#: 60, loss: 41.666214637458324
iteration#: 61, loss: 40.84178464487195
iteration#: 62, loss: 38.91881050914526
iteration#: 63, loss: 39.607658829540014
iteration#: 64, loss: 40.6425687558949
iteration#: 65, loss: 39.766071401536465
iteration#: 66, loss: 40.13560397922993
iteration#: 67, loss: 38.51617764309049
iteration#: 68, loss: 39.09016981907189
iteration#: 69, loss: 37.40770496428013
iteration#: 70, loss: 39.52282164245844
iteration#: 71, loss: 37.689951818436384
iteration#: 72, loss: 38.63494444359094
iteration#: 73, loss: 35.92030730098486
iteration#: 74, loss: 38.22083969414234
iteration#: 75, loss: 36.69814841076732
iteration#: 76, loss: 34.85340274870396
iteration#: 77, loss: 37.02121592871845
iteration#: 78, loss: 37.17175584100187
iteration#: 79, loss: 34.77896904479712
iteration#: 80, loss: 35.08971842750907
iteration#: 81, loss: 36.15607845596969
iteration#: 82, loss: 35.563974909484386
iteration#: 83, loss: 37.93343322724104
iteration#: 84, loss: 35.19352715462446
iteration#: 85, loss: 34.01389496028423
iteration#: 86, loss: 34.73526332899928
iteration#: 87, loss: 35.384004667401314
iteration#: 88, loss: 35.128996040672064
iteration#: 89, loss: 35.14112612232566
iteration#: 90, loss: 33.744155475869775
iteration#: 91, loss: 34.359025252982974
iteration#: 92, loss: 34.5803159289062
iteration#: 93, loss: 34.566614117473364
iteration#: 94, loss: 33.158173859119415
iteration#: 95, loss: 34.567622024565935
iteration#: 96, loss: 33.56074149161577
iteration#: 97, loss: 33.440386816859245
iteration#: 98, loss: 33.15666122082621
iteration#: 99, loss: 33.35540076158941
iteration#: 100, loss: 32.31053277011961
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.2291604272736348
Hit@10 = 0.405884995507637
MR = 937.4299440950384
MRR = 0.14917899870352636
Fil setting:
Hit@1 = 0.27979934112009586
Hit@3 = 0.39061096136567836
Hit@10 = 0.5278027353499052
MR = 913.5135769192373
MRR = 0.36328344097470155
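
The validation blocks report standard link-prediction metrics. Given the rank of the correct entity for each query, Hit@k is the fraction of queries whose correct entity lands in the top k, MR is the mean rank, and MRR is the mean reciprocal rank. The filtered ("Fil") setting removes other known true tuples from the candidate list before ranking, while the raw setting ranks against all candidates. A minimal sketch of the metric computation from a list of ranks (names here are illustrative, not from the actual evaluation code):

```python
def metrics(ranks):
    """Compute Hit@k, MR, and MRR from 1-based ranks, one per query:
    the rank of the correct entity among all candidate completions."""
    n = len(ranks)
    return {
        "Hit@1": sum(r <= 1 for r in ranks) / n,
        "Hit@3": sum(r <= 3 for r in ranks) / n,
        "Hit@10": sum(r <= 10 for r in ranks) / n,
        "MR": sum(ranks) / n,                    # mean rank
        "MRR": sum(1.0 / r for r in ranks) / n,  # mean reciprocal rank
    }

# Example: three queries whose correct entities were ranked 1, 4, and 20.
print(metrics([1, 4, 20]))
# {'Hit@1': 0.333..., 'Hit@3': 0.333..., 'Hit@10': 0.666...,
#  'MR': 8.333..., 'MRR': 0.433...}
```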
iteration#: 101, loss: 33.26325755007565
iteration#: 102, loss: 34.22918685525656
iteration#: 103, loss: 31.72119122557342
iteration#: 104, loss: 34.25393953919411
iteration#: 105, loss: 35.19353667087853
iteration#: 106, loss: 31.300839794799685
iteration#: 107, loss: 32.417203683406115
iteration#: 108, loss: 32.92384012788534
iteration#: 109, loss: 31.86550393141806
iteration#: 110, loss: 31.662774812430143
iteration#: 111, loss: 31.615627698600292
iteration#: 112, loss: 30.160429138690233
iteration#: 113, loss: 30.863088745623827
iteration#: 114, loss: 31.117492131888866
iteration#: 115, loss: 30.772704727947712
iteration#: 116, loss: 30.377160154283047
iteration#: 117, loss: 30.72225643042475
iteration#: 118, loss: 31.225141864269972
iteration#: 119, loss: 31.050309631973505
iteration#: 120, loss: 31.19096688553691
iteration#: 121, loss: 29.92217805981636
iteration#: 122, loss: 30.261887602508068
iteration#: 123, loss: 31.232261735014617
iteration#: 124, loss: 29.986971170641482
iteration#: 125, loss: 28.92918374761939
iteration#: 126, loss: 29.307744782418013
iteration#: 127, loss: 29.513292593881488
iteration#: 128, loss: 30.21643714979291
iteration#: 129, loss: 30.159386394545436
iteration#: 130, loss: 29.226475175470114
iteration#: 131, loss: 30.06484153866768
iteration#: 132, loss: 30.340365203097463
iteration#: 133, loss: 29.6525942645967
iteration#: 134, loss: 28.415171306580305
iteration#: 135, loss: 28.65402167290449
iteration#: 136, loss: 30.443494034931064
iteration#: 137, loss: 28.39379720017314
iteration#: 138, loss: 31.505824368912727
iteration#: 139, loss: 31.484795162454247
iteration#: 140, loss: 28.738601291552186
iteration#: 141, loss: 30.208784237504005
iteration#: 142, loss: 28.228824887424707
iteration#: 143, loss: 28.55437532067299
iteration#: 144, loss: 28.28537051472813
iteration#: 145, loss: 27.25414576753974
iteration#: 146, loss: 29.058412563055754
iteration#: 147, loss: 28.984738670289516
iteration#: 148, loss: 28.05106611084193
iteration#: 149, loss: 28.65531910955906
iteration#: 150, loss: 27.327295269817114
iteration#: 151, loss: 28.951372979208827
iteration#: 152, loss: 28.768770076334476
iteration#: 153, loss: 26.743937350809574
iteration#: 154, loss: 27.564233478158712
iteration#: 155, loss: 27.864890094846487
iteration#: 156, loss: 30.39787882193923
iteration#: 157, loss: 28.431658174842596
iteration#: 158, loss: 27.678705608472228
iteration#: 159, loss: 27.66184150055051
iteration#: 160, loss: 26.19655105425045
iteration#: 161, loss: 28.142703719437122
iteration#: 162, loss: 27.079211939126253
iteration#: 163, loss: 30.366659447550774
iteration#: 164, loss: 27.64105688035488
iteration#: 165, loss: 27.564809288363904
iteration#: 166, loss: 26.746821008622646
iteration#: 167, loss: 28.730659483000636
iteration#: 168, loss: 27.27083331719041
iteration#: 169, loss: 26.759906120598316
iteration#: 170, loss: 26.359628465026617
iteration#: 171, loss: 26.729565834626555
iteration#: 172, loss: 27.446469645947218
iteration#: 173, loss: 27.38310758769512
iteration#: 174, loss: 27.526637017726898
iteration#: 175, loss: 26.98622157610953
iteration#: 176, loss: 26.761791734956205
iteration#: 177, loss: 27.418844245374203
iteration#: 178, loss: 26.305560232140124
iteration#: 179, loss: 27.326188180595636
iteration#: 180, loss: 26.088891372084618
iteration#: 181, loss: 26.502641008235514
iteration#: 182, loss: 26.568427070975304
iteration#: 183, loss: 26.10986440628767
iteration#: 184, loss: 27.35618031769991
iteration#: 185, loss: 26.71378854289651
iteration#: 186, loss: 26.792690694332123
iteration#: 187, loss: 26.765573816373944
iteration#: 188, loss: 27.189598760567605
iteration#: 189, loss: 27.012643849477172
iteration#: 190, loss: 25.817803516983986
iteration#: 191, loss: 25.85440818965435
iteration#: 192, loss: 25.07075311988592
iteration#: 193, loss: 23.886949023231864
iteration#: 194, loss: 26.634819438681006
iteration#: 195, loss: 25.313688285648823
iteration#: 196, loss: 26.744353350251913
iteration#: 197, loss: 26.14979605935514
iteration#: 198, loss: 26.016125928144902
iteration#: 199, loss: 25.724481565877795
iteration#: 200, loss: 25.14638690277934
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.23649795347908556
Hit@10 = 0.42041030248577416
MR = 1000.9781371668164
MRR = 0.1539652263499597
Fil setting:
Hit@1 = 0.3131676150544075
Hit@3 = 0.4312418887890586
Hit@10 = 0.5641160027952481
MR = 976.2180043925327
MRR = 0.3979868198955892
iteration#: 201, loss: 27.24712151940912
iteration#: 202, loss: 24.7819806933403
iteration#: 203, loss: 26.716864474117756
iteration#: 204, loss: 24.220598552376032
iteration#: 205, loss: 25.058776330202818
iteration#: 206, loss: 24.889723427593708
iteration#: 207, loss: 25.093551199883223
iteration#: 208, loss: 25.470895286649466
iteration#: 209, loss: 25.849243696779013
iteration#: 210, loss: 24.411983223631978
iteration#: 211, loss: 25.2604296291247
iteration#: 212, loss: 25.062909573316574
iteration#: 213, loss: 25.689380656927824
iteration#: 214, loss: 24.776960257440805
iteration#: 215, loss: 24.7105468865484
iteration#: 216, loss: 25.115280451253057
iteration#: 217, loss: 25.003646360710263
iteration#: 218, loss: 24.386205823160708
iteration#: 219, loss: 25.139376003295183
iteration#: 220, loss: 26.026214078068733
iteration#: 221, loss: 24.418052868917584
iteration#: 222, loss: 25.533744409680367
iteration#: 223, loss: 26.67623369395733
iteration#: 224, loss: 25.56923023238778
iteration#: 225, loss: 24.79012242332101
iteration#: 226, loss: 24.617675449699163
iteration#: 227, loss: 24.43231729604304
iteration#: 228, loss: 24.549676839262247
iteration#: 229, loss: 26.06411285791546
iteration#: 230, loss: 24.40025589382276
iteration#: 231, loss: 25.245934923179448
iteration#: 232, loss: 25.283985276240855
iteration#: 233, loss: 24.336508235894144
iteration#: 234, loss: 24.490268025081605
iteration#: 235, loss: 26.47953055659309
iteration#: 236, loss: 25.421675144694746
iteration#: 237, loss: 24.01611863076687
iteration#: 238, loss: 24.482236433774233
iteration#: 239, loss: 24.822535067796707
iteration#: 240, loss: 24.617661265656352
iteration#: 241, loss: 25.402254071086645
iteration#: 242, loss: 25.182959680445492
iteration#: 243, loss: 24.926196499727666
iteration#: 244, loss: 24.1664194855839
iteration#: 245, loss: 25.3267030864954
iteration#: 246, loss: 25.069063547067344
iteration#: 247, loss: 23.23500164039433
iteration#: 248, loss: 24.93844872713089
iteration#: 249, loss: 24.657083835452795
iteration#: 250, loss: 24.02684534341097
iteration#: 251, loss: 24.74254036694765
iteration#: 252, loss: 23.778330402448773
iteration#: 253, loss: 24.912817834876478
iteration#: 254, loss: 25.329655833542347
iteration#: 255, loss: 25.94052093103528
iteration#: 256, loss: 24.734104819595814
iteration#: 257, loss: 26.226534105837345
iteration#: 258, loss: 23.75828493386507
iteration#: 259, loss: 24.486495088785887
iteration#: 260, loss: 24.302721105515957
iteration#: 261, loss: 25.560199317522347
iteration#: 262, loss: 25.861542949220166
iteration#: 263, loss: 25.19694292731583
iteration#: 264, loss: 24.488210739567876
iteration#: 265, loss: 25.264429111033678
iteration#: 266, loss: 24.15079783136025
iteration#: 267, loss: 24.848060972988605
iteration#: 268, loss: 24.85860359482467
iteration#: 269, loss: 25.26806028187275
iteration#: 270, loss: 24.723592843860388
iteration#: 271, loss: 24.9519777931273
iteration#: 272, loss: 24.12641540542245
iteration#: 273, loss: 25.345917662605643
iteration#: 274, loss: 24.698667232878506
iteration#: 275, loss: 24.103543657809496
iteration#: 276, loss: 24.678639722056687
iteration#: 277, loss: 23.85308739542961
iteration#: 278, loss: 25.21232662163675
iteration#: 279, loss: 23.68872652295977
iteration#: 280, loss: 24.280228855088353
iteration#: 281, loss: 23.895063892006874
iteration#: 282, loss: 24.12961891386658
iteration#: 283, loss: 24.267005558591336
iteration#: 284, loss: 24.854911517351866
iteration#: 285, loss: 24.724048418924212
iteration#: 286, loss: 24.230520529672503
iteration#: 287, loss: 24.04946636594832
iteration#: 288, loss: 24.340305379591882
iteration#: 289, loss: 22.771409523207694
iteration#: 290, loss: 22.645989569835365
iteration#: 291, loss: 24.190760102123022
iteration#: 292, loss: 24.118768325075507
iteration#: 293, loss: 22.914921359624714
iteration#: 294, loss: 25.218876730650663
iteration#: 295, loss: 23.376327358186245
iteration#: 296, loss: 23.65202672034502
iteration#: 297, loss: 23.588180601596832
iteration#: 298, loss: 23.46082061715424
iteration#: 299, loss: 22.512943250127137
iteration#: 300, loss: 24.99040351808071
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.23851951682140363
Hit@10 = 0.4250773684735949
MR = 999.751547369472
MRR = 0.15537151787075468
Fil setting:
Hit@1 = 0.3315613457122891
Hit@3 = 0.45130777677947487
Hit@10 = 0.5833333333333334
MR = 973.5657132874114
MRR = 0.416801066038361
iteration#: 301, loss: 23.46126427873969
iteration#: 302, loss: 23.080572795122862
iteration#: 303, loss: 23.627124927937984
iteration#: 304, loss: 22.848930288106203
iteration#: 305, loss: 22.69973354972899
iteration#: 306, loss: 23.34152278956026
iteration#: 307, loss: 22.907369073480368
iteration#: 308, loss: 22.784663654863834
iteration#: 309, loss: 24.20392457768321
iteration#: 310, loss: 24.275926711969078
iteration#: 311, loss: 23.126015905290842
iteration#: 312, loss: 24.156770821660757
iteration#: 313, loss: 22.9060525521636
iteration#: 314, loss: 23.94069342687726
iteration#: 315, loss: 22.368323352187872
iteration#: 316, loss: 22.767602367326617
iteration#: 317, loss: 23.37242553755641
iteration#: 318, loss: 24.176541523076594
iteration#: 319, loss: 23.689086553640664
iteration#: 320, loss: 23.148982455953956
iteration#: 321, loss: 22.40683679538779
iteration#: 322, loss: 23.597058003302664
iteration#: 323, loss: 23.193390090949833
iteration#: 324, loss: 23.11877152696252
iteration#: 325, loss: 21.720063833519816
iteration#: 326, loss: 23.821586064994335
iteration#: 327, loss: 22.570988729130477
iteration#: 328, loss: 23.12326236255467
iteration#: 329, loss: 23.54500874504447
iteration#: 330, loss: 22.800207087770104
iteration#: 331, loss: 22.296832972206175
iteration#: 332, loss: 22.781476109288633
iteration#: 333, loss: 22.800559541210532
iteration#: 334, loss: 22.658214198425412
iteration#: 335, loss: 23.009371127933264
iteration#: 336, loss: 22.986757159233093
iteration#: 337, loss: 23.880425991490483
iteration#: 338, loss: 22.787995271384716
iteration#: 339, loss: 23.881447231397033
iteration#: 340, loss: 22.433480247855186
iteration#: 341, loss: 22.915081072831526
iteration#: 342, loss: 22.919296022038907
iteration#: 343, loss: 23.584480277262628
iteration#: 344, loss: 23.08296275511384
iteration#: 345, loss: 22.462437914684415
iteration#: 346, loss: 21.69634547457099
iteration#: 347, loss: 23.85763270407915
iteration#: 348, loss: 22.032596422359347
iteration#: 349, loss: 22.971270080655813
iteration#: 350, loss: 20.970440840814263
iteration#: 351, loss: 23.434675257652998
iteration#: 352, loss: 22.587820943444967
iteration#: 353, loss: 23.733249640092254
iteration#: 354, loss: 22.34412457048893
iteration#: 355, loss: 23.633487179875374
iteration#: 356, loss: 22.89092537946999
iteration#: 357, loss: 22.90736585855484
iteration#: 358, loss: 21.578827131539583
iteration#: 359, loss: 22.56746038980782
iteration#: 360, loss: 22.82948959618807
iteration#: 361, loss: 22.76260907854885
iteration#: 362, loss: 21.764938471838832
iteration#: 363, loss: 22.301545649766922
iteration#: 364, loss: 20.379208324477077
iteration#: 365, loss: 21.50777213834226
iteration#: 366, loss: 22.299156483262777
iteration#: 367, loss: 23.41532531939447
iteration#: 368, loss: 21.520611131563783
iteration#: 369, loss: 22.176325000007637
iteration#: 370, loss: 22.944751404225826
iteration#: 371, loss: 23.069116791710258
iteration#: 372, loss: 22.878415377810597
iteration#: 373, loss: 22.71334956213832
iteration#: 374, loss: 21.910293377470225
iteration#: 375, loss: 22.1777365738526
iteration#: 376, loss: 23.59987726714462
iteration#: 377, loss: 22.85042697377503
iteration#: 378, loss: 22.21125878766179
iteration#: 379, loss: 21.230747547000647
iteration#: 380, loss: 21.74002715945244
iteration#: 381, loss: 21.82493080943823
iteration#: 382, loss: 23.168578254058957
iteration#: 383, loss: 22.883175514638424
iteration#: 384, loss: 21.251771594397724
iteration#: 385, loss: 21.9423017911613
iteration#: 386, loss: 21.918760740489233
iteration#: 387, loss: 22.617916559334844
iteration#: 388, loss: 23.02409655181691
iteration#: 389, loss: 22.426793751539662
iteration#: 390, loss: 22.259402617812157
iteration#: 391, loss: 21.917677057906985
iteration#: 392, loss: 22.443641448393464
iteration#: 393, loss: 21.6094628428109
iteration#: 394, loss: 21.239283117000014
iteration#: 395, loss: 21.214492505649105
iteration#: 396, loss: 22.3387345764786
iteration#: 397, loss: 22.484913914930075
iteration#: 398, loss: 20.789034124463797
iteration#: 399, loss: 20.889675866812468
iteration#: 400, loss: 21.43484972976148
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.24011680143755615
Hit@10 = 0.4281221922731357
MR = 1020.8772337027054
MRR = 0.1556742550537393
Fil setting:
Hit@1 = 0.3391484476390137
Hit@3 = 0.45874513327343514
Hit@10 = 0.5902465808126185
MR = 994.0474942597584
MRR = 0.42401144467347296
iteration#: 401, loss: 23.249696318991482
iteration#: 402, loss: 21.19429254811257
iteration#: 403, loss: 21.800255993381143
iteration#: 404, loss: 23.13605154119432
iteration#: 405, loss: 21.8390542101115
iteration#: 406, loss: 23.006552616134286
iteration#: 407, loss: 21.65937490016222
iteration#: 408, loss: 20.897103879600763
iteration#: 409, loss: 21.163311316631734
iteration#: 410, loss: 21.761389765888453
iteration#: 411, loss: 21.744938909076154
iteration#: 412, loss: 21.104854705045
iteration#: 413, loss: 22.478530399501324
iteration#: 414, loss: 22.180987920612097
iteration#: 415, loss: 22.460816520266235
iteration#: 416, loss: 22.103346420452
iteration#: 417, loss: 20.592110466212034
iteration#: 418, loss: 21.77649979107082
iteration#: 419, loss: 21.847989186644554
iteration#: 420, loss: 22.98958497494459
iteration#: 421, loss: 21.422638792544603
iteration#: 422, loss: 22.31363247986883
iteration#: 423, loss: 22.286662636324763
iteration#: 424, loss: 21.947178239002824
iteration#: 425, loss: 22.80321466177702
iteration#: 426, loss: 22.49919576756656
iteration#: 427, loss: 21.44191326573491
iteration#: 428, loss: 21.49029839783907
iteration#: 429, loss: 22.7541650403291
iteration#: 430, loss: 20.59069173783064
iteration#: 431, loss: 21.9158752579242
iteration#: 432, loss: 21.397590935230255
iteration#: 433, loss: 20.64666796848178
iteration#: 434, loss: 21.702178256586194
iteration#: 435, loss: 21.354624032974243
iteration#: 436, loss: 21.242575827986002
iteration#: 437, loss: 22.129924602806568
iteration#: 438, loss: 21.949727043393068
iteration#: 439, loss: 22.98925157636404
iteration#: 440, loss: 21.90987526997924
iteration#: 441, loss: 20.93463759869337
iteration#: 442, loss: 21.083083886653185
iteration#: 443, loss: 21.40187248773873
iteration#: 444, loss: 20.83390999957919
iteration#: 445, loss: 20.821795403957367
iteration#: 446, loss: 21.842540256679058
iteration#: 447, loss: 21.12655450310558
iteration#: 448, loss: 20.486725889146328
iteration#: 449, loss: 21.199680435471237
iteration#: 450, loss: 21.91143242502585
iteration#: 451, loss: 21.094152233563364
iteration#: 452, loss: 21.61913809971884
iteration#: 453, loss: 20.73567051347345
iteration#: 454, loss: 21.349566344171762
iteration#: 455, loss: 21.358126300387084
iteration#: 456, loss: 21.32631280273199
iteration#: 457, loss: 21.162315514869988
iteration#: 458, loss: 20.479331642389297
iteration#: 459, loss: 20.64435400441289
iteration#: 460, loss: 20.058593570254743
iteration#: 461, loss: 20.16186575172469
iteration#: 462, loss: 20.42991684190929
iteration#: 463, loss: 21.380917229689658
iteration#: 464, loss: 21.952415805775672
iteration#: 465, loss: 21.204546571709216
iteration#: 466, loss: 21.117643024772406
iteration#: 467, loss: 20.96833833679557
iteration#: 468, loss: 21.598797030746937
iteration#: 469, loss: 21.566693725064397
iteration#: 470, loss: 21.01600386854261
iteration#: 471, loss: 21.755610020831227
iteration#: 472, loss: 20.88640418369323
iteration#: 473, loss: 21.967098671011627
iteration#: 474, loss: 22.052843501791358
iteration#: 475, loss: 21.59802544489503
iteration#: 476, loss: 21.497184867272153
iteration#: 477, loss: 20.564411368221045
iteration#: 478, loss: 21.266166856512427
iteration#: 479, loss: 20.8535547144711
iteration#: 480, loss: 21.277395579963923
iteration#: 481, loss: 20.365421837195754
iteration#: 482, loss: 21.97977838292718
iteration#: 483, loss: 22.33593601733446
iteration#: 484, loss: 19.664362984593026
iteration#: 485, loss: 20.15722167585045
iteration#: 486, loss: 19.909053412266076
iteration#: 487, loss: 20.92108030617237
iteration#: 488, loss: 21.175014791078866
iteration#: 489, loss: 21.17112911399454
iteration#: 490, loss: 20.74111422151327
iteration#: 491, loss: 21.348958572372794
iteration#: 492, loss: 20.988203558139503
iteration#: 493, loss: 21.218574207276106
iteration#: 494, loss: 20.588658027350903
iteration#: 495, loss: 21.39802703820169
iteration#: 496, loss: 21.00297266198322
iteration#: 497, loss: 21.58903961442411
iteration#: 498, loss: 20.839866237714887
iteration#: 499, loss: 22.32119382917881
iteration#: 500, loss: 21.06327011436224
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.23891883797544175
Hit@10 = 0.42904562244184885
MR = 1026.381476489967
MRR = 0.1556658803334297
Fil setting:
Hit@1 = 0.3412698412698413
Hit@3 = 0.46508435659379055
Hit@10 = 0.5954127982429869
MR = 999.2251422581611
MRR = 0.42800969227445673
test in iteration 500:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.3019202320097257
Hit@10 = 0.5003442063480439
MR = 843.954696585766
MRR = 0.18622815834027878
Fil setting:
Hit@1 = 0.3958519473291052
Hit@3 = 0.5254346520586469
Hit@10 = 0.6423329866858055
MR = 818.7834722364624
MRR = 0.4821113873262497
Training the HypE model...
Number of training data points: 61911
Starting training...
iteration#: 1, loss: 1568.167760848999
iteration#: 2, loss: 1566.214586019516
iteration#: 3, loss: 1541.8936672210693
iteration#: 4, loss: 1417.850191116333
iteration#: 5, loss: 1297.3212687969208
iteration#: 6, loss: 1244.835282087326
iteration#: 7, loss: 1208.0162200927734
iteration#: 8, loss: 1175.3001044988632
iteration#: 9, loss: 1140.302342057228
iteration#: 10, loss: 1113.1877946853638
iteration#: 11, loss: 1080.1524951457977
iteration#: 12, loss: 1052.0448784828186
iteration#: 13, loss: 1027.3391007184982
iteration#: 14, loss: 1001.5486435890198
iteration#: 15, loss: 982.2413285970688
iteration#: 16, loss: 962.0183218717575
iteration#: 17, loss: 940.1079822778702
iteration#: 18, loss: 919.8490008115768
iteration#: 19, loss: 903.4703222513199
iteration#: 20, loss: 891.1299673318863
iteration#: 21, loss: 873.34446144104
iteration#: 22, loss: 861.6609258651733
iteration#: 23, loss: 847.7475399971008
iteration#: 24, loss: 835.9063550233841
iteration#: 25, loss: 826.2098978161812
iteration#: 26, loss: 817.9560393095016
iteration#: 27, loss: 804.9229393005371
iteration#: 28, loss: 800.1948198080063
iteration#: 29, loss: 789.1226322650909
iteration#: 30, loss: 783.9763506650925
iteration#: 31, loss: 774.9150045514107
iteration#: 32, loss: 769.6904550790787
iteration#: 33, loss: 762.201839029789
iteration#: 34, loss: 756.8505566120148
iteration#: 35, loss: 747.5125322341919
iteration#: 36, loss: 739.0960103869438
iteration#: 37, loss: 733.1006280183792
iteration#: 38, loss: 729.8794151544571
iteration#: 39, loss: 724.0399323701859
iteration#: 40, loss: 716.5943242311478
iteration#: 41, loss: 714.3693867325783
iteration#: 42, loss: 707.9790508747101
iteration#: 43, loss: 709.0170536637306
iteration#: 44, loss: 701.4162878990173
iteration#: 45, loss: 696.2010463476181
iteration#: 46, loss: 688.8218839764595
iteration#: 47, loss: 687.6234297156334
iteration#: 48, loss: 681.2742008566856
iteration#: 49, loss: 679.8837047815323
iteration#: 50, loss: 676.9654585719109
iteration#: 51, loss: 673.6863637566566
iteration#: 52, loss: 671.328863799572
iteration#: 53, loss: 665.8944901823997
iteration#: 54, loss: 663.1141242384911
iteration#: 55, loss: 655.7340251803398
iteration#: 56, loss: 658.5383692383766
iteration#: 57, loss: 652.6376151442528
iteration#: 58, loss: 651.4512258768082
iteration#: 59, loss: 647.7386912703514
iteration#: 60, loss: 646.2963688373566
iteration#: 61, loss: 640.1041775941849
iteration#: 62, loss: 635.9918885827065
iteration#: 63, loss: 635.3883060216904
iteration#: 64, loss: 633.0268769860268
iteration#: 65, loss: 629.8320934176445
iteration#: 66, loss: 628.29300147295
iteration#: 67, loss: 625.1998963356018
iteration#: 68, loss: 623.630620598793
iteration#: 69, loss: 620.4073736071587
iteration#: 70, loss: 615.5201799273491
iteration#: 71, loss: 615.9175664186478
iteration#: 72, loss: 612.4741801023483
iteration#: 73, loss: 608.8711268901825
iteration#: 74, loss: 605.5081042051315
iteration#: 75, loss: 604.7320785522461
iteration#: 76, loss: 603.9914383292198
iteration#: 77, loss: 602.0444964170456
iteration#: 78, loss: 600.0602433085442
iteration#: 79, loss: 596.3603237867355
iteration#: 80, loss: 595.1059394478798
iteration#: 81, loss: 592.4563137292862
iteration#: 82, loss: 590.8598407506943
iteration#: 83, loss: 587.8735997676849
iteration#: 84, loss: 585.5731995105743
iteration#: 85, loss: 583.492031276226
iteration#: 86, loss: 582.0197809934616
iteration#: 87, loss: 581.3638744950294
iteration#: 88, loss: 580.101521730423
iteration#: 89, loss: 580.3902084827423
iteration#: 90, loss: 573.6683760881424
iteration#: 91, loss: 574.2133513689041
iteration#: 92, loss: 572.592414021492
iteration#: 93, loss: 569.6194210648537
iteration#: 94, loss: 566.7204034924507
iteration#: 95, loss: 565.8578650355339
iteration#: 96, loss: 563.179713666439
iteration#: 97, loss: 562.6816758513451
iteration#: 98, loss: 562.322850883007
iteration#: 99, loss: 557.883641064167
iteration#: 100, loss: 554.598696410656
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.07968952780273535
Hit@10 = 0.1684386542877109
MR = 1918.3950284516322
MRR = 0.05771639627053916
Fil setting:
Hit@1 = 0.0707297594090047
Hit@3 = 0.1164021164021164
Hit@10 = 0.1888040331436558
MR = 1896.0934661076171
MRR = 0.11180452337225298
iteration#: 101, loss: 559.8016046285629
iteration#: 102, loss: 555.424596965313
iteration#: 103, loss: 553.1797646284103
iteration#: 104, loss: 551.440612256527
iteration#: 105, loss: 546.1298203468323
iteration#: 106, loss: 547.7121923565865
iteration#: 107, loss: 546.372306227684
iteration#: 108, loss: 547.0027021169662
iteration#: 109, loss: 544.2123408317566
iteration#: 110, loss: 541.4800645112991
iteration#: 111, loss: 539.1617589592934
iteration#: 112, loss: 541.2255623340607
iteration#: 113, loss: 535.338037788868
iteration#: 114, loss: 537.7478042244911
iteration#: 115, loss: 537.2070134282112
iteration#: 116, loss: 531.894392311573
iteration#: 117, loss: 532.7747201919556
iteration#: 118, loss: 530.6886927485466
iteration#: 119, loss: 529.3532997369766
iteration#: 120, loss: 526.1926347613335
iteration#: 121, loss: 523.3632307648659
iteration#: 122, loss: 523.9297092556953
iteration#: 123, loss: 521.3822351694107
iteration#: 124, loss: 520.7364527583122
iteration#: 125, loss: 517.5229389071465
iteration#: 126, loss: 515.3672217130661
iteration#: 127, loss: 514.8951343894005
iteration#: 128, loss: 513.8985552787781
iteration#: 129, loss: 512.1268594264984
iteration#: 130, loss: 513.9858108758926
iteration#: 131, loss: 510.7731364965439
iteration#: 132, loss: 511.6264444589615
iteration#: 133, loss: 508.9394762516022
iteration#: 134, loss: 507.108266890049
iteration#: 135, loss: 504.29965353012085
iteration#: 136, loss: 503.66888707876205
iteration#: 137, loss: 503.9881919026375
iteration#: 138, loss: 503.6559793353081
iteration#: 139, loss: 501.6867606639862
iteration#: 140, loss: 498.38399654626846
iteration#: 141, loss: 497.4851675629616
iteration#: 142, loss: 496.40507376194
iteration#: 143, loss: 493.29402577877045
iteration#: 144, loss: 494.9842913746834
iteration#: 145, loss: 491.192021548748
iteration#: 146, loss: 490.58339977264404
iteration#: 147, loss: 490.70460218191147
iteration#: 148, loss: 489.88346165418625
iteration#: 149, loss: 486.5093340873718
iteration#: 150, loss: 485.59354984760284
iteration#: 151, loss: 485.64287358522415
iteration#: 152, loss: 482.9145469069481
iteration#: 153, loss: 482.0025139451027
iteration#: 154, loss: 482.3066344857216
iteration#: 155, loss: 483.2207680940628
iteration#: 156, loss: 476.7367460131645
iteration#: 157, loss: 476.29851561784744
iteration#: 158, loss: 475.7752392888069
iteration#: 159, loss: 474.7712396979332
iteration#: 160, loss: 474.0923569202423
iteration#: 161, loss: 475.19912588596344
iteration#: 162, loss: 471.778970181942
iteration#: 163, loss: 471.2301301956177
iteration#: 164, loss: 471.5875823497772
iteration#: 165, loss: 470.0107241868973
iteration#: 166, loss: 466.97025060653687
iteration#: 167, loss: 467.8331816792488
iteration#: 168, loss: 467.01165199279785
iteration#: 169, loss: 464.41453754901886
iteration#: 170, loss: 463.1976269483566
iteration#: 171, loss: 461.94786685705185
iteration#: 172, loss: 459.40781247615814
iteration#: 173, loss: 458.2370115816593
iteration#: 174, loss: 457.001424908638
iteration#: 175, loss: 458.23172402381897
iteration#: 176, loss: 457.4402832388878
iteration#: 177, loss: 455.03006291389465
iteration#: 178, loss: 456.4562439918518
iteration#: 179, loss: 450.5750281512737
iteration#: 180, loss: 452.5940616130829
iteration#: 181, loss: 455.51245361566544
iteration#: 182, loss: 451.6308769583702
iteration#: 183, loss: 449.2044324874878
iteration#: 184, loss: 451.5439919233322
iteration#: 185, loss: 450.5342267155647
iteration#: 186, loss: 446.3885226249695
iteration#: 187, loss: 443.0868782401085
iteration#: 188, loss: 446.13572055101395
iteration#: 189, loss: 444.27594244480133
iteration#: 190, loss: 441.81806164979935
iteration#: 191, loss: 442.94692796468735
iteration#: 192, loss: 440.69153213500977
iteration#: 193, loss: 440.91591465473175
iteration#: 194, loss: 438.87320375442505
iteration#: 195, loss: 437.1831392645836
iteration#: 196, loss: 436.8668641448021
iteration#: 197, loss: 435.47779101133347
iteration#: 198, loss: 434.3679272532463
iteration#: 199, loss: 435.85581290721893
iteration#: 200, loss: 433.5850983262062
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.09062094439452931
Hit@10 = 0.19798841968653288
MR = 1460.8217280622941
MRR = 0.06860958406516715
Fil setting:
Hit@1 = 0.08280922431865828
Hit@3 = 0.14422980932414894
Hit@10 = 0.22334531296795448
MR = 1438.370270540082
MRR = 0.13313115286628738
iteration#: 201, loss: 433.67991161346436
iteration#: 202, loss: 432.6385287642479
iteration#: 203, loss: 431.2019330263138
iteration#: 204, loss: 427.39601093530655
iteration#: 205, loss: 429.9979004263878
iteration#: 206, loss: 429.1669909954071
iteration#: 207, loss: 427.7530042529106
iteration#: 208, loss: 426.27536940574646
iteration#: 209, loss: 425.8893008828163
iteration#: 210, loss: 425.88283997774124
iteration#: 211, loss: 423.8967629671097
iteration#: 212, loss: 420.21863067150116
iteration#: 213, loss: 423.96309703588486
iteration#: 214, loss: 421.24091386795044
iteration#: 215, loss: 420.3519204258919
iteration#: 216, loss: 419.68327528238297
iteration#: 217, loss: 421.5554976761341
iteration#: 218, loss: 415.1808803677559
iteration#: 219, loss: 419.84583073854446
iteration#: 220, loss: 415.1623503565788
iteration#: 221, loss: 414.99821799993515
iteration#: 222, loss: 412.16440346837044
iteration#: 223, loss: 410.8448459506035
iteration#: 224, loss: 415.0926994085312
iteration#: 225, loss: 412.1634010076523
iteration#: 226, loss: 411.2853391766548
iteration#: 227, loss: 410.84768110513687
iteration#: 228, loss: 411.13572549819946
iteration#: 229, loss: 410.1673155426979
iteration#: 230, loss: 410.35696256160736
iteration#: 231, loss: 408.1554993093014
iteration#: 232, loss: 406.9225237965584
iteration#: 233, loss: 405.65640991926193
iteration#: 234, loss: 401.78463193774223
iteration#: 235, loss: 404.21481999754906
iteration#: 236, loss: 405.28807705640793
iteration#: 237, loss: 403.48543164134026
iteration#: 238, loss: 402.06403678655624
iteration#: 239, loss: 401.0401903092861
iteration#: 240, loss: 402.3869358301163
iteration#: 241, loss: 402.0510524213314
iteration#: 242, loss: 401.1164905130863
iteration#: 243, loss: 398.3261576294899
iteration#: 244, loss: 396.47216391563416
iteration#: 245, loss: 398.89933344721794
iteration#: 246, loss: 396.23440727591515
iteration#: 247, loss: 397.4310474395752
iteration#: 248, loss: 393.6791759133339
iteration#: 249, loss: 396.70723807811737
iteration#: 250, loss: 393.44657322764397
iteration#: 251, loss: 392.97483921051025
iteration#: 252, loss: 393.0045629143715
iteration#: 253, loss: 392.7963542342186
iteration#: 254, loss: 391.05482006073
iteration#: 255, loss: 392.6677203774452
iteration#: 256, loss: 391.8400337398052
iteration#: 257, loss: 389.1419381201267
iteration#: 258, loss: 388.55281189084053
iteration#: 259, loss: 388.8954659104347
iteration#: 260, loss: 386.7999877035618
iteration#: 261, loss: 386.4086154997349
iteration#: 262, loss: 388.0598818063736
iteration#: 263, loss: 386.73850443959236
iteration#: 264, loss: 387.47803193330765
iteration#: 265, loss: 384.2378141283989
iteration#: 266, loss: 384.36729550361633
iteration#: 267, loss: 383.5607122182846
iteration#: 268, loss: 383.3955937922001
iteration#: 269, loss: 381.5712905526161
iteration#: 270, loss: 382.49248188734055
iteration#: 271, loss: 379.7854151725769
iteration#: 272, loss: 379.3772096335888
iteration#: 273, loss: 378.51240092515945
iteration#: 274, loss: 378.57735776901245
iteration#: 275, loss: 377.2166854739189
iteration#: 276, loss: 374.9553499817848
iteration#: 277, loss: 376.84264439344406
iteration#: 278, loss: 374.2142585217953
iteration#: 279, loss: 378.4008546471596
iteration#: 280, loss: 374.4985133111477
iteration#: 281, loss: 369.29878357052803
iteration#: 282, loss: 373.0281850993633
iteration#: 283, loss: 373.4110248684883
iteration#: 284, loss: 373.33459427952766
iteration#: 285, loss: 371.1389313042164
iteration#: 286, loss: 372.0841909945011
iteration#: 287, loss: 370.5276307761669
iteration#: 288, loss: 370.2843883931637
iteration#: 289, loss: 371.3988904058933
iteration#: 290, loss: 368.0029245018959
iteration#: 291, loss: 368.79469591379166
iteration#: 292, loss: 367.9614169895649
iteration#: 293, loss: 366.92587381601334
iteration#: 294, loss: 366.7251242697239
iteration#: 295, loss: 367.8933227658272
iteration#: 296, loss: 366.3390215933323
iteration#: 297, loss: 362.18443191051483
iteration#: 298, loss: 364.63197362422943
iteration#: 299, loss: 362.81093671917915
iteration#: 300, loss: 363.6192865371704
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.09671059199361086
Hit@10 = 0.2151841868823001
MR = 1258.7387940501148
MRR = 0.07342847577981225
Fil setting:
Hit@1 = 0.07906558849955077
Hit@3 = 0.16155036438055306
Hit@10 = 0.24598183088749126
MR = 1236.223145652391
MRR = 0.140234106598075
iteration#: 301, loss: 362.0550481081009
iteration#: 302, loss: 364.1372180879116
iteration#: 303, loss: 362.45387703180313
iteration#: 304, loss: 361.3957024216652
iteration#: 305, loss: 360.99782538414
iteration#: 306, loss: 360.876686245203
iteration#: 307, loss: 359.9561446905136
iteration#: 308, loss: 357.62471064925194
iteration#: 309, loss: 360.19194734096527
iteration#: 310, loss: 359.89724311232567
iteration#: 311, loss: 355.83196645975113
iteration#: 312, loss: 358.18710520863533
iteration#: 313, loss: 356.512462079525
iteration#: 314, loss: 358.3030268251896
iteration#: 315, loss: 353.3265173435211
iteration#: 316, loss: 354.60065484046936
iteration#: 317, loss: 352.78160014748573
iteration#: 318, loss: 352.5755736529827
iteration#: 319, loss: 355.3528196811676
iteration#: 320, loss: 352.07200986146927
iteration#: 321, loss: 353.69762060046196
iteration#: 322, loss: 354.15397933125496
iteration#: 323, loss: 350.88472333550453
iteration#: 324, loss: 351.57082012295723
iteration#: 325, loss: 351.2246691286564
iteration#: 326, loss: 348.99123176932335
iteration#: 327, loss: 349.58010709285736
iteration#: 328, loss: 353.258061170578
iteration#: 329, loss: 346.87196508049965
iteration#: 330, loss: 350.4743128120899
iteration#: 331, loss: 349.20356357097626
iteration#: 332, loss: 346.7297428250313
iteration#: 333, loss: 345.8383474946022
iteration#: 334, loss: 347.1690584719181
iteration#: 335, loss: 346.53756153583527
iteration#: 336, loss: 346.5806316435337
iteration#: 337, loss: 345.21068972349167
iteration#: 338, loss: 343.68646225333214
iteration#: 339, loss: 343.93583634495735
iteration#: 340, loss: 345.3575050830841
iteration#: 341, loss: 343.85235145688057
iteration#: 342, loss: 343.4860292375088
iteration#: 343, loss: 343.4499623775482
iteration#: 344, loss: 341.7234136760235
iteration#: 345, loss: 341.73369267582893
iteration#: 346, loss: 344.4919342696667
iteration#: 347, loss: 343.35405284166336
iteration#: 348, loss: 343.00502666831017
iteration#: 349, loss: 342.5102865099907
iteration#: 350, loss: 341.4928072988987
iteration#: 351, loss: 337.9898720383644
iteration#: 352, loss: 337.6053454875946
iteration#: 353, loss: 339.7706755399704
iteration#: 354, loss: 338.97426277399063
iteration#: 355, loss: 336.7424829900265
iteration#: 356, loss: 335.82852256298065
iteration#: 357, loss: 336.02012223005295
iteration#: 358, loss: 334.1269554197788
iteration#: 359, loss: 332.12921729683876
iteration#: 360, loss: 334.2592325806618
iteration#: 361, loss: 334.79517671465874
iteration#: 362, loss: 333.8806663155556
iteration#: 363, loss: 334.3031278550625
iteration#: 364, loss: 331.8672328591347
iteration#: 365, loss: 335.2798627912998
iteration#: 366, loss: 332.1890047490597
iteration#: 367, loss: 333.75583881139755
iteration#: 368, loss: 328.45755112171173
iteration#: 369, loss: 331.9276827275753
iteration#: 370, loss: 332.5015029013157
iteration#: 371, loss: 328.30278384685516
iteration#: 372, loss: 329.68616184592247
iteration#: 373, loss: 328.5424091219902
iteration#: 374, loss: 329.11688363552094
iteration#: 375, loss: 326.53719261288643
iteration#: 376, loss: 327.95374658703804
iteration#: 377, loss: 327.3341547548771
iteration#: 378, loss: 327.08672401309013
iteration#: 379, loss: 327.16070637106895
iteration#: 380, loss: 326.9876582622528
iteration#: 381, loss: 327.27855533361435
iteration#: 382, loss: 329.8521021306515
iteration#: 383, loss: 326.599995046854
iteration#: 384, loss: 325.15897756814957
iteration#: 385, loss: 323.234727203846
iteration#: 386, loss: 322.22014957666397
iteration#: 387, loss: 324.8478179574013
iteration#: 388, loss: 319.85030576586723
iteration#: 389, loss: 323.1701871752739
iteration#: 390, loss: 324.86871787905693
iteration#: 391, loss: 324.1428101360798
iteration#: 392, loss: 323.5595455467701
iteration#: 393, loss: 321.1294749081135
iteration#: 394, loss: 324.3398912847042
iteration#: 395, loss: 319.75747260451317
iteration#: 396, loss: 318.4849357306957
iteration#: 397, loss: 318.4679958820343
iteration#: 398, loss: 323.5144476592541
iteration#: 399, loss: 320.38373243808746
iteration#: 400, loss: 319.6116570830345
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.10277528202056505
Hit@10 = 0.22748826994110014
MR = 1157.343640810622
MRR = 0.07753826252394752
Fil setting:
Hit@1 = 0.08495557552161326
Hit@3 = 0.17158330837576122
Hit@10 = 0.26185484676050713
MR = 1134.8250474193871
MRR = 0.14961166156042982
iteration#: 401, loss: 318.0460639297962
iteration#: 402, loss: 315.8163110613823
iteration#: 403, loss: 318.72855600714684
iteration#: 404, loss: 320.4333835244179
iteration#: 405, loss: 318.87679782509804
iteration#: 406, loss: 317.2534981071949
iteration#: 407, loss: 316.8394778072834
iteration#: 408, loss: 316.78147026896477
iteration#: 409, loss: 313.67458778619766
iteration#: 410, loss: 314.2118670642376
iteration#: 411, loss: 316.4713217020035
iteration#: 412, loss: 313.25721886754036
iteration#: 413, loss: 315.10892447829247
iteration#: 414, loss: 314.48215278983116
iteration#: 415, loss: 315.7228404581547
iteration#: 416, loss: 312.8907223045826
iteration#: 417, loss: 312.1016100347042
iteration#: 418, loss: 311.62455356121063
iteration#: 419, loss: 311.79775750637054
iteration#: 420, loss: 311.2503015100956
iteration#: 421, loss: 311.02993482351303
iteration#: 422, loss: 311.4002348780632
iteration#: 423, loss: 309.80508148670197
iteration#: 424, loss: 311.5664741694927
iteration#: 425, loss: 308.71182918548584
iteration#: 426, loss: 307.3025166094303
iteration#: 427, loss: 307.1225330233574
iteration#: 428, loss: 308.87857723236084
iteration#: 429, loss: 308.1991158425808
iteration#: 430, loss: 308.757972240448
iteration#: 431, loss: 306.5659838318825
iteration#: 432, loss: 311.0468390285969
iteration#: 433, loss: 309.0148835480213
iteration#: 434, loss: 306.19483268260956
iteration#: 435, loss: 305.36255073547363
iteration#: 436, loss: 306.1929804980755
iteration#: 437, loss: 305.518702596426
iteration#: 438, loss: 304.147862136364
iteration#: 439, loss: 305.09618148207664
iteration#: 440, loss: 306.75750717520714
iteration#: 441, loss: 302.8204023241997
iteration#: 442, loss: 305.74012395739555
iteration#: 443, loss: 303.61883464455605
iteration#: 444, loss: 302.98917031288147
iteration#: 445, loss: 302.02699264883995
iteration#: 446, loss: 302.70231157541275
iteration#: 447, loss: 301.8834336400032
iteration#: 448, loss: 301.6329956948757
iteration#: 449, loss: 302.4336866736412
iteration#: 450, loss: 300.7644668519497
iteration#: 451, loss: 302.11891677975655
iteration#: 452, loss: 297.95350247621536
iteration#: 453, loss: 303.17072370648384
iteration#: 454, loss: 300.9946230351925
iteration#: 455, loss: 300.9241723716259
iteration#: 456, loss: 300.17983412742615
iteration#: 457, loss: 300.8071482181549
iteration#: 458, loss: 300.6240890324116
iteration#: 459, loss: 298.17977717518806
iteration#: 460, loss: 298.32701885700226
iteration#: 461, loss: 296.9834838807583
iteration#: 462, loss: 298.4198579788208
iteration#: 463, loss: 297.92480930685997
iteration#: 464, loss: 299.18625274300575
iteration#: 465, loss: 296.33101910352707
iteration#: 466, loss: 298.1267457604408
iteration#: 467, loss: 298.12392741441727
iteration#: 468, loss: 298.0067714750767
iteration#: 469, loss: 297.1097930967808
iteration#: 470, loss: 296.0025942027569
iteration#: 471, loss: 293.348445802927
iteration#: 472, loss: 296.03564316034317
iteration#: 473, loss: 296.38946399092674
iteration#: 474, loss: 294.4876548945904
iteration#: 475, loss: 295.42666202783585
iteration#: 476, loss: 292.4943344593048
iteration#: 477, loss: 291.8892799913883
iteration#: 478, loss: 289.2284910082817
iteration#: 479, loss: 293.03092634677887
iteration#: 480, loss: 293.0556006729603
iteration#: 481, loss: 294.01857274770737
iteration#: 482, loss: 292.58265921473503
iteration#: 483, loss: 293.62098583579063
iteration#: 484, loss: 290.0496415197849
iteration#: 485, loss: 292.03115767240524
iteration#: 486, loss: 294.07916221022606
iteration#: 487, loss: 292.279886841774
iteration#: 488, loss: 291.943849503994
iteration#: 489, loss: 290.75366324186325
iteration#: 490, loss: 288.9123333990574
iteration#: 491, loss: 289.60179659724236
iteration#: 492, loss: 287.9389854967594
iteration#: 493, loss: 287.99252086877823
iteration#: 494, loss: 290.3257251083851
iteration#: 495, loss: 288.2990952730179
iteration#: 496, loss: 287.81286585330963
iteration#: 497, loss: 287.9914470911026
iteration#: 498, loss: 287.85713693499565
iteration#: 499, loss: 287.39511796832085
iteration#: 500, loss: 285.55172634124756
validation:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.10210142757312568
Hit@10 = 0.23739642607567135
MR = 1106.319981032245
MRR = 0.0801095049367218
Fil setting:
Hit@1 = 0.08108715184186882
Hit@3 = 0.1804682040531097
Hit@10 = 0.2770290506139563
MR = 1083.8091244883697
MRR = 0.15317772736680976
test in iteration 500:
Raw setting:
Hit@1 = 0.0
Hit@3 = 0.11708874664948075
Hit@10 = 0.27180583832554595
MR = 914.2306621944253
MRR = 0.0892902467081437
Fil setting:
Hit@1 = 0.08378128982174506
Hit@3 = 0.20350651062645556
Hit@10 = 0.3043077058280726
MR = 893.4659968069368
MRR = 0.16554024942358908
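
Both runs above use the same log format, so their loss curves can be extracted directly from this file. A minimal parsing sketch, assuming the log has been saved locally (the filename `hype_train.log` is an assumption):

```python
import re

runs, losses = [], []
with open("hype_train.log") as f:  # assumed local path for this log
    for line in f:
        if line.startswith("Starting training"):
            # each "Starting training..." line begins a new run
            losses = []
            runs.append(losses)
        m = re.match(r"iteration#: (\d+), loss: ([\d.]+)", line)
        if m:
            losses.append(float(m.group(2)))

for i, ls in enumerate(runs, 1):
    print(f"run {i}: {len(ls)} iterations, "
          f"first loss {ls[0]:.1f}, last loss {ls[-1]:.1f}")
# run 1: 500 iterations, first loss 1128.6, last loss 21.1
# run 2: 500 iterations, first loss 1568.2, last loss 285.6
```

As the extracted endpoints show, the second run plateaus at a much higher loss (285.6 vs. 21.1 after 500 iterations) and reaches a far lower filtered test MRR (0.166 vs. 0.482), consistent with a different configuration or a harder setup for that run.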