Learning to code a 2 layer Neural Network from scratch

Learning a 2 layer Neural Network

Coding a 2 layer Neural Network for binary classification from scratch

In [1]:
import numpy as np
import matplotlib.pyplot as plt
import numpy.matlib

Dataset creation- data set is centred around two clusters c1 and c2 as their centre points.

In [2]:
c1 = [2,3]
c2 = [10,11]
no = 50
Class1 = np.matlib.repmat(c1, no,1) + np.random.randn(no,len(c1))
Class2 = np.matlib.repmat(c2, no,1) + np.random.randn(no,len(c2))
Data = np.append(Class1,Class2,axis = 0)
Trainlabel  = np.append(np.zeros((no,1)),np.ones((no,1)),axis = 0)

Plotting the data

In [3]:
import matplotlib.pyplot as plt
plt.plot(Class1[:,0],Class1[:,1],'ro')
plt.plot(Class2[:,0],Class2[:,1],'bo')
plt.ylabel('Data')
plt.show()
In [4]:
m = Data.shape[0]
X = Data.T
y = Trainlabel.T

n_i => number of nodes in input layer(here its 2, since each data point is an element of R^2 )

n_h => number of neurons in the hidden layer

n_l => number of neurons in the last layer.

A 2 layer neural network means it has 1 hidden layer and 1 output layer. Since it is a binary classification we have used only 1 neuron in the output layer.

In [17]:
n_i = 2
n_h = 2
n_l = 1
learningrate = 0.005
numiter = 5000
l = [n_i,n_h,n_l]

W1 = np.random.randn(n_h,n_i)*0.01
b1 = np.random.randn(n_h,1)*0.01
W2 = np.random.randn(n_l,n_h)*0.01
b2 = np.random.randn(n_l,1)                  
print X.shape
(2, 100)

Activation Functions

In [6]:
def sigmoid(x,play):
    z = 1/(1+np.exp(-x))
    if (play == "forward"):
        return z
    elif (play =="backward"):
        return z*(1-z)
In [7]:
def relu(x,play):
    if (play=="forward"):
        return np.maximum(x,0)
    elif (play=="backward"):
        x[x<=0] = 0     
        x[x>0] = 1     
        return x

Main part of our program. Here we are minimizing our cost function with respect to weights and bias. And we choose that weights and bias for which our cost function is minimum. Of course this depends on many factors. They are number of epochs, number of neurons in hidden layer, learning rate. These are called hyperparameters and a optimum value for these hyperparameters can be obtained by hyperparameter tuning.

In [18]:
for i in range(numiter):
    Z1 = np.dot(W1,X) + b1
    A1 = relu(Z1,play = "forward")
    Z2 = np.dot(W2,A1) + b2
    A2 = sigmoid(Z2,play = "forward")
   
    L = (1.0/m) * -np.sum(np.multiply(y,np.log(A2)) + np.multiply(1-y,np.log(1-A2)))
    
    dZ2 = A2 - y
    dZ1 = np.multiply(np.dot(W2.T,dZ2),relu(A1,play="backward"))
    dW2 = (1.0/m) * np.dot(dZ2,A1.T)
    db2 = (1.0/m) * np.sum(dZ2,axis = 1,keepdims = True)
    dW1 = (1.0/m) * np.dot(dZ1,X.T)
    db1 = (1.0/m) * np.sum(dZ1,axis = 1,keepdims = True)
    W1 = W1 - learningrate * dW1
    b1 = b1 - learningrate * db1
    W2 = W2 - learningrate * dW2
    b2 = b2 - learningrate* db2
    print "Loss for ",i,"th iteration =>",L
Loss for  0 th iteration => 0.697485157779
Loss for  1 th iteration => 0.697471738571
Loss for  2 th iteration => 0.697457847441
Loss for  3 th iteration => 0.697443798326
Loss for  4 th iteration => 0.697429740669
Loss for  5 th iteration => 0.697415616951
Loss for  6 th iteration => 0.697401206116
Loss for  7 th iteration => 0.697386879393
Loss for  8 th iteration => 0.697372700416
Loss for  9 th iteration => 0.697358632697
Loss for  10 th iteration => 0.697344694126
Loss for  11 th iteration => 0.697330932116
Loss for  12 th iteration => 0.697317203767
Loss for  13 th iteration => 0.697303508699
Loss for  14 th iteration => 0.697289861228
Loss for  15 th iteration => 0.697276455872
Loss for  16 th iteration => 0.697263308357
Loss for  17 th iteration => 0.697250243231
Loss for  18 th iteration => 0.697237267493
Loss for  19 th iteration => 0.697224607526
Loss for  20 th iteration => 0.697212051393
Loss for  21 th iteration => 0.697199598038
Loss for  22 th iteration => 0.697187250592
Loss for  23 th iteration => 0.697175268093
Loss for  24 th iteration => 0.697163407226
Loss for  25 th iteration => 0.697151670494
Loss for  26 th iteration => 0.6971403185
Loss for  27 th iteration => 0.697129104909
Loss for  28 th iteration => 0.697118027609
Loss for  29 th iteration => 0.697107084497
Loss for  30 th iteration => 0.697096273476
Loss for  31 th iteration => 0.697085601982
Loss for  32 th iteration => 0.697075372284
Loss for  33 th iteration => 0.697065286427
Loss for  34 th iteration => 0.697055341633
Loss for  35 th iteration => 0.697045535135
Loss for  36 th iteration => 0.697035864173
Loss for  37 th iteration => 0.697026329686
Loss for  38 th iteration => 0.697017263626
Loss for  39 th iteration => 0.697008340988
Loss for  40 th iteration => 0.696999558202
Loss for  41 th iteration => 0.696990912926
Loss for  42 th iteration => 0.696982764196
Loss for  43 th iteration => 0.696974758294
Loss for  44 th iteration => 0.696966890741
Loss for  45 th iteration => 0.69695915707
Loss for  46 th iteration => 0.696951552825
Loss for  47 th iteration => 0.696944073561
Loss for  48 th iteration => 0.696936714846
Loss for  49 th iteration => 0.696929472258
Loss for  50 th iteration => 0.696922341388
Loss for  51 th iteration => 0.696915317835
Loss for  52 th iteration => 0.696908397215
Loss for  53 th iteration => 0.696901575151
Loss for  54 th iteration => 0.69689484728
Loss for  55 th iteration => 0.69688820925
Loss for  56 th iteration => 0.696881656721
Loss for  57 th iteration => 0.696875185363
Loss for  58 th iteration => 0.696868790861
Loss for  59 th iteration => 0.696862468908
Loss for  60 th iteration => 0.696856215212
Loss for  61 th iteration => 0.69685002549
Loss for  62 th iteration => 0.696843895472
Loss for  63 th iteration => 0.6968378209
Loss for  64 th iteration => 0.696831797527
Loss for  65 th iteration => 0.696825821119
Loss for  66 th iteration => 0.696819887452
Loss for  67 th iteration => 0.696813992315
Loss for  68 th iteration => 0.696808131508
Loss for  69 th iteration => 0.696802300845
Loss for  70 th iteration => 0.696796496147
Loss for  71 th iteration => 0.696790713253
Loss for  72 th iteration => 0.696784948008
Loss for  73 th iteration => 0.696779196273
Loss for  74 th iteration => 0.696773453918
Loss for  75 th iteration => 0.696767716826
Loss for  76 th iteration => 0.696761980893
Loss for  77 th iteration => 0.696756242024
Loss for  78 th iteration => 0.696750496137
Loss for  79 th iteration => 0.696744739162
Loss for  80 th iteration => 0.696738967041
Loss for  81 th iteration => 0.696733175728
Loss for  82 th iteration => 0.696727361186
Loss for  83 th iteration => 0.696721519392
Loss for  84 th iteration => 0.696715646336
Loss for  85 th iteration => 0.696709738016
Loss for  86 th iteration => 0.696703790445
Loss for  87 th iteration => 0.696697799645
Loss for  88 th iteration => 0.696691761652
Loss for  89 th iteration => 0.69668567251
Loss for  90 th iteration => 0.696679528279
Loss for  91 th iteration => 0.696673325028
Loss for  92 th iteration => 0.696667058837
Loss for  93 th iteration => 0.696660725798
Loss for  94 th iteration => 0.696654322016
Loss for  95 th iteration => 0.696647843605
Loss for  96 th iteration => 0.696641286692
Loss for  97 th iteration => 0.696634647414
Loss for  98 th iteration => 0.696627921921
Loss for  99 th iteration => 0.696621106373
Loss for  100 th iteration => 0.696614196942
Loss for  101 th iteration => 0.696607187372
Loss for  102 th iteration => 0.69659972319
Loss for  103 th iteration => 0.696592168978
Loss for  104 th iteration => 0.69658452188
Loss for  105 th iteration => 0.696576779049
Loss for  106 th iteration => 0.696568932417
Loss for  107 th iteration => 0.696560664265
Loss for  108 th iteration => 0.696552309882
Loss for  109 th iteration => 0.696543867275
Loss for  110 th iteration => 0.696535334459
Loss for  111 th iteration => 0.696526709459
Loss for  112 th iteration => 0.696517978102
Loss for  113 th iteration => 0.696508857644
Loss for  114 th iteration => 0.696499658813
Loss for  115 th iteration => 0.696490380341
Loss for  116 th iteration => 0.696481020972
Loss for  117 th iteration => 0.696471579455
Loss for  118 th iteration => 0.696462045878
Loss for  119 th iteration => 0.69645218823
Loss for  120 th iteration => 0.696442265752
Loss for  121 th iteration => 0.696432028632
Loss for  122 th iteration => 0.696421749885
Loss for  123 th iteration => 0.696411429274
Loss for  124 th iteration => 0.696401066567
Loss for  125 th iteration => 0.696390641935
Loss for  126 th iteration => 0.696379977333
Loss for  127 th iteration => 0.696369294362
Loss for  128 th iteration => 0.696358581994
Loss for  129 th iteration => 0.696347665087
Loss for  130 th iteration => 0.696336583332
Loss for  131 th iteration => 0.696325535096
Loss for  132 th iteration => 0.696314520591
Loss for  133 th iteration => 0.69630352078
Loss for  134 th iteration => 0.696292427213
Loss for  135 th iteration => 0.696281260928
Loss for  136 th iteration => 0.696270054622
Loss for  137 th iteration => 0.696258776981
Loss for  138 th iteration => 0.696247510869
Loss for  139 th iteration => 0.696236603858
Loss for  140 th iteration => 0.696225791431
Loss for  141 th iteration => 0.696215047968
Loss for  142 th iteration => 0.696204648801
Loss for  143 th iteration => 0.696194786131
Loss for  144 th iteration => 0.696185267367
Loss for  145 th iteration => 0.696176049396
Loss for  146 th iteration => 0.696167207418
Loss for  147 th iteration => 0.696158536359
Loss for  148 th iteration => 0.696150085535
Loss for  149 th iteration => 0.69614192609
Loss for  150 th iteration => 0.696133977136
Loss for  151 th iteration => 0.696126107215
Loss for  152 th iteration => 0.696118309043
Loss for  153 th iteration => 0.696110563415
Loss for  154 th iteration => 0.696102870665
Loss for  155 th iteration => 0.696095222567
Loss for  156 th iteration => 0.696087595043
Loss for  157 th iteration => 0.696079988102
Loss for  158 th iteration => 0.696072411125
Loss for  159 th iteration => 0.696064849781
Loss for  160 th iteration => 0.696057319562
Loss for  161 th iteration => 0.696049817386
Loss for  162 th iteration => 0.696042356087
Loss for  163 th iteration => 0.69603493056
Loss for  164 th iteration => 0.696027519895
Loss for  165 th iteration => 0.696020122703
Loss for  166 th iteration => 0.696012738802
Loss for  167 th iteration => 0.696005368015
Loss for  168 th iteration => 0.695998019419
Loss for  169 th iteration => 0.695990685638
Loss for  170 th iteration => 0.695983365686
Loss for  171 th iteration => 0.695976063575
Loss for  172 th iteration => 0.695968782159
Loss for  173 th iteration => 0.695961515485
Loss for  174 th iteration => 0.695954263437
Loss for  175 th iteration => 0.695947025894
Loss for  176 th iteration => 0.69593980274
Loss for  177 th iteration => 0.695932602368
Loss for  178 th iteration => 0.695925415545
Loss for  179 th iteration => 0.695918243801
Loss for  180 th iteration => 0.695911087042
Loss for  181 th iteration => 0.695903945173
Loss for  182 th iteration => 0.6958968181
Loss for  183 th iteration => 0.695889705731
Loss for  184 th iteration => 0.69588260797
Loss for  185 th iteration => 0.695875524726
Loss for  186 th iteration => 0.695868455903
Loss for  187 th iteration => 0.695861401408
Loss for  188 th iteration => 0.695854361148
Loss for  189 th iteration => 0.69584733503
Loss for  190 th iteration => 0.69584032296
Loss for  191 th iteration => 0.695833324845
Loss for  192 th iteration => 0.695826340592
Loss for  193 th iteration => 0.695819370107
Loss for  194 th iteration => 0.695812414816
Loss for  195 th iteration => 0.695805477512
Loss for  196 th iteration => 0.695798554393
Loss for  197 th iteration => 0.695791645379
Loss for  198 th iteration => 0.69578475039
Loss for  199 th iteration => 0.695777869345
Loss for  200 th iteration => 0.695771002164
Loss for  201 th iteration => 0.695764148768
Loss for  202 th iteration => 0.695757309076
Loss for  203 th iteration => 0.695750483009
Loss for  204 th iteration => 0.695743670486
Loss for  205 th iteration => 0.695736871429
Loss for  206 th iteration => 0.695730085758
Loss for  207 th iteration => 0.695723313394
Loss for  208 th iteration => 0.695716554256
Loss for  209 th iteration => 0.695709808268
Loss for  210 th iteration => 0.695703075348
Loss for  211 th iteration => 0.695696355419
Loss for  212 th iteration => 0.695689648402
Loss for  213 th iteration => 0.695682954218
Loss for  214 th iteration => 0.695676272788
Loss for  215 th iteration => 0.695669604034
Loss for  216 th iteration => 0.695662947878
Loss for  217 th iteration => 0.695656304241
Loss for  218 th iteration => 0.695649673046
Loss for  219 th iteration => 0.695643054214
Loss for  220 th iteration => 0.695636447667
Loss for  221 th iteration => 0.695629853328
Loss for  222 th iteration => 0.695623271119
Loss for  223 th iteration => 0.695616700963
Loss for  224 th iteration => 0.695610142782
Loss for  225 th iteration => 0.695603596498
Loss for  226 th iteration => 0.695597062034
Loss for  227 th iteration => 0.695590537121
Loss for  228 th iteration => 0.695583998657
Loss for  229 th iteration => 0.695577471459
Loss for  230 th iteration => 0.69557095545
Loss for  231 th iteration => 0.69556445797
Loss for  232 th iteration => 0.695557956847
Loss for  233 th iteration => 0.695551474041
Loss for  234 th iteration => 0.695545002529
Loss for  235 th iteration => 0.695538541349
Loss for  236 th iteration => 0.695532094452
Loss for  237 th iteration => 0.695525652269
Loss for  238 th iteration => 0.695519223046
Loss for  239 th iteration => 0.695512794159
Loss for  240 th iteration => 0.69550632456
Loss for  241 th iteration => 0.695499870424
Loss for  242 th iteration => 0.69549343825
Loss for  243 th iteration => 0.695487058883
Loss for  244 th iteration => 0.695480761893
Loss for  245 th iteration => 0.695474470984
Loss for  246 th iteration => 0.69546819155
Loss for  247 th iteration => 0.695461932125
Loss for  248 th iteration => 0.695455668054
Loss for  249 th iteration => 0.695449420327
Loss for  250 th iteration => 0.695443186812
Loss for  251 th iteration => 0.695436959523
Loss for  252 th iteration => 0.695430749255
Loss for  253 th iteration => 0.695424540512
Loss for  254 th iteration => 0.695418343612
Loss for  255 th iteration => 0.695412171059
Loss for  256 th iteration => 0.695405985538
Loss for  257 th iteration => 0.695399822703
Loss for  258 th iteration => 0.695393667427
Loss for  259 th iteration => 0.695387519727
Loss for  260 th iteration => 0.695381394766
Loss for  261 th iteration => 0.695375260408
Loss for  262 th iteration => 0.695369143887
Loss for  263 th iteration => 0.695363039632
Loss for  264 th iteration => 0.695356939309
Loss for  265 th iteration => 0.695350859624
Loss for  266 th iteration => 0.695344738245
Loss for  267 th iteration => 0.695338631252
Loss for  268 th iteration => 0.695332539136
Loss for  269 th iteration => 0.695326447821
Loss for  270 th iteration => 0.695320378576
Loss for  271 th iteration => 0.695314301295
Loss for  272 th iteration => 0.695308237453
Loss for  273 th iteration => 0.695302189185
Loss for  274 th iteration => 0.69529613998
Loss for  275 th iteration => 0.695290114586
Loss for  276 th iteration => 0.695284078322
Loss for  277 th iteration => 0.695278057231
Loss for  278 th iteration => 0.695272049672
Loss for  279 th iteration => 0.695266040764
Loss for  280 th iteration => 0.695260059024
Loss for  281 th iteration => 0.695254060452
Loss for  282 th iteration => 0.695248081991
Loss for  283 th iteration => 0.695242111751
Loss for  284 th iteration => 0.695236141336
Loss for  285 th iteration => 0.695230203341
Loss for  286 th iteration => 0.695224229443
Loss for  287 th iteration => 0.695218256206
Loss for  288 th iteration => 0.695212260515
Loss for  289 th iteration => 0.695206270865
Loss for  290 th iteration => 0.695200287172
Loss for  291 th iteration => 0.695194309355
Loss for  292 th iteration => 0.695188337333
Loss for  293 th iteration => 0.695182371023
Loss for  294 th iteration => 0.695176410344
Loss for  295 th iteration => 0.695170455214
Loss for  296 th iteration => 0.695164521196
Loss for  297 th iteration => 0.69515864469
Loss for  298 th iteration => 0.695152774649
Loss for  299 th iteration => 0.695146907816
Loss for  300 th iteration => 0.695141048634
Loss for  301 th iteration => 0.695135195049
Loss for  302 th iteration => 0.695129346981
Loss for  303 th iteration => 0.695123501535
Loss for  304 th iteration => 0.69511761234
Loss for  305 th iteration => 0.695111727364
Loss for  306 th iteration => 0.69510584652
Loss for  307 th iteration => 0.695099971354
Loss for  308 th iteration => 0.695094096192
Loss for  309 th iteration => 0.695088227281
Loss for  310 th iteration => 0.695082362157
Loss for  311 th iteration => 0.695076500732
Loss for  312 th iteration => 0.695070642922
Loss for  313 th iteration => 0.695064788638
Loss for  314 th iteration => 0.695058937798
Loss for  315 th iteration => 0.695053090313
Loss for  316 th iteration => 0.695047250786
Loss for  317 th iteration => 0.695041423139
Loss for  318 th iteration => 0.695035625295
Loss for  319 th iteration => 0.695029790821
Loss for  320 th iteration => 0.695023992308
Loss for  321 th iteration => 0.695018190448
Loss for  322 th iteration => 0.695012364746
Loss for  323 th iteration => 0.695006547483
Loss for  324 th iteration => 0.695000687449
Loss for  325 th iteration => 0.694994852006
Loss for  326 th iteration => 0.694989039003
Loss for  327 th iteration => 0.694983176629
Loss for  328 th iteration => 0.69497734369
Loss for  329 th iteration => 0.694971517436
Loss for  330 th iteration => 0.694965679104
Loss for  331 th iteration => 0.694959860393
Loss for  332 th iteration => 0.694954028402
Loss for  333 th iteration => 0.694948199933
Loss for  334 th iteration => 0.694942390808
Loss for  335 th iteration => 0.694936550551
Loss for  336 th iteration => 0.694930717568
Loss for  337 th iteration => 0.694924930301
Loss for  338 th iteration => 0.694919044034
Loss for  339 th iteration => 0.694913174655
Loss for  340 th iteration => 0.694907304777
Loss for  341 th iteration => 0.694901439367
Loss for  342 th iteration => 0.694895567036
Loss for  343 th iteration => 0.694889680162
Loss for  344 th iteration => 0.694883807326
Loss for  345 th iteration => 0.694877933491
Loss for  346 th iteration => 0.694872058558
Loss for  347 th iteration => 0.694866182428
Loss for  348 th iteration => 0.694860305
Loss for  349 th iteration => 0.694854431181
Loss for  350 th iteration => 0.694848548388
Loss for  351 th iteration => 0.694842666548
Loss for  352 th iteration => 0.694836783015
Loss for  353 th iteration => 0.694830897691
Loss for  354 th iteration => 0.694825018411
Loss for  355 th iteration => 0.694819109623
Loss for  356 th iteration => 0.694813218331
Loss for  357 th iteration => 0.694807331628
Loss for  358 th iteration => 0.694801432312
Loss for  359 th iteration => 0.694795534241
Loss for  360 th iteration => 0.69478963369
Loss for  361 th iteration => 0.694783730561
Loss for  362 th iteration => 0.694777824756
Loss for  363 th iteration => 0.694771916177
Loss for  364 th iteration => 0.694766009926
Loss for  365 th iteration => 0.694760080047
Loss for  366 th iteration => 0.694754048876
Loss for  367 th iteration => 0.694747785479
Loss for  368 th iteration => 0.694741512859
Loss for  369 th iteration => 0.69473523089
Loss for  370 th iteration => 0.694728885772
Loss for  371 th iteration => 0.694722412147
Loss for  372 th iteration => 0.694715728026
Loss for  373 th iteration => 0.694709039159
Loss for  374 th iteration => 0.694702682603
Loss for  375 th iteration => 0.694696314637
Loss for  376 th iteration => 0.694689935113
Loss for  377 th iteration => 0.694683543882
Loss for  378 th iteration => 0.694677140796
Loss for  379 th iteration => 0.694670725708
Loss for  380 th iteration => 0.694664289905
Loss for  381 th iteration => 0.694657694683
Loss for  382 th iteration => 0.694650917399
Loss for  383 th iteration => 0.69464387604
Loss for  384 th iteration => 0.694637077754
Loss for  385 th iteration => 0.694630169779
Loss for  386 th iteration => 0.694623237365
Loss for  387 th iteration => 0.694616261054
Loss for  388 th iteration => 0.694609311249
Loss for  389 th iteration => 0.694602259886
Loss for  390 th iteration => 0.694595285544
Loss for  391 th iteration => 0.694588181643
Loss for  392 th iteration => 0.694581198928
Loss for  393 th iteration => 0.694574099819
Loss for  394 th iteration => 0.694566919983
Loss for  395 th iteration => 0.694559729298
Loss for  396 th iteration => 0.694552495702
Loss for  397 th iteration => 0.69454515153
Loss for  398 th iteration => 0.694537798872
Loss for  399 th iteration => 0.694530424769
Loss for  400 th iteration => 0.694523062711
Loss for  401 th iteration => 0.694515623397
Loss for  402 th iteration => 0.694508167027
Loss for  403 th iteration => 0.694500709466
Loss for  404 th iteration => 0.694493195587
Loss for  405 th iteration => 0.69448563696
Loss for  406 th iteration => 0.69447804765
Loss for  407 th iteration => 0.69447052379
Loss for  408 th iteration => 0.694462842758
Loss for  409 th iteration => 0.694455171673
Loss for  410 th iteration => 0.694447496213
Loss for  411 th iteration => 0.694439780444
Loss for  412 th iteration => 0.694432003115
Loss for  413 th iteration => 0.694424193262
Loss for  414 th iteration => 0.694416462311
Loss for  415 th iteration => 0.694408552318
Loss for  416 th iteration => 0.694400655451
Loss for  417 th iteration => 0.694392775949
Loss for  418 th iteration => 0.694384818314
Loss for  419 th iteration => 0.694376809588
Loss for  420 th iteration => 0.694368786661
Loss for  421 th iteration => 0.694360776061
Loss for  422 th iteration => 0.694352664373
Loss for  423 th iteration => 0.694344530812
Loss for  424 th iteration => 0.694336469761
Loss for  425 th iteration => 0.694328221635
Loss for  426 th iteration => 0.694319968855
Loss for  427 th iteration => 0.694311771429
Loss for  428 th iteration => 0.694303454078
Loss for  429 th iteration => 0.694295092943
Loss for  430 th iteration => 0.694286775145
Loss for  431 th iteration => 0.694278375573
Loss for  432 th iteration => 0.694269903877
Loss for  433 th iteration => 0.694261434314
Loss for  434 th iteration => 0.694252958068
Loss for  435 th iteration => 0.69424437341
Loss for  436 th iteration => 0.694235772895
Loss for  437 th iteration => 0.694227216919
Loss for  438 th iteration => 0.694218517102
Loss for  439 th iteration => 0.694209788368
Loss for  440 th iteration => 0.694201122647
Loss for  441 th iteration => 0.694192305276
Loss for  442 th iteration => 0.694183453032
Loss for  443 th iteration => 0.694174687727
Loss for  444 th iteration => 0.694165755013
Loss for  445 th iteration => 0.694156784144
Loss for  446 th iteration => 0.694147894377
Loss for  447 th iteration => 0.69413883513
Loss for  448 th iteration => 0.694129750853
Loss for  449 th iteration => 0.694120734748
Loss for  450 th iteration => 0.694111568587
Loss for  451 th iteration => 0.694102372574
Loss for  452 th iteration => 0.694093220009
Loss for  453 th iteration => 0.694083909749
Loss for  454 th iteration => 0.694074616929
Loss for  455 th iteration => 0.694065317025
Loss for  456 th iteration => 0.694055882047
Loss for  457 th iteration => 0.694046504948
Loss for  458 th iteration => 0.694036963916
Loss for  459 th iteration => 0.694027245651
Loss for  460 th iteration => 0.694017480084
Loss for  461 th iteration => 0.694007803804
Loss for  462 th iteration => 0.693997896502
Loss for  463 th iteration => 0.693988006652
Loss for  464 th iteration => 0.693978047414
Loss for  465 th iteration => 0.693968152012
Loss for  466 th iteration => 0.693958104949
Loss for  467 th iteration => 0.693948262182
Loss for  468 th iteration => 0.693937865163
Loss for  469 th iteration => 0.693927516644
Loss for  470 th iteration => 0.693917117247
Loss for  471 th iteration => 0.693906656864
Loss for  472 th iteration => 0.693896143704
Loss for  473 th iteration => 0.693885577487
Loss for  474 th iteration => 0.693874957936
Loss for  475 th iteration => 0.693864284775
Loss for  476 th iteration => 0.693853583329
Loss for  477 th iteration => 0.693842793336
Loss for  478 th iteration => 0.693831957795
Loss for  479 th iteration => 0.693821094744
Loss for  480 th iteration => 0.693810480869
Loss for  481 th iteration => 0.69379913956
Loss for  482 th iteration => 0.693788101658
Loss for  483 th iteration => 0.693777007049
Loss for  484 th iteration => 0.693766229296
Loss for  485 th iteration => 0.693754502403
Loss for  486 th iteration => 0.693743763941
Loss for  487 th iteration => 0.693731882097
Loss for  488 th iteration => 0.693720600557
Loss for  489 th iteration => 0.693708879765
Loss for  490 th iteration => 0.693696914395
Loss for  491 th iteration => 0.693684862343
Loss for  492 th iteration => 0.693673645554
Loss for  493 th iteration => 0.693661164866
Loss for  494 th iteration => 0.693648942088
Loss for  495 th iteration => 0.693636852422
Loss for  496 th iteration => 0.693624392173
Loss for  497 th iteration => 0.693611831579
Loss for  498 th iteration => 0.693599200418
Loss for  499 th iteration => 0.69358649835
Loss for  500 th iteration => 0.693573726801
Loss for  501 th iteration => 0.693560900489
Loss for  502 th iteration => 0.693547983781
Loss for  503 th iteration => 0.693534994807
Loss for  504 th iteration => 0.693521933228
Loss for  505 th iteration => 0.693508798707
Loss for  506 th iteration => 0.693495590904
Loss for  507 th iteration => 0.693482426287
Loss for  508 th iteration => 0.693469013025
Loss for  509 th iteration => 0.693455585466
Loss for  510 th iteration => 0.693442083276
Loss for  511 th iteration => 0.693428506117
Loss for  512 th iteration => 0.693414853652
Loss for  513 th iteration => 0.693401157626
Loss for  514 th iteration => 0.693387436759
Loss for  515 th iteration => 0.693373516399
Loss for  516 th iteration => 0.693359561634
Loss for  517 th iteration => 0.69334552988
Loss for  518 th iteration => 0.693331420802
Loss for  519 th iteration => 0.693317267553
Loss for  520 th iteration => 0.693302999748
Loss for  521 th iteration => 0.693288717172
Loss for  522 th iteration => 0.693274297887
Loss for  523 th iteration => 0.693259799602
Loss for  524 th iteration => 0.693245222498
Loss for  525 th iteration => 0.69323058886
Loss for  526 th iteration => 0.693215851696
Loss for  527 th iteration => 0.693201224866
Loss for  528 th iteration => 0.693186221639
Loss for  529 th iteration => 0.693171244952
Loss for  530 th iteration => 0.693156191695
Loss for  531 th iteration => 0.693141072392
Loss for  532 th iteration => 0.693125878552
Loss for  533 th iteration => 0.693110101801
Loss for  534 th iteration => 0.693093892866
Loss for  535 th iteration => 0.69307744788
Loss for  536 th iteration => 0.693061314553
Loss for  537 th iteration => 0.693044577957
Loss for  538 th iteration => 0.693028087649
Loss for  539 th iteration => 0.693011296473
Loss for  540 th iteration => 0.69299446315
Loss for  541 th iteration => 0.692977439218
Loss for  542 th iteration => 0.692960113186
Loss for  543 th iteration => 0.692942671852
Loss for  544 th iteration => 0.692925266378
Loss for  545 th iteration => 0.692907536514
Loss for  546 th iteration => 0.692889773138
Loss for  547 th iteration => 0.692871903438
Loss for  548 th iteration => 0.692853956069
Loss for  549 th iteration => 0.69283608333
Loss for  550 th iteration => 0.692817774445
Loss for  551 th iteration => 0.692799477794
Loss for  552 th iteration => 0.692781072649
Loss for  553 th iteration => 0.692762636873
Loss for  554 th iteration => 0.692744083564
Loss for  555 th iteration => 0.692725340964
Loss for  556 th iteration => 0.692706500293
Loss for  557 th iteration => 0.692687548964
Loss for  558 th iteration => 0.692668738669
Loss for  559 th iteration => 0.692649449456
Loss for  560 th iteration => 0.692630183406
Loss for  561 th iteration => 0.692610788027
Loss for  562 th iteration => 0.692591478172
Loss for  563 th iteration => 0.692571796582
Loss for  564 th iteration => 0.692552064312
Loss for  565 th iteration => 0.692532247084
Loss for  566 th iteration => 0.692512471921
Loss for  567 th iteration => 0.692492365955
Loss for  568 th iteration => 0.692472177755
Loss for  569 th iteration => 0.692451873727
Loss for  570 th iteration => 0.692431666812
Loss for  571 th iteration => 0.692411117885
Loss for  572 th iteration => 0.692390466972
Loss for  573 th iteration => 0.692369698518
Loss for  574 th iteration => 0.692349094796
Loss for  575 th iteration => 0.692327992849
Loss for  576 th iteration => 0.692306894976
Loss for  577 th iteration => 0.692285693065
Loss for  578 th iteration => 0.692264505785
Loss for  579 th iteration => 0.692243029515
Loss for  580 th iteration => 0.692221432713
Loss for  581 th iteration => 0.692199898175
Loss for  582 th iteration => 0.692178137794
Loss for  583 th iteration => 0.692156169463
Loss for  584 th iteration => 0.692134089663
Loss for  585 th iteration => 0.692112279767
Loss for  586 th iteration => 0.69208979675
Loss for  587 th iteration => 0.692067372941
Loss for  588 th iteration => 0.692045035101
Loss for  589 th iteration => 0.692022373222
Loss for  590 th iteration => 0.691999557921
Loss for  591 th iteration => 0.69197668562
Loss for  592 th iteration => 0.691953817042
Loss for  593 th iteration => 0.691930677144
Loss for  594 th iteration => 0.69190734975
Loss for  595 th iteration => 0.691884445369
Loss for  596 th iteration => 0.69186101047
Loss for  597 th iteration => 0.691837321286
Loss for  598 th iteration => 0.69181353179
Loss for  599 th iteration => 0.691789595743
Loss for  600 th iteration => 0.69176552096
Loss for  601 th iteration => 0.69174138407
Loss for  602 th iteration => 0.691717302573
Loss for  603 th iteration => 0.691692849525
Loss for  604 th iteration => 0.691668328554
Loss for  605 th iteration => 0.691643904026
Loss for  606 th iteration => 0.691619053689
Loss for  607 th iteration => 0.691594114724
Loss for  608 th iteration => 0.691569311933
Loss for  609 th iteration => 0.691544109528
Loss for  610 th iteration => 0.691518848804
Loss for  611 th iteration => 0.691494087655
Loss for  612 th iteration => 0.691468451373
Loss for  613 th iteration => 0.691442680642
Loss for  614 th iteration => 0.691416798845
Loss for  615 th iteration => 0.691390775267
Loss for  616 th iteration => 0.691364746202
Loss for  617 th iteration => 0.69133923903
Loss for  618 th iteration => 0.691312798286
Loss for  619 th iteration => 0.691286227813
Loss for  620 th iteration => 0.691259547714
Loss for  621 th iteration => 0.691232695125
Loss for  622 th iteration => 0.691206193731
Loss for  623 th iteration => 0.691179617231
Loss for  624 th iteration => 0.69115235717
Loss for  625 th iteration => 0.691125005911
Loss for  626 th iteration => 0.691097461847
Loss for  627 th iteration => 0.691069782553
Loss for  628 th iteration => 0.691043131965
Loss for  629 th iteration => 0.691015053095
Loss for  630 th iteration => 0.690986794612
Loss for  631 th iteration => 0.690958370867
Loss for  632 th iteration => 0.690929795975
Loss for  633 th iteration => 0.690900975612
Loss for  634 th iteration => 0.69087142001
Loss for  635 th iteration => 0.69084174167
Loss for  636 th iteration => 0.690811895162
Loss for  637 th iteration => 0.690781884037
Loss for  638 th iteration => 0.690751727209
Loss for  639 th iteration => 0.690721424315
Loss for  640 th iteration => 0.690690974994
Loss for  641 th iteration => 0.690660378884
Loss for  642 th iteration => 0.690630045899
Loss for  643 th iteration => 0.690599488
Loss for  644 th iteration => 0.690568438895
Loss for  645 th iteration => 0.690537241542
Loss for  646 th iteration => 0.690505895579
Loss for  647 th iteration => 0.690474852338
Loss for  648 th iteration => 0.690443493404
Loss for  649 th iteration => 0.690411688511
Loss for  650 th iteration => 0.690379733556
Loss for  651 th iteration => 0.690347628175
Loss for  652 th iteration => 0.690315724486
Loss for  653 th iteration => 0.690283694533
Loss for  654 th iteration => 0.6902511243
Loss for  655 th iteration => 0.690218402194
Loss for  656 th iteration => 0.690185527851
Loss for  657 th iteration => 0.690152604852
Loss for  658 th iteration => 0.690120042633
Loss for  659 th iteration => 0.690086697529
Loss for  660 th iteration => 0.690053198744
Loss for  661 th iteration => 0.690019545917
Loss for  662 th iteration => 0.689985738683
Loss for  663 th iteration => 0.689951893551
Loss for  664 th iteration => 0.689918367919
Loss for  665 th iteration => 0.689884082853
Loss for  666 th iteration => 0.689849641942
Loss for  667 th iteration => 0.689815044823
Loss for  668 th iteration => 0.689780291134
Loss for  669 th iteration => 0.68974538051
Loss for  670 th iteration => 0.689710603638
Loss for  671 th iteration => 0.689675776037
Loss for  672 th iteration => 0.68964037935
Loss for  673 th iteration => 0.689604824297
Loss for  674 th iteration => 0.689569110513
Loss for  675 th iteration => 0.689533237634
Loss for  676 th iteration => 0.689497205298
Loss for  677 th iteration => 0.68946115492
Loss for  678 th iteration => 0.689425328197
Loss for  679 th iteration => 0.689388801595
Loss for  680 th iteration => 0.689352114108
Loss for  681 th iteration => 0.689315265374
Loss for  682 th iteration => 0.689278255028
Loss for  683 th iteration => 0.689241082705
Loss for  684 th iteration => 0.689203748043
Loss for  685 th iteration => 0.689166250675
Loss for  686 th iteration => 0.689128858258
Loss for  687 th iteration => 0.689091398982
Loss for  688 th iteration => 0.689053396831
Loss for  689 th iteration => 0.689015230559
Loss for  690 th iteration => 0.688976899801
Loss for  691 th iteration => 0.688938404194
Loss for  692 th iteration => 0.688899743371
Loss for  693 th iteration => 0.688860916968
Loss for  694 th iteration => 0.688821924621
Loss for  695 th iteration => 0.688782765963
Loss for  696 th iteration => 0.688743478491
Loss for  697 th iteration => 0.688704536108
Loss for  698 th iteration => 0.688664861044
Loss for  699 th iteration => 0.688625018267
Loss for  700 th iteration => 0.688585007414
Loss for  701 th iteration => 0.688544828119
Loss for  702 th iteration => 0.688504480018
Loss for  703 th iteration => 0.688463962745
Loss for  704 th iteration => 0.688423275936
Loss for  705 th iteration => 0.688382419226
Loss for  706 th iteration => 0.68834139225
Loss for  707 th iteration => 0.688300194642
Loss for  708 th iteration => 0.688258826038
Loss for  709 th iteration => 0.688217286073
Loss for  710 th iteration => 0.688175574382
Loss for  711 th iteration => 0.688133690599
Loss for  712 th iteration => 0.688091634359
Loss for  713 th iteration => 0.688049427439
Loss for  714 th iteration => 0.68800749272
Loss for  715 th iteration => 0.687964900463
Loss for  716 th iteration => 0.687922134378
Loss for  717 th iteration => 0.6878791941
Loss for  718 th iteration => 0.687836079267
Loss for  719 th iteration => 0.687792789515
Loss for  720 th iteration => 0.687749324481
Loss for  721 th iteration => 0.687705683801
Loss for  722 th iteration => 0.687661867112
Loss for  723 th iteration => 0.687617874052
Loss for  724 th iteration => 0.687573704257
Loss for  725 th iteration => 0.687529357366
Loss for  726 th iteration => 0.687484833016
Loss for  727 th iteration => 0.687440130844
Loss for  728 th iteration => 0.68739525049
Loss for  729 th iteration => 0.687350191592
Loss for  730 th iteration => 0.687304953788
Loss for  731 th iteration => 0.687259536717
Loss for  732 th iteration => 0.687213940019
Loss for  733 th iteration => 0.687168163333
Loss for  734 th iteration => 0.6871222063
Loss for  735 th iteration => 0.68707606856
Loss for  736 th iteration => 0.687029744535
Loss for  737 th iteration => 0.686983172775
Loss for  738 th iteration => 0.686936419016
Loss for  739 th iteration => 0.686889482901
Loss for  740 th iteration => 0.686842364071
Loss for  741 th iteration => 0.68679506217
Loss for  742 th iteration => 0.686747536157
Loss for  743 th iteration => 0.686699630014
Loss for  744 th iteration => 0.686651339077
Loss for  745 th iteration => 0.686602748452
Loss for  746 th iteration => 0.6865537757
Loss for  747 th iteration => 0.686504613709
Loss for  748 th iteration => 0.686455180278
Loss for  749 th iteration => 0.686405204734
Loss for  750 th iteration => 0.686354928023
Loss for  751 th iteration => 0.686303509545
Loss for  752 th iteration => 0.686251901273
Loss for  753 th iteration => 0.686200102832
Loss for  754 th iteration => 0.686148113848
Loss for  755 th iteration => 0.686095933944
Loss for  756 th iteration => 0.686043562744
Loss for  757 th iteration => 0.685991101615
Loss for  758 th iteration => 0.685939424483
Loss for  759 th iteration => 0.685886447058
Loss for  760 th iteration => 0.685833204148
Loss for  761 th iteration => 0.685780564658
Loss for  762 th iteration => 0.685727736088
Loss for  763 th iteration => 0.685674841246
Loss for  764 th iteration => 0.685622646493
Loss for  765 th iteration => 0.685569268785
Loss for  766 th iteration => 0.685515816118
Loss for  767 th iteration => 0.685462173029
Loss for  768 th iteration => 0.685408339136
Loss for  769 th iteration => 0.685354314055
Loss for  770 th iteration => 0.685300408315
Loss for  771 th iteration => 0.685246768783
Loss for  772 th iteration => 0.685192136828
Loss for  773 th iteration => 0.68513731241
Loss for  774 th iteration => 0.685082295144
Loss for  775 th iteration => 0.685027084645
Loss for  776 th iteration => 0.684971680527
Loss for  777 th iteration => 0.684916082406
Loss for  778 th iteration => 0.684860684747
Loss for  779 th iteration => 0.684805311657
Loss for  780 th iteration => 0.684749097566
Loss for  781 th iteration => 0.684692688212
Loss for  782 th iteration => 0.684636083209
Loss for  783 th iteration => 0.684579282171
Loss for  784 th iteration => 0.684522312359
Loss for  785 th iteration => 0.684465167592
Loss for  786 th iteration => 0.684407775788
Loss for  787 th iteration => 0.684350224471
Loss for  788 th iteration => 0.684293321745
Loss for  789 th iteration => 0.68423530382
Loss for  790 th iteration => 0.684177087468
Loss for  791 th iteration => 0.684118672305
Loss for  792 th iteration => 0.684060078336
Loss for  793 th iteration => 0.684001321855
Loss for  794 th iteration => 0.683942307579
Loss for  795 th iteration => 0.683883092948
Loss for  796 th iteration => 0.683823677576
Loss for  797 th iteration => 0.683764061077
Loss for  798 th iteration => 0.683704243066
Loss for  799 th iteration => 0.68364422316
Loss for  800 th iteration => 0.683584000974
Loss for  801 th iteration => 0.683523627083
Loss for  802 th iteration => 0.683463112529
Loss for  803 th iteration => 0.68340294438
Loss for  804 th iteration => 0.683341876232
Loss for  805 th iteration => 0.683280604257
Loss for  806 th iteration => 0.683219128072
Loss for  807 th iteration => 0.683157447297
Loss for  808 th iteration => 0.683095561551
Loss for  809 th iteration => 0.683033476138
Loss for  810 th iteration => 0.682971252558
Loss for  811 th iteration => 0.68290874924
Loss for  812 th iteration => 0.682846039441
Loss for  813 th iteration => 0.682783122785
Loss for  814 th iteration => 0.682719998896
Loss for  815 th iteration => 0.682656667402
Loss for  816 th iteration => 0.682593100365
Loss for  817 th iteration => 0.682529263697
Loss for  818 th iteration => 0.682465156727
Loss for  819 th iteration => 0.682399341122
Loss for  820 th iteration => 0.68233342648
Loss for  821 th iteration => 0.682267803189
Loss for  822 th iteration => 0.682202710883
Loss for  823 th iteration => 0.682136170364
Loss for  824 th iteration => 0.682069380917
Loss for  825 th iteration => 0.682002020492
Loss for  826 th iteration => 0.681934941286
Loss for  827 th iteration => 0.681866315747
Loss for  828 th iteration => 0.681797214786
Loss for  829 th iteration => 0.68172965771
Loss for  830 th iteration => 0.681659561164
Loss for  831 th iteration => 0.681590451053
Loss for  832 th iteration => 0.681521488882
Loss for  833 th iteration => 0.68145062165
Loss for  834 th iteration => 0.681381354493
Loss for  835 th iteration => 0.681311006139
Loss for  836 th iteration => 0.681239648073
Loss for  837 th iteration => 0.681169821485
Loss for  838 th iteration => 0.681098652162
Loss for  839 th iteration => 0.681026437862
Loss for  840 th iteration => 0.680956487518
Loss for  841 th iteration => 0.680883800292
Loss for  842 th iteration => 0.680810977658
Loss for  843 th iteration => 0.680740279875
Loss for  844 th iteration => 0.680666838705
Loss for  845 th iteration => 0.680593092399
Loss for  846 th iteration => 0.680521061576
Loss for  847 th iteration => 0.680447516204
Loss for  848 th iteration => 0.680373085628
Loss for  849 th iteration => 0.68029995085
Loss for  850 th iteration => 0.680226109202
Loss for  851 th iteration => 0.680150990625
Loss for  852 th iteration => 0.680076569243
Loss for  853 th iteration => 0.680002606721
Loss for  854 th iteration => 0.679926796439
Loss for  855 th iteration => 0.679850905327
Loss for  856 th iteration => 0.679776997912
Loss for  857 th iteration => 0.67970049225
Loss for  858 th iteration => 0.679623778999
Loss for  859 th iteration => 0.679547784522
Loss for  860 th iteration => 0.679472111957
Loss for  861 th iteration => 0.679394697739
Loss for  862 th iteration => 0.679317074226
Loss for  863 th iteration => 0.679240610531
Loss for  864 th iteration => 0.679163524487
Loss for  865 th iteration => 0.679085194488
Loss for  866 th iteration => 0.679006653507
Loss for  867 th iteration => 0.678929353438
Loss for  868 th iteration => 0.678851207191
Loss for  869 th iteration => 0.678771954284
Loss for  870 th iteration => 0.678692488729
Loss for  871 th iteration => 0.678613984407
Loss for  872 th iteration => 0.678535132283
Loss for  873 th iteration => 0.678454949447
Loss for  874 th iteration => 0.678374552321
Loss for  875 th iteration => 0.678294475717
Loss for  876 th iteration => 0.678215272539
Loss for  877 th iteration => 0.678134152873
Loss for  878 th iteration => 0.678052817303
Loss for  879 th iteration => 0.677971265117
Loss for  880 th iteration => 0.677890428961
Loss for  881 th iteration => 0.677809582527
Loss for  882 th iteration => 0.67772730077
Loss for  883 th iteration => 0.677644794991
Loss for  884 th iteration => 0.677561748242
Loss for  885 th iteration => 0.677478962003
Loss for  886 th iteration => 0.677396498617
Loss for  887 th iteration => 0.67731274262
Loss for  888 th iteration => 0.677228727864
Loss for  889 th iteration => 0.67714441428
Loss for  890 th iteration => 0.677060063903
Loss for  891 th iteration => 0.676976251731
Loss for  892 th iteration => 0.676892375745
Loss for  893 th iteration => 0.676806918379
Loss for  894 th iteration => 0.676721652833
Loss for  895 th iteration => 0.676635850616
Loss for  896 th iteration => 0.676550004531
Loss for  897 th iteration => 0.676464365346
Loss for  898 th iteration => 0.676379202225
Loss for  899 th iteration => 0.676292674483
Loss for  900 th iteration => 0.676205569236
Loss for  901 th iteration => 0.676118721625
Loss for  902 th iteration => 0.676031132273
Loss for  903 th iteration => 0.675943696902
Loss for  904 th iteration => 0.675855800991
Loss for  905 th iteration => 0.675768479255
Loss for  906 th iteration => 0.675681133029
Loss for  907 th iteration => 0.675592389756
Loss for  908 th iteration => 0.675503796689
Loss for  909 th iteration => 0.675414504919
Loss for  910 th iteration => 0.67532541489
Loss for  911 th iteration => 0.675235770891
Loss for  912 th iteration => 0.675146044101
Loss for  913 th iteration => 0.675056097863
Loss for  914 th iteration => 0.67496572545
Loss for  915 th iteration => 0.67487592336
Loss for  916 th iteration => 0.674785896714
Loss for  917 th iteration => 0.67469508131
Loss for  918 th iteration => 0.674603711986
Loss for  919 th iteration => 0.674512235571
Loss for  920 th iteration => 0.674420576583
Loss for  921 th iteration => 0.674328430131
Loss for  922 th iteration => 0.674236485626
Loss for  923 th iteration => 0.674143753228
Loss for  924 th iteration => 0.674050959975
Loss for  925 th iteration => 0.673957680726
Loss for  926 th iteration => 0.67386428183
Loss for  927 th iteration => 0.673770558484
Loss for  928 th iteration => 0.673676409608
Loss for  929 th iteration => 0.67358201257
Loss for  930 th iteration => 0.673487366812
Loss for  931 th iteration => 0.67339240371
Loss for  932 th iteration => 0.673296989487
Loss for  933 th iteration => 0.673201323891
Loss for  934 th iteration => 0.673105406388
Loss for  935 th iteration => 0.673009236448
Loss for  936 th iteration => 0.672912705178
Loss for  937 th iteration => 0.672815405396
Loss for  938 th iteration => 0.672717387095
Loss for  939 th iteration => 0.672619254032
Loss for  940 th iteration => 0.672521436862
Loss for  941 th iteration => 0.672423104511
Loss for  942 th iteration => 0.672324446554
Loss for  943 th iteration => 0.67222591145
Loss for  944 th iteration => 0.672126686159
Loss for  945 th iteration => 0.672027442277
Loss for  946 th iteration => 0.671927916943
Loss for  947 th iteration => 0.67182789589
Loss for  948 th iteration => 0.671728079292
Loss for  949 th iteration => 0.67162754214
Loss for  950 th iteration => 0.671526888406
Loss for  951 th iteration => 0.671426129969
Loss for  952 th iteration => 0.671324774392
Loss for  953 th iteration => 0.671223569137
Loss for  954 th iteration => 0.671121721641
Loss for  955 th iteration => 0.671019429867
Loss for  956 th iteration => 0.670916837428
Loss for  957 th iteration => 0.670813593047
Loss for  958 th iteration => 0.670710075142
Loss for  959 th iteration => 0.670606283318
Loss for  960 th iteration => 0.670502201548
Loss for  961 th iteration => 0.670397321944
Loss for  962 th iteration => 0.670292163565
Loss for  963 th iteration => 0.670186630367
Loss for  964 th iteration => 0.670080236215
Loss for  965 th iteration => 0.669973236518
Loss for  966 th iteration => 0.669866140515
Loss for  967 th iteration => 0.669757711517
Loss for  968 th iteration => 0.669649633179
Loss for  969 th iteration => 0.66954074751
Loss for  970 th iteration => 0.669432771796
Loss for  971 th iteration => 0.669322794216
Loss for  972 th iteration => 0.669213430262
Loss for  973 th iteration => 0.669103903917
Loss for  974 th iteration => 0.668994890531
Loss for  975 th iteration => 0.66888468309
Loss for  976 th iteration => 0.668774111761
Loss for  977 th iteration => 0.668663773905
Loss for  978 th iteration => 0.668551852836
Loss for  979 th iteration => 0.668440431827
Loss for  980 th iteration => 0.668328770983
Loss for  981 th iteration => 0.668217710374
Loss for  982 th iteration => 0.668104110173
Loss for  983 th iteration => 0.667990188685
Loss for  984 th iteration => 0.667877016185
Loss for  985 th iteration => 0.667762071681
Loss for  986 th iteration => 0.667646748804
Loss for  987 th iteration => 0.667532147008
Loss for  988 th iteration => 0.66741843475
Loss for  989 th iteration => 0.667302110416
Loss for  990 th iteration => 0.667185677649
Loss for  991 th iteration => 0.667070980909
Loss for  992 th iteration => 0.666953756306
Loss for  993 th iteration => 0.666837259851
Loss for  994 th iteration => 0.666719749879
Loss for  995 th iteration => 0.666599310928
Loss for  996 th iteration => 0.666480486576
Loss for  997 th iteration => 0.666358995207
Loss for  998 th iteration => 0.666236902186
Loss for  999 th iteration => 0.666114476001
Loss for  1000 th iteration => 0.665991695103
Loss for  1001 th iteration => 0.665868559311
Loss for  1002 th iteration => 0.665747627319
Loss for  1003 th iteration => 0.665622959187
Loss for  1004 th iteration => 0.665497627719
Loss for  1005 th iteration => 0.665371212732
Loss for  1006 th iteration => 0.665246120005
Loss for  1007 th iteration => 0.665116438292
Loss for  1008 th iteration => 0.664986374244
Loss for  1009 th iteration => 0.664855908134
Loss for  1010 th iteration => 0.664728093801
Loss for  1011 th iteration => 0.664596183952
Loss for  1012 th iteration => 0.664464579635
Loss for  1013 th iteration => 0.664333896572
Loss for  1014 th iteration => 0.66420453899
Loss for  1015 th iteration => 0.664067849845
Loss for  1016 th iteration => 0.663933121719
Loss for  1017 th iteration => 0.663802053798
Loss for  1018 th iteration => 0.663668728134
Loss for  1019 th iteration => 0.663529529717
Loss for  1020 th iteration => 0.663392986516
Loss for  1021 th iteration => 0.663257981242
Loss for  1022 th iteration => 0.663116731316
Loss for  1023 th iteration => 0.662977609984
Loss for  1024 th iteration => 0.662839963011
Loss for  1025 th iteration => 0.662696092792
Loss for  1026 th iteration => 0.662553588289
Loss for  1027 th iteration => 0.662411182756
Loss for  1028 th iteration => 0.662265785712
Loss for  1029 th iteration => 0.66212076729
Loss for  1030 th iteration => 0.661974038183
Loss for  1031 th iteration => 0.661825687741
Loss for  1032 th iteration => 0.661680410468
Loss for  1033 th iteration => 0.661528850097
Loss for  1034 th iteration => 0.661376583682
Loss for  1035 th iteration => 0.661227561065
Loss for  1036 th iteration => 0.66107407391
Loss for  1037 th iteration => 0.660922406983
Loss for  1038 th iteration => 0.660769801115
Loss for  1039 th iteration => 0.660614726151
Loss for  1040 th iteration => 0.660464128152
Loss for  1041 th iteration => 0.660307818045
Loss for  1042 th iteration => 0.660163508429
Loss for  1043 th iteration => 0.660068413686
Loss for  1044 th iteration => 0.659907595818
Loss for  1045 th iteration => 0.659925952556
Loss for  1046 th iteration => 0.659543931378
Loss for  1047 th iteration => 0.659448045925
Loss for  1048 th iteration => 0.659530954706
Loss for  1049 th iteration => 0.659110642343
Loss for  1050 th iteration => 0.658986489293
Loss for  1051 th iteration => 0.659159155432
Loss for  1052 th iteration => 0.658709315301
Loss for  1053 th iteration => 0.658568162161
Loss for  1054 th iteration => 0.658907914212
Loss for  1055 th iteration => 0.658442886901
Loss for  1056 th iteration => 0.658015186258
Loss for  1057 th iteration => 0.658026396258
Loss for  1058 th iteration => 0.658404762417
Loss for  1059 th iteration => 0.657931428371
Loss for  1060 th iteration => 0.657460585476
Loss for  1061 th iteration => 0.657524944505
Loss for  1062 th iteration => 0.657865962719
Loss for  1063 th iteration => 0.657384796068
Loss for  1064 th iteration => 0.656904265339
Loss for  1065 th iteration => 0.656824652345
Loss for  1066 th iteration => 0.657236003018
Loss for  1067 th iteration => 0.656745607894
Loss for  1068 th iteration => 0.656257262287
Loss for  1069 th iteration => 0.656309634089
Loss for  1070 th iteration => 0.656733788996
Loss for  1071 th iteration => 0.656233788337
Loss for  1072 th iteration => 0.655737309045
Loss for  1073 th iteration => 0.655420976141
Loss for  1074 th iteration => 0.655618278492
Loss for  1075 th iteration => 0.65511243677
Loss for  1076 th iteration => 0.65532462727
Loss for  1077 th iteration => 0.655644367376
Loss for  1078 th iteration => 0.655129957614
Loss for  1079 th iteration => 0.65461965535
Loss for  1080 th iteration => 0.654379806411
Loss for  1081 th iteration => 0.654672821534
Loss for  1082 th iteration => 0.654152609459
Loss for  1083 th iteration => 0.653900071685
Loss for  1084 th iteration => 0.654089035414
Loss for  1085 th iteration => 0.653562800893
Loss for  1086 th iteration => 0.653577758865
Loss for  1087 th iteration => 0.653989671229
Loss for  1088 th iteration => 0.653453835194
Loss for  1089 th iteration => 0.652928593835
Loss for  1090 th iteration => 0.652935316803
Loss for  1091 th iteration => 0.65329246292
Loss for  1092 th iteration => 0.652754551853
Loss for  1093 th iteration => 0.652239199429
Loss for  1094 th iteration => 0.65224627412
Loss for  1095 th iteration => 0.652452474318
Loss for  1096 th iteration => 0.651899020044
Loss for  1097 th iteration => 0.651726756158
Loss for  1098 th iteration => 0.651855777223
Loss for  1099 th iteration => 0.651317469307
Loss for  1100 th iteration => 0.651286133181
Loss for  1101 th iteration => 0.651389246109
Loss for  1102 th iteration => 0.650844409436
Loss for  1103 th iteration => 0.650776364805
Loss for  1104 th iteration => 0.650881955996
Loss for  1105 th iteration => 0.650344530925
Loss for  1106 th iteration => 0.65025737786
Loss for  1107 th iteration => 0.650381225687
Loss for  1108 th iteration => 0.649850412933
Loss for  1109 th iteration => 0.649739680134
Loss for  1110 th iteration => 0.649845589026
Loss for  1111 th iteration => 0.649333013565
Loss for  1112 th iteration => 0.649265883637
Loss for  1113 th iteration => 0.649345712183
Loss for  1114 th iteration => 0.648836633501
Loss for  1115 th iteration => 0.648736876236
Loss for  1116 th iteration => 0.648735111623
Loss for  1117 th iteration => 0.64835912979
Loss for  1118 th iteration => 0.648316064103
Loss for  1119 th iteration => 0.648043575976
Loss for  1120 th iteration => 0.647990367537
Loss for  1121 th iteration => 0.647694887548
Loss for  1122 th iteration => 0.647634060007
Loss for  1123 th iteration => 0.647376733935
Loss for  1124 th iteration => 0.64730243425
Loss for  1125 th iteration => 0.647026736163
Loss for  1126 th iteration => 0.646943076626
Loss for  1127 th iteration => 0.646652829958
Loss for  1128 th iteration => 0.646480422253
Loss for  1129 th iteration => 0.64631041666
Loss for  1130 th iteration => 0.646141592522
Loss for  1131 th iteration => 0.645969105743
Loss for  1132 th iteration => 0.64579746169
Loss for  1133 th iteration => 0.645628568679
Loss for  1134 th iteration => 0.645453215185
Loss for  1135 th iteration => 0.645283432489
Loss for  1136 th iteration => 0.64511127324
Loss for  1137 th iteration => 0.644936514954
Loss for  1138 th iteration => 0.644764472821
Loss for  1139 th iteration => 0.644592058708
Loss for  1140 th iteration => 0.644416916356
Loss for  1141 th iteration => 0.644241950255
Loss for  1142 th iteration => 0.644069123242
Loss for  1143 th iteration => 0.643892585341
Loss for  1144 th iteration => 0.643718166032
Loss for  1145 th iteration => 0.643545700716
Loss for  1146 th iteration => 0.643368658258
Loss for  1147 th iteration => 0.643191160816
Loss for  1148 th iteration => 0.643014741852
Loss for  1149 th iteration => 0.642839197644
Loss for  1150 th iteration => 0.642661422112
Loss for  1151 th iteration => 0.64248371049
Loss for  1152 th iteration => 0.642307696631
Loss for  1153 th iteration => 0.642130291248
Loss for  1154 th iteration => 0.641949600057
Loss for  1155 th iteration => 0.641771257157
Loss for  1156 th iteration => 0.641592957068
Loss for  1157 th iteration => 0.641414995714
Loss for  1158 th iteration => 0.641235255096
Loss for  1159 th iteration => 0.641054877417
Loss for  1160 th iteration => 0.640874889041
Loss for  1161 th iteration => 0.64069453126
Loss for  1162 th iteration => 0.640514734432
Loss for  1163 th iteration => 0.640332483692
Loss for  1164 th iteration => 0.640150333553
Loss for  1165 th iteration => 0.639968477868
Loss for  1166 th iteration => 0.639787058298
Loss for  1167 th iteration => 0.639607028059
Loss for  1168 th iteration => 0.639423477532
Loss for  1169 th iteration => 0.639240120437
Loss for  1170 th iteration => 0.63905674595
Loss for  1171 th iteration => 0.638872586272
Loss for  1172 th iteration => 0.638688467624
Loss for  1173 th iteration => 0.63850748694
Loss for  1174 th iteration => 0.638322371531
Loss for  1175 th iteration => 0.638137501113
Loss for  1176 th iteration => 0.637952352143
Loss for  1177 th iteration => 0.637766566793
Loss for  1178 th iteration => 0.637580571792
Loss for  1179 th iteration => 0.637394360724
Loss for  1180 th iteration => 0.637207927495
Loss for  1181 th iteration => 0.637022063739
Loss for  1182 th iteration => 0.636837266137
Loss for  1183 th iteration => 0.636650101831
Loss for  1184 th iteration => 0.636462794601
Loss for  1185 th iteration => 0.636274814811
Loss for  1186 th iteration => 0.636086601002
Loss for  1187 th iteration => 0.635898148023
Loss for  1188 th iteration => 0.635709450992
Loss for  1189 th iteration => 0.635520505288
Loss for  1190 th iteration => 0.635331306535
Loss for  1191 th iteration => 0.635141850595
Loss for  1192 th iteration => 0.634952133554
Loss for  1193 th iteration => 0.634762151714
Loss for  1194 th iteration => 0.634571901582
Loss for  1195 th iteration => 0.63438137986
Loss for  1196 th iteration => 0.634190583439
Loss for  1197 th iteration => 0.633999509386
Loss for  1198 th iteration => 0.63380815494
Loss for  1199 th iteration => 0.633616517501
Loss for  1200 th iteration => 0.633424594624
Loss for  1201 th iteration => 0.63323238401
Loss for  1202 th iteration => 0.633039883501
Loss for  1203 th iteration => 0.632847091075
Loss for  1204 th iteration => 0.632654004835
Loss for  1205 th iteration => 0.632460623006
Loss for  1206 th iteration => 0.632266943929
Loss for  1207 th iteration => 0.632072966054
Loss for  1208 th iteration => 0.631878687939
Loss for  1209 th iteration => 0.631684108238
Loss for  1210 th iteration => 0.631489225703
Loss for  1211 th iteration => 0.631294039176
Loss for  1212 th iteration => 0.631098547583
Loss for  1213 th iteration => 0.630902749934
Loss for  1214 th iteration => 0.630706645318
Loss for  1215 th iteration => 0.630510232896
Loss for  1216 th iteration => 0.6303135119
Loss for  1217 th iteration => 0.630116481631
Loss for  1218 th iteration => 0.629919141452
Loss for  1219 th iteration => 0.629721490788
Loss for  1220 th iteration => 0.629523529122
Loss for  1221 th iteration => 0.629325496892
Loss for  1222 th iteration => 0.629126761136
Loss for  1223 th iteration => 0.62892789887
Loss for  1224 th iteration => 0.62872872215
Loss for  1225 th iteration => 0.628529230814
Loss for  1226 th iteration => 0.628329424734
Loss for  1227 th iteration => 0.628129303819
Loss for  1228 th iteration => 0.627928868012
Loss for  1229 th iteration => 0.627728117287
Loss for  1230 th iteration => 0.627527209341
Loss for  1231 th iteration => 0.627325911366
Loss for  1232 th iteration => 0.627124243078
Loss for  1233 th iteration => 0.626922258617
Loss for  1234 th iteration => 0.626719958175
Loss for  1235 th iteration => 0.626517341963
Loss for  1236 th iteration => 0.62631441021
Loss for  1237 th iteration => 0.626111163164
Loss for  1238 th iteration => 0.625908101195
Loss for  1239 th iteration => 0.625704075813
Loss for  1240 th iteration => 0.625499905586
Loss for  1241 th iteration => 0.625295420137
Loss for  1242 th iteration => 0.625090619848
Loss for  1243 th iteration => 0.624885505112
Loss for  1244 th iteration => 0.624680076329
Loss for  1245 th iteration => 0.624475087248
Loss for  1246 th iteration => 0.624268714741
Loss for  1247 th iteration => 0.624062362717
Loss for  1248 th iteration => 0.62385569756
Loss for  1249 th iteration => 0.623648719756
Loss for  1250 th iteration => 0.623441429794
Loss for  1251 th iteration => 0.62323439401
Loss for  1252 th iteration => 0.623026419157
Loss for  1253 th iteration => 0.622818208347
Loss for  1254 th iteration => 0.6226118747
Loss for  1255 th iteration => 0.622402566937
Loss for  1256 th iteration => 0.622193445752
Loss for  1257 th iteration => 0.621985018414
Loss for  1258 th iteration => 0.621774969845
Loss for  1259 th iteration => 0.621567703901
Loss for  1260 th iteration => 0.62135680097
Loss for  1261 th iteration => 0.621146124068
Loss for  1262 th iteration => 0.620936428254
Loss for  1263 th iteration => 0.620726443444
Loss for  1264 th iteration => 0.62051583147
Loss for  1265 th iteration => 0.620303881615
Loss for  1266 th iteration => 0.620092594607
Loss for  1267 th iteration => 0.619882572887
Loss for  1268 th iteration => 0.619670378593
Loss for  1269 th iteration => 0.619457141799
Loss for  1270 th iteration => 0.619244803388
Loss for  1271 th iteration => 0.61903576192
Loss for  1272 th iteration => 0.618820330886
Loss for  1273 th iteration => 0.618605909517
Loss for  1274 th iteration => 0.618395660621
Loss for  1275 th iteration => 0.618181082144
Loss for  1276 th iteration => 0.617965540802
Loss for  1277 th iteration => 0.617754142097
Loss for  1278 th iteration => 0.617539343965
Loss for  1279 th iteration => 0.617322782556
Loss for  1280 th iteration => 0.617110689436
Loss for  1281 th iteration => 0.616895042885
Loss for  1282 th iteration => 0.616677461035
Loss for  1283 th iteration => 0.61646594624
Loss for  1284 th iteration => 0.616248138489
Loss for  1285 th iteration => 0.61603040618
Loss for  1286 th iteration => 0.615817413545
Loss for  1287 th iteration => 0.615598500146
Loss for  1288 th iteration => 0.615383128646
Loss for  1289 th iteration => 0.61516681197
Loss for  1290 th iteration => 0.614947180291
Loss for  1291 th iteration => 0.614732216925
Loss for  1292 th iteration => 0.614512472174
Loss for  1293 th iteration => 0.614297177359
Loss for  1294 th iteration => 0.614077701964
Loss for  1295 th iteration => 0.613859517779
Loss for  1296 th iteration => 0.613640674277
Loss for  1297 th iteration => 0.613421387757
Loss for  1298 th iteration => 0.613202735671
Loss for  1299 th iteration => 0.612982722615
Loss for  1300 th iteration => 0.612764574071
Loss for  1301 th iteration => 0.612543329184
Loss for  1302 th iteration => 0.612325361991
Loss for  1303 th iteration => 0.612102685375
Loss for  1304 th iteration => 0.611884979896
Loss for  1305 th iteration => 0.611661544841
Loss for  1306 th iteration => 0.611443417177
Loss for  1307 th iteration => 0.61121985722
Loss for  1308 th iteration => 0.611000666696
Loss for  1309 th iteration => 0.610777577566
Loss for  1310 th iteration => 0.61055672428
Loss for  1311 th iteration => 0.610334665785
Loss for  1312 th iteration => 0.610111588273
Loss for  1313 th iteration => 0.609891086134
Loss for  1314 th iteration => 0.60966525915
Loss for  1315 th iteration => 0.609446806773
Loss for  1316 th iteration => 0.609219042548
Loss for  1317 th iteration => 0.608998820655
Loss for  1318 th iteration => 0.608773503052
Loss for  1319 th iteration => 0.608550086743
Loss for  1320 th iteration => 0.608327083783
Loss for  1321 th iteration => 0.608100808578
Loss for  1322 th iteration => 0.607878040275
Loss for  1323 th iteration => 0.607653383605
Loss for  1324 th iteration => 0.607426557544
Loss for  1325 th iteration => 0.607204984577
Loss for  1326 th iteration => 0.606977820566
Loss for  1327 th iteration => 0.606750785812
Loss for  1328 th iteration => 0.606528550265
Loss for  1329 th iteration => 0.606299574465
Loss for  1330 th iteration => 0.606073406107
Loss for  1331 th iteration => 0.605850672611
Loss for  1332 th iteration => 0.605620118802
Loss for  1333 th iteration => 0.605393923661
Loss for  1334 th iteration => 0.605170276902
Loss for  1335 th iteration => 0.604939451599
Loss for  1336 th iteration => 0.604711662817
Loss for  1337 th iteration => 0.604488585083
Loss for  1338 th iteration => 0.604257427251
Loss for  1339 th iteration => 0.604028573658
Loss for  1340 th iteration => 0.603801695332
Loss for  1341 th iteration => 0.60357430841
Loss for  1342 th iteration => 0.603344601622
Loss for  1343 th iteration => 0.603114588475
Loss for  1344 th iteration => 0.602886521572
Loss for  1345 th iteration => 0.602658852219
Loss for  1346 th iteration => 0.602429165905
Loss for  1347 th iteration => 0.602197333033
Loss for  1348 th iteration => 0.601967131218
Loss for  1349 th iteration => 0.601741205355
Loss for  1350 th iteration => 0.601510026065
Loss for  1351 th iteration => 0.601277720034
Loss for  1352 th iteration => 0.601045500493
Loss for  1353 th iteration => 0.600817132407
Loss for  1354 th iteration => 0.600588544608
Loss for  1355 th iteration => 0.600355666591
Loss for  1356 th iteration => 0.600122849295
Loss for  1357 th iteration => 0.599891487127
Loss for  1358 th iteration => 0.599659556802
Loss for  1359 th iteration => 0.599427683119
Loss for  1360 th iteration => 0.599197966908
Loss for  1361 th iteration => 0.598964466351
Loss for  1362 th iteration => 0.598732520807
Loss for  1363 th iteration => 0.598499553769
Loss for  1364 th iteration => 0.598265838561
Loss for  1365 th iteration => 0.59803209837
Loss for  1366 th iteration => 0.597798318859
Loss for  1367 th iteration => 0.597564857758
Loss for  1368 th iteration => 0.597333678633
Loss for  1369 th iteration => 0.597101495753
Loss for  1370 th iteration => 0.596866817794
Loss for  1371 th iteration => 0.596632088366
Loss for  1372 th iteration => 0.596397295196
Loss for  1373 th iteration => 0.596162426931
Loss for  1374 th iteration => 0.595927664783
Loss for  1375 th iteration => 0.595693965304
Loss for  1376 th iteration => 0.595458657796
Loss for  1377 th iteration => 0.59522324878
Loss for  1378 th iteration => 0.59498772983
Loss for  1379 th iteration => 0.594752093173
Loss for  1380 th iteration => 0.594516331645
Loss for  1381 th iteration => 0.594280438646
Loss for  1382 th iteration => 0.594044408103
Loss for  1383 th iteration => 0.593808234432
Loss for  1384 th iteration => 0.593571912502
Loss for  1385 th iteration => 0.593335437602
Loss for  1386 th iteration => 0.593098805414
Loss for  1387 th iteration => 0.592862011983
Loss for  1388 th iteration => 0.592625232702
Loss for  1389 th iteration => 0.592389269921
Loss for  1390 th iteration => 0.592150563977
Loss for  1391 th iteration => 0.591913200434
Loss for  1392 th iteration => 0.591676810242
Loss for  1393 th iteration => 0.591437536312
Loss for  1394 th iteration => 0.591199522446
Loss for  1395 th iteration => 0.59096132641
Loss for  1396 th iteration => 0.590722946521
Loss for  1397 th iteration => 0.590484381267
Loss for  1398 th iteration => 0.5902456293
Loss for  1399 th iteration => 0.590006689423
Loss for  1400 th iteration => 0.589767560577
Loss for  1401 th iteration => 0.589528241832
Loss for  1402 th iteration => 0.58928873238
Loss for  1403 th iteration => 0.589049031522
Loss for  1404 th iteration => 0.588809138664
Loss for  1405 th iteration => 0.588569053305
Loss for  1406 th iteration => 0.588328775037
Loss for  1407 th iteration => 0.58808830353
Loss for  1408 th iteration => 0.587847638534
Loss for  1409 th iteration => 0.587606779866
Loss for  1410 th iteration => 0.587365727411
Loss for  1411 th iteration => 0.587124481115
Loss for  1412 th iteration => 0.58688304098
Loss for  1413 th iteration => 0.58664140706
Loss for  1414 th iteration => 0.586399579458
Loss for  1415 th iteration => 0.586157558321
Loss for  1416 th iteration => 0.58591534384
Loss for  1417 th iteration => 0.585672936242
Loss for  1418 th iteration => 0.585432752514
Loss for  1419 th iteration => 0.585189721335
Loss for  1420 th iteration => 0.58494683917
Loss for  1421 th iteration => 0.584703759403
Loss for  1422 th iteration => 0.584460482766
Loss for  1423 th iteration => 0.584217009992
Loss for  1424 th iteration => 0.583976347064
Loss for  1425 th iteration => 0.583732453552
Loss for  1426 th iteration => 0.583488396119
Loss for  1427 th iteration => 0.583244953115
Loss for  1428 th iteration => 0.583000224826
Loss for  1429 th iteration => 0.582759037275
Loss for  1430 th iteration => 0.582514591914
Loss for  1431 th iteration => 0.582269503692
Loss for  1432 th iteration => 0.58202423673
Loss for  1433 th iteration => 0.581781312373
Loss for  1434 th iteration => 0.581538548864
Loss for  1435 th iteration => 0.581292570089
Loss for  1436 th iteration => 0.581046429768
Loss for  1437 th iteration => 0.580802931616
Loss for  1438 th iteration => 0.580559503994
Loss for  1439 th iteration => 0.580313088492
Loss for  1440 th iteration => 0.580066054363
Loss for  1441 th iteration => 0.579824971906
Loss for  1442 th iteration => 0.579577369607
Loss for  1443 th iteration => 0.579330858887
Loss for  1444 th iteration => 0.579086685826
Loss for  1445 th iteration => 0.5788411019
Loss for  1446 th iteration => 0.57859329622
Loss for  1447 th iteration => 0.578348607579
Loss for  1448 th iteration => 0.578103655933
Loss for  1449 th iteration => 0.577854409034
Loss for  1450 th iteration => 0.577610854932
Loss for  1451 th iteration => 0.57736471801
Loss for  1452 th iteration => 0.577114698319
Loss for  1453 th iteration => 0.576871809606
Loss for  1454 th iteration => 0.576624166262
Loss for  1455 th iteration => 0.576373386634
Loss for  1456 th iteration => 0.576132949961
Loss for  1457 th iteration => 0.575881962516
Loss for  1458 th iteration => 0.575633932319
Loss for  1459 th iteration => 0.575388855266
Loss for  1460 th iteration => 0.575137843697
Loss for  1461 th iteration => 0.574895109838
Loss for  1462 th iteration => 0.574644542882
Loss for  1463 th iteration => 0.574397259356
Loss for  1464 th iteration => 0.574148769203
Loss for  1465 th iteration => 0.573899846799
Loss for  1466 th iteration => 0.573654160665
Loss for  1467 th iteration => 0.573402298362
Loss for  1468 th iteration => 0.573160395889
Loss for  1469 th iteration => 0.572905302437
Loss for  1470 th iteration => 0.572662513019
Loss for  1471 th iteration => 0.57240855472
Loss for  1472 th iteration => 0.572164965929
Loss for  1473 th iteration => 0.571911836404
Loss for  1474 th iteration => 0.571666007564
Loss for  1475 th iteration => 0.571414342557
Loss for  1476 th iteration => 0.571167338346
Loss for  1477 th iteration => 0.570916059389
Loss for  1478 th iteration => 0.570668857922
Loss for  1479 th iteration => 0.570416980057
Loss for  1480 th iteration => 0.570170479259
Loss for  1481 th iteration => 0.569917103231
Loss for  1482 th iteration => 0.569672126823
Loss for  1483 th iteration => 0.569416607133
Loss for  1484 th iteration => 0.569171642386
Loss for  1485 th iteration => 0.568918476452
Loss for  1486 th iteration => 0.568670125273
Loss for  1487 th iteration => 0.568419974423
Loss for  1488 th iteration => 0.56816799126
Loss for  1489 th iteration => 0.567921078814
Loss for  1490 th iteration => 0.567666745107
Loss for  1491 th iteration => 0.567418629771
Loss for  1492 th iteration => 0.567169126712
Loss for  1493 th iteration => 0.566914748984
Loss for  1494 th iteration => 0.566668815918
Loss for  1495 th iteration => 0.56641349568
Loss for  1496 th iteration => 0.56616407478
Loss for  1497 th iteration => 0.565915849842
Loss for  1498 th iteration => 0.56565854374
Loss for  1499 th iteration => 0.565412938613
Loss for  1500 th iteration => 0.565161066386
Loss for  1501 th iteration => 0.564904187762
Loss for  1502 th iteration => 0.564659742249
Loss for  1503 th iteration => 0.56440668919
Loss for  1504 th iteration => 0.564150218439
Loss for  1505 th iteration => 0.563905013982
Loss for  1506 th iteration => 0.563652184692
Loss for  1507 th iteration => 0.563395919113
Loss for  1508 th iteration => 0.563148040449
Loss for  1509 th iteration => 0.562897626835
Loss for  1510 th iteration => 0.562641349542
Loss for  1511 th iteration => 0.56238860475
Loss for  1512 th iteration => 0.562143069711
Loss for  1513 th iteration => 0.561886801544
Loss for  1514 th iteration => 0.561632461471
Loss for  1515 th iteration => 0.561380953311
Loss for  1516 th iteration => 0.561132367273
Loss for  1517 th iteration => 0.560876500224
Loss for  1518 th iteration => 0.560623191024
Loss for  1519 th iteration => 0.560369728619
Loss for  1520 th iteration => 0.560121992415
Loss for  1521 th iteration => 0.559866328102
Loss for  1522 th iteration => 0.559612853415
Loss for  1523 th iteration => 0.559358990158
Loss for  1524 th iteration => 0.559106255748
Loss for  1525 th iteration => 0.558856328948
Loss for  1526 th iteration => 0.558601308547
Loss for  1527 th iteration => 0.558348735163
Loss for  1528 th iteration => 0.558093674292
Loss for  1529 th iteration => 0.557838883488
Loss for  1530 th iteration => 0.557588172631
Loss for  1531 th iteration => 0.557335839288
Loss for  1532 th iteration => 0.557081904464
Loss for  1533 th iteration => 0.55682842974
Loss for  1534 th iteration => 0.556573553794
Loss for  1535 th iteration => 0.556318871874
Loss for  1536 th iteration => 0.556064356379
Loss for  1537 th iteration => 0.555812043441
Loss for  1538 th iteration => 0.555560462625
Loss for  1539 th iteration => 0.555306150558
Loss for  1540 th iteration => 0.555052838306
Loss for  1541 th iteration => 0.554797967666
Loss for  1542 th iteration => 0.554543223373
Loss for  1543 th iteration => 0.554288584299
Loss for  1544 th iteration => 0.554034031317
Loss for  1545 th iteration => 0.55377954712
Loss for  1546 th iteration => 0.553525116048
Loss for  1547 th iteration => 0.553270723946
Loss for  1548 th iteration => 0.553016358017
Loss for  1549 th iteration => 0.552762006705
Loss for  1550 th iteration => 0.552507659576
Loss for  1551 th iteration => 0.552253481053
Loss for  1552 th iteration => 0.552001327086
Loss for  1553 th iteration => 0.551746454803
Loss for  1554 th iteration => 0.551491598801
Loss for  1555 th iteration => 0.551236771143
Loss for  1556 th iteration => 0.550982892034
Loss for  1557 th iteration => 0.550727843208
Loss for  1558 th iteration => 0.550472791969
Loss for  1559 th iteration => 0.55021772993
Loss for  1560 th iteration => 0.549962649545
Loss for  1561 th iteration => 0.549707544027
Loss for  1562 th iteration => 0.54945240728
Loss for  1563 th iteration => 0.549197233834
Loss for  1564 th iteration => 0.548942018789
Loss for  1565 th iteration => 0.548686757756
Loss for  1566 th iteration => 0.548431446815
Loss for  1567 th iteration => 0.548176082468
Loss for  1568 th iteration => 0.547920661601
Loss for  1569 th iteration => 0.547665181447
Loss for  1570 th iteration => 0.547409639556
Loss for  1571 th iteration => 0.547154033762
Loss for  1572 th iteration => 0.546898362159
Loss for  1573 th iteration => 0.546642623074
Loss for  1574 th iteration => 0.54638681505
Loss for  1575 th iteration => 0.54613093682
Loss for  1576 th iteration => 0.545874987295
Loss for  1577 th iteration => 0.545618965541
Loss for  1578 th iteration => 0.545363278431
Loss for  1579 th iteration => 0.545106833282
Loss for  1580 th iteration => 0.54485066218
Loss for  1581 th iteration => 0.54459441051
Loss for  1582 th iteration => 0.544338078457
Loss for  1583 th iteration => 0.544081666246
Loss for  1584 th iteration => 0.543827555276
Loss for  1585 th iteration => 0.543569813316
Loss for  1586 th iteration => 0.543313239866
Loss for  1587 th iteration => 0.543056582962
Loss for  1588 th iteration => 0.542801311132
Loss for  1589 th iteration => 0.542543791035
Loss for  1590 th iteration => 0.54228689328
Loss for  1591 th iteration => 0.54203134926
Loss for  1592 th iteration => 0.541775248353
Loss for  1593 th iteration => 0.54151806539
Loss for  1594 th iteration => 0.541261822251
Loss for  1595 th iteration => 0.541004706032
Loss for  1596 th iteration => 0.540747270241
Loss for  1597 th iteration => 0.54049315141
Loss for  1598 th iteration => 0.540235466838
Loss for  1599 th iteration => 0.539978251338
Loss for  1600 th iteration => 0.539721466564
Loss for  1601 th iteration => 0.539463509059
Loss for  1602 th iteration => 0.539210405874
Loss for  1603 th iteration => 0.538951464473
Loss for  1604 th iteration => 0.538695007017
Loss for  1605 th iteration => 0.538436696894
Loss for  1606 th iteration => 0.538181699601
Loss for  1607 th iteration => 0.537924443823
Loss for  1608 th iteration => 0.537667493364
Loss for  1609 th iteration => 0.537408970555
Loss for  1610 th iteration => 0.537154840103
Loss for  1611 th iteration => 0.536896357865
Loss for  1612 th iteration => 0.536640013802
Loss for  1613 th iteration => 0.53638083775
Loss for  1614 th iteration => 0.536126750037
Loss for  1615 th iteration => 0.535868688793
Loss for  1616 th iteration => 0.535609820169
Loss for  1617 th iteration => 0.535356878141
Loss for  1618 th iteration => 0.535097382682
Loss for  1619 th iteration => 0.534839108983
Loss for  1620 th iteration => 0.534584988242
Loss for  1621 th iteration => 0.534326185796
Loss for  1622 th iteration => 0.534067731273
Loss for  1623 th iteration => 0.533814001847
Loss for  1624 th iteration => 0.533554927868
Loss for  1625 th iteration => 0.533295637627
Loss for  1626 th iteration => 0.533043644436
Loss for  1627 th iteration => 0.532783489528
Loss for  1628 th iteration => 0.532525148725
Loss for  1629 th iteration => 0.532269532741
Loss for  1630 th iteration => 0.532011914581
Loss for  1631 th iteration => 0.531756182106
Loss for  1632 th iteration => 0.531497640564
Loss for  1633 th iteration => 0.531238999629
Loss for  1634 th iteration => 0.530983636504
Loss for  1635 th iteration => 0.530724609858
Loss for  1636 th iteration => 0.530471266934
Loss for  1637 th iteration => 0.530211423528
Loss for  1638 th iteration => 0.529954804652
Loss for  1639 th iteration => 0.52969729704
Loss for  1640 th iteration => 0.529439045005
Loss for  1641 th iteration => 0.529183224118
Loss for  1642 th iteration => 0.528923847302
Loss for  1643 th iteration => 0.528669136474
Loss for  1644 th iteration => 0.52840909521
Loss for  1645 th iteration => 0.528154982242
Loss for  1646 th iteration => 0.527894692467
Loss for  1647 th iteration => 0.527640722599
Loss for  1648 th iteration => 0.527380559373
Loss for  1649 th iteration => 0.527126329046
Loss for  1650 th iteration => 0.526866629908
Loss for  1651 th iteration => 0.526611781229
Loss for  1652 th iteration => 0.526352849379
Loss for  1653 th iteration => 0.526097065195
Loss for  1654 th iteration => 0.525839172477
Loss for  1655 th iteration => 0.525582171995
Loss for  1656 th iteration => 0.525325561686
Loss for  1657 th iteration => 0.525067096575
Loss for  1658 th iteration => 0.524811985974
Loss for  1659 th iteration => 0.524551836888
Loss for  1660 th iteration => 0.524298896831
Loss for  1661 th iteration => 0.524037855844
Loss for  1662 th iteration => 0.523782645982
Loss for  1663 th iteration => 0.523524942999
Loss for  1664 th iteration => 0.52326674453
Loss for  1665 th iteration => 0.523011907233
Loss for  1666 th iteration => 0.52275412299
Loss for  1667 th iteration => 0.522503465932
Loss for  1668 th iteration => 0.522240799459
Loss for  1669 th iteration => 0.521979912653
Loss for  1670 th iteration => 0.521727980256
Loss for  1671 th iteration => 0.521470561741
Loss for  1672 th iteration => 0.521216359966
Loss for  1673 th iteration => 0.520957568931
Loss for  1674 th iteration => 0.520696236865
Loss for  1675 th iteration => 0.520438804974
Loss for  1676 th iteration => 0.520186373669
Loss for  1677 th iteration => 0.519925169323
Loss for  1678 th iteration => 0.519667868773
Loss for  1679 th iteration => 0.519415540944
Loss for  1680 th iteration => 0.519155623937
Loss for  1681 th iteration => 0.518904807644
Loss for  1682 th iteration => 0.518646421171
Loss for  1683 th iteration => 0.518386027512
Loss for  1684 th iteration => 0.518133232139
Loss for  1685 th iteration => 0.517874307256
Loss for  1686 th iteration => 0.517618949111
Loss for  1687 th iteration => 0.517364778341
Loss for  1688 th iteration => 0.517099755529
Loss for  1689 th iteration => 0.516852626058
Loss for  1690 th iteration => 0.516596230185
Loss for  1691 th iteration => 0.516330709124
Loss for  1692 th iteration => 0.516078155984
Loss for  1693 th iteration => 0.51582772495
Loss for  1694 th iteration => 0.515562123202
Loss for  1695 th iteration => 0.515313719102
Loss for  1696 th iteration => 0.515051112169
Loss for  1697 th iteration => 0.514791863466
Loss for  1698 th iteration => 0.514537154798
Loss for  1699 th iteration => 0.514280847258
Loss for  1700 th iteration => 0.51402826318
Loss for  1701 th iteration => 0.51377073581
Loss for  1702 th iteration => 0.513508757692
Loss for  1703 th iteration => 0.513256013198
Loss for  1704 th iteration => 0.513008040578
Loss for  1705 th iteration => 0.512746026071
Loss for  1706 th iteration => 0.512484612354
Loss for  1707 th iteration => 0.512236821289
Loss for  1708 th iteration => 0.511972001129
Loss for  1709 th iteration => 0.511728650384
Loss for  1710 th iteration => 0.511467132418
Loss for  1711 th iteration => 0.511206570081
Loss for  1712 th iteration => 0.51095812299
Loss for  1713 th iteration => 0.51069177879
Loss for  1714 th iteration => 0.510445914603
Loss for  1715 th iteration => 0.510183064276
Loss for  1716 th iteration => 0.509927563024
Loss for  1717 th iteration => 0.509678773804
Loss for  1718 th iteration => 0.509414033507
Loss for  1719 th iteration => 0.509170109332
Loss for  1720 th iteration => 0.50890332031
Loss for  1721 th iteration => 0.50865614827
Loss for  1722 th iteration => 0.50839477989
Loss for  1723 th iteration => 0.508139407566
Loss for  1724 th iteration => 0.50788638311
Loss for  1725 th iteration => 0.507624005304
Loss for  1726 th iteration => 0.507378035349
Loss for  1727 th iteration => 0.507110417604
Loss for  1728 th iteration => 0.506873411293
Loss for  1729 th iteration => 0.50661728604
Loss for  1730 th iteration => 0.506349144028
Loss for  1731 th iteration => 0.506099089275
Loss for  1732 th iteration => 0.505840682143
Loss for  1733 th iteration => 0.505586481435
Loss for  1734 th iteration => 0.505332127676
Loss for  1735 th iteration => 0.50507458244
Loss for  1736 th iteration => 0.504823459213
Loss for  1737 th iteration => 0.504563290843
Loss for  1738 th iteration => 0.50431466464
Loss for  1739 th iteration => 0.504052521853
Loss for  1740 th iteration => 0.503805738811
Loss for  1741 th iteration => 0.503542204466
Loss for  1742 th iteration => 0.503296681733
Loss for  1743 th iteration => 0.503032279127
Loss for  1744 th iteration => 0.502787497162
Loss for  1745 th iteration => 0.50252269581
Loss for  1746 th iteration => 0.502278191521
Loss for  1747 th iteration => 0.502013412446
Loss for  1748 th iteration => 0.501768773065
Loss for  1749 th iteration => 0.501504393631
Loss for  1750 th iteration => 0.501259251252
Loss for  1751 th iteration => 0.500995609559
Loss for  1752 th iteration => 0.500749636266
Loss for  1753 th iteration => 0.500487035144
Loss for  1754 th iteration => 0.500239938661
Loss for  1755 th iteration => 0.499978649288
Loss for  1756 th iteration => 0.499730169101
Loss for  1757 th iteration => 0.499470434275
Loss for  1758 th iteration => 0.499220338173
Loss for  1759 th iteration => 0.498962375269
Loss for  1760 th iteration => 0.498710456253
Loss for  1761 th iteration => 0.498454459887
Loss for  1762 th iteration => 0.498200533416
Loss for  1763 th iteration => 0.497946677848
Loss for  1764 th iteration => 0.497690579385
Loss for  1765 th iteration => 0.497439758782
Loss for  1766 th iteration => 0.497198426618
Loss for  1767 th iteration => 0.496926913725
Loss for  1768 th iteration => 0.496678696916
Loss for  1769 th iteration => 0.49643480348
Loss for  1770 th iteration => 0.49616321526
Loss for  1771 th iteration => 0.495918059156
Loss for  1772 th iteration => 0.495671157313
Loss for  1773 th iteration => 0.495399511954
Loss for  1774 th iteration => 0.495157809973
Loss for  1775 th iteration => 0.494907516165
Loss for  1776 th iteration => 0.494635829862
Loss for  1777 th iteration => 0.494397921901
Loss for  1778 th iteration => 0.494143906042
Loss for  1779 th iteration => 0.493872500486
Loss for  1780 th iteration => 0.493618784579
Loss for  1781 th iteration => 0.493366380416
Loss for  1782 th iteration => 0.493112495629
Loss for  1783 th iteration => 0.492858627951
Loss for  1784 th iteration => 0.492604777759
Loss for  1785 th iteration => 0.492350945448
Loss for  1786 th iteration => 0.492097131427
Loss for  1787 th iteration => 0.491845610159
Loss for  1788 th iteration => 0.491592277058
Loss for  1789 th iteration => 0.49133848365
Loss for  1790 th iteration => 0.49108471769
Loss for  1791 th iteration => 0.490831146789
Loss for  1792 th iteration => 0.490585072906
Loss for  1793 th iteration => 0.490333294783
Loss for  1794 th iteration => 0.490099119194
Loss for  1795 th iteration => 0.489828290126
Loss for  1796 th iteration => 0.489586525487
Loss for  1797 th iteration => 0.48931432692
Loss for  1798 th iteration => 0.489062703907
Loss for  1799 th iteration => 0.488808853908
Loss for  1800 th iteration => 0.488555074034
Loss for  1801 th iteration => 0.48830226096
Loss for  1802 th iteration => 0.488053377318
Loss for  1803 th iteration => 0.487804039659
Loss for  1804 th iteration => 0.487557110059
Loss for  1805 th iteration => 0.487311696742
Loss for  1806 th iteration => 0.487043204759
Loss for  1807 th iteration => 0.486789183453
Loss for  1808 th iteration => 0.486535295447
Loss for  1809 th iteration => 0.486292531225
Loss for  1810 th iteration => 0.486045222597
Loss for  1811 th iteration => 0.485799071005
Loss for  1812 th iteration => 0.485529224295
Loss for  1813 th iteration => 0.485278860962
Loss for  1814 th iteration => 0.485026796152
Loss for  1815 th iteration => 0.484800023778
Loss for  1816 th iteration => 0.484536822902
Loss for  1817 th iteration => 0.484288653092
Loss for  1818 th iteration => 0.484018722913
Loss for  1819 th iteration => 0.483770135393
Loss for  1820 th iteration => 0.483545485517
Loss for  1821 th iteration => 0.483279312625
Loss for  1822 th iteration => 0.483034251031
Loss for  1823 th iteration => 0.482761605453
Loss for  1824 th iteration => 0.48251735386
Loss for  1825 th iteration => 0.482289372781
Loss for  1826 th iteration => 0.482024718055
Loss for  1827 th iteration => 0.481778085164
Loss for  1828 th iteration => 0.481509336162
Loss for  1829 th iteration => 0.481286931939
Loss for  1830 th iteration => 0.481024523522
Loss for  1831 th iteration => 0.48076936713
Loss for  1832 th iteration => 0.480511105813
Loss for  1833 th iteration => 0.48028200775
Loss for  1834 th iteration => 0.480027269378
Loss for  1835 th iteration => 0.4797695742
Loss for  1836 th iteration => 0.479510492757
Loss for  1837 th iteration => 0.47928111517
Loss for  1838 th iteration => 0.479025745904
Loss for  1839 th iteration => 0.478768660233
Loss for  1840 th iteration => 0.478513487197
Loss for  1841 th iteration => 0.478277874079
Loss for  1842 th iteration => 0.478005382439
Loss for  1843 th iteration => 0.477764130519
Loss for  1844 th iteration => 0.477522625192
Loss for  1845 th iteration => 0.477271701966
Loss for  1846 th iteration => 0.477013959114
Loss for  1847 th iteration => 0.476779826164
Loss for  1848 th iteration => 0.476534777504
Loss for  1849 th iteration => 0.476266291033
Loss for  1850 th iteration => 0.476027236655
Loss for  1851 th iteration => 0.475774740814
Loss for  1852 th iteration => 0.475520691637
Loss for  1853 th iteration => 0.475283445167
Loss for  1854 th iteration => 0.475016927426
Loss for  1855 th iteration => 0.474770773318
Loss for  1856 th iteration => 0.474517136386
Loss for  1857 th iteration => 0.474275004917
Loss for  1858 th iteration => 0.474021342388
Loss for  1859 th iteration => 0.47377656611
Loss for  1860 th iteration => 0.473524604718
Loss for  1861 th iteration => 0.473281278062
Loss for  1862 th iteration => 0.473026788363
Loss for  1863 th iteration => 0.472787160949
Loss for  1864 th iteration => 0.472550437263
Loss for  1865 th iteration => 0.472292650217
Loss for  1866 th iteration => 0.472052365538
Loss for  1867 th iteration => 0.471798632166
Loss for  1868 th iteration => 0.471551551767
Loss for  1869 th iteration => 0.471308473647
Loss for  1870 th iteration => 0.471050415578
Loss for  1871 th iteration => 0.470818416546
Loss for  1872 th iteration => 0.470553185894
Loss for  1873 th iteration => 0.470306320831
Loss for  1874 th iteration => 0.47006002636
Loss for  1875 th iteration => 0.4698093023
Loss for  1876 th iteration => 0.469567313798
Loss for  1877 th iteration => 0.469318870516
Loss for  1878 th iteration => 0.469086545546
Loss for  1879 th iteration => 0.468829124583
Loss for  1880 th iteration => 0.468591703019
Loss for  1881 th iteration => 0.468334029645
Loss for  1882 th iteration => 0.468102600135
Loss for  1883 th iteration => 0.467838286046
Loss for  1884 th iteration => 0.467607199726
Loss for  1885 th iteration => 0.467348466124
Loss for  1886 th iteration => 0.467094594793
Loss for  1887 th iteration => 0.466855686209
Loss for  1888 th iteration => 0.466625531382
Loss for  1889 th iteration => 0.466361553191
Loss for  1890 th iteration => 0.466110799029
Loss for  1891 th iteration => 0.465874645609
Loss for  1892 th iteration => 0.465621991866
Loss for  1893 th iteration => 0.465377902751
Loss for  1894 th iteration => 0.465149647125
Loss for  1895 th iteration => 0.464890101959
Loss for  1896 th iteration => 0.464640778032
Loss for  1897 th iteration => 0.464394740991
Loss for  1898 th iteration => 0.464155913579
Loss for  1899 th iteration => 0.463908299793
Loss for  1900 th iteration => 0.463661272815
Loss for  1901 th iteration => 0.463414503921
Loss for  1902 th iteration => 0.463177557414
Loss for  1903 th iteration => 0.462927941185
Loss for  1904 th iteration => 0.462684469167
Loss for  1905 th iteration => 0.462434134312
Loss for  1906 th iteration => 0.462198622012
Loss for  1907 th iteration => 0.46195094502
Loss for  1908 th iteration => 0.461707865589
Loss for  1909 th iteration => 0.461460568127
Loss for  1910 th iteration => 0.461215765713
Loss for  1911 th iteration => 0.460979343355
Loss for  1912 th iteration => 0.460730319656
Loss for  1913 th iteration => 0.460489749194
Loss for  1914 th iteration => 0.460240534849
Loss for  1915 th iteration => 0.459998517067
Loss for  1916 th iteration => 0.459757149909
Loss for  1917 th iteration => 0.459513866067
Loss for  1918 th iteration => 0.459273468088
Loss for  1919 th iteration => 0.459025222184
Loss for  1920 th iteration => 0.458783465662
Loss for  1921 th iteration => 0.458539716542
Loss for  1922 th iteration => 0.4582946119
Loss for  1923 th iteration => 0.458061214481
Loss for  1924 th iteration => 0.457813679138
Loss for  1925 th iteration => 0.45756984687
Loss for  1926 th iteration => 0.457329259543
Loss for  1927 th iteration => 0.457082961908
Loss for  1928 th iteration => 0.456841892224
Loss for  1929 th iteration => 0.456600228731
Loss for  1930 th iteration => 0.45635591987
Loss for  1931 th iteration => 0.456119366723
Loss for  1932 th iteration => 0.455877586522
Loss for  1933 th iteration => 0.45563216961
Loss for  1934 th iteration => 0.455392616235
Loss for  1935 th iteration => 0.455150747943
Loss for  1936 th iteration => 0.454906214575
Loss for  1937 th iteration => 0.454666674195
Loss for  1938 th iteration => 0.454425986762
Loss for  1939 th iteration => 0.454182178018
Loss for  1940 th iteration => 0.453941354422
Loss for  1941 th iteration => 0.4537029317
Loss for  1942 th iteration => 0.453459729076
Loss for  1943 th iteration => 0.453216889453
Loss for  1944 th iteration => 0.452979972384
Loss for  1945 th iteration => 0.452738906997
Loss for  1946 th iteration => 0.452496525601
Loss for  1947 th iteration => 0.452258102667
Loss for  1948 th iteration => 0.452021455217
Loss for  1949 th iteration => 0.451778778884
Loss for  1950 th iteration => 0.451536467676
Loss for  1951 th iteration => 0.451298278787
Loss for  1952 th iteration => 0.451059415956
Loss for  1953 th iteration => 0.450817567711
Loss for  1954 th iteration => 0.450576027403
Loss for  1955 th iteration => 0.450339268427
Loss for  1956 th iteration => 0.450099995933
Loss for  1957 th iteration => 0.449858819927
Loss for  1958 th iteration => 0.449617911823
Loss for  1959 th iteration => 0.449380951996
Loss for  1960 th iteration => 0.449142737994
Loss for  1961 th iteration => 0.448902128829
Loss for  1962 th iteration => 0.448661759225
Loss for  1963 th iteration => 0.448423307728
Loss for  1964 th iteration => 0.448187342116
Loss for  1965 th iteration => 0.447947227875
Loss for  1966 th iteration => 0.447707332927
Loss for  1967 th iteration => 0.447467643734
Loss for  1968 th iteration => 0.447231258466
Loss for  1969 th iteration => 0.44699415413
Loss for  1970 th iteration => 0.44675467139
Loss for  1971 th iteration => 0.44651538338
Loss for  1972 th iteration => 0.446276279306
Loss for  1973 th iteration => 0.446040218194
Loss for  1974 th iteration => 0.445803630155
Loss for  1975 th iteration => 0.445564704374
Loss for  1976 th iteration => 0.445325957223
Loss for  1977 th iteration => 0.445090325984
Loss for  1978 th iteration => 0.4448526651
Loss for  1979 th iteration => 0.444617576992
Loss for  1980 th iteration => 0.444379533928
Loss for  1981 th iteration => 0.444141614722
Loss for  1982 th iteration => 0.443903816454
Loss for  1983 th iteration => 0.443666136515
Loss for  1984 th iteration => 0.443428572569
Loss for  1985 th iteration => 0.443192974922
Loss for  1986 th iteration => 0.442971883751
Loss for  1987 th iteration => 0.442723773679
Loss for  1988 th iteration => 0.442486652967
Loss for  1989 th iteration => 0.442249630287
Loss for  1990 th iteration => 0.442012705481
Loss for  1991 th iteration => 0.441775878407
Loss for  1992 th iteration => 0.441539391097
Loss for  1993 th iteration => 0.441308986298
Loss for  1994 th iteration => 0.441082942814
Loss for  1995 th iteration => 0.440837377908
Loss for  1996 th iteration => 0.440600954445
Loss for  1997 th iteration => 0.44036462757
Loss for  1998 th iteration => 0.440130912116
Loss for  1999 th iteration => 0.439898213405
Loss for  2000 th iteration => 0.439662068468
Loss for  2001 th iteration => 0.439426032224
Loss for  2002 th iteration => 0.439190102999
Loss for  2003 th iteration => 0.43895779541
Loss for  2004 th iteration => 0.438725583376
Loss for  2005 th iteration => 0.438489724528
Loss for  2006 th iteration => 0.438254075437
Loss for  2007 th iteration => 0.438020725262
Loss for  2008 th iteration => 0.437792996052
Loss for  2009 th iteration => 0.437564853357
Loss for  2010 th iteration => 0.437323273684
Loss for  2011 th iteration => 0.437087844609
Loss for  2012 th iteration => 0.43685426768
Loss for  2013 th iteration => 0.436627636594
Loss for  2014 th iteration => 0.436401095274
Loss for  2015 th iteration => 0.43615733296
Loss for  2016 th iteration => 0.435922145853
Loss for  2017 th iteration => 0.435694689836
Loss for  2018 th iteration => 0.435461297028
Loss for  2019 th iteration => 0.4352260834
Loss for  2020 th iteration => 0.4349921025
Loss for  2021 th iteration => 0.434772430601
Loss for  2022 th iteration => 0.434542823841
Loss for  2023 th iteration => 0.434298368394
Loss for  2024 th iteration => 0.434065514079
Loss for  2025 th iteration => 0.433846718952
Loss for  2026 th iteration => 0.433617852554
Loss for  2027 th iteration => 0.433371836308
Loss for  2028 th iteration => 0.433142494273
Loss for  2029 th iteration => 0.43292018214
Loss for  2030 th iteration => 0.432696185378
Loss for  2031 th iteration => 0.432446443456
Loss for  2032 th iteration => 0.432222459132
Loss for  2033 th iteration => 0.431993261198
Loss for  2034 th iteration => 0.431777313786
Loss for  2035 th iteration => 0.431525962369
Loss for  2036 th iteration => 0.43131146916
Loss for  2037 th iteration => 0.431082265183
Loss for  2038 th iteration => 0.430833145065
Loss for  2039 th iteration => 0.430611779194
Loss for  2040 th iteration => 0.430382807034
Loss for  2041 th iteration => 0.430168875767
Loss for  2042 th iteration => 0.429919305013
Loss for  2043 th iteration => 0.42970175811
Loss for  2044 th iteration => 0.429477940175
Loss for  2045 th iteration => 0.429230380586
Loss for  2046 th iteration => 0.429017210755
Loss for  2047 th iteration => 0.428790378965
Loss for  2048 th iteration => 0.42854350247
Loss for  2049 th iteration => 0.428331607288
Loss for  2050 th iteration => 0.428104265945
Loss for  2051 th iteration => 0.427858071427
Loss for  2052 th iteration => 0.427647812899
Loss for  2053 th iteration => 0.427421046049
Loss for  2054 th iteration => 0.4271761875
Loss for  2055 th iteration => 0.426959893996
Loss for  2056 th iteration => 0.426738840339
Loss for  2057 th iteration => 0.426494319509
Loss for  2058 th iteration => 0.42627339009
Loss for  2059 th iteration => 0.42605892282
Loss for  2060 th iteration => 0.425815615617
Loss for  2061 th iteration => 0.425586586807
Loss for  2062 th iteration => 0.425379801691
Loss for  2063 th iteration => 0.4251366524
Loss for  2064 th iteration => 0.424902921778
Loss for  2065 th iteration => 0.424678385328
Loss for  2066 th iteration => 0.42446013562
Loss for  2067 th iteration => 0.42424414926
Loss for  2068 th iteration => 0.424002691538
Loss for  2069 th iteration => 0.42377376725
Loss for  2070 th iteration => 0.423547106328
Loss for  2071 th iteration => 0.423330268864
Loss for  2072 th iteration => 0.423114121725
Loss for  2073 th iteration => 0.422873969484
Loss for  2074 th iteration => 0.422647995298
Loss for  2075 th iteration => 0.422420494994
Loss for  2076 th iteration => 0.422199301784
Loss for  2077 th iteration => 0.421969351818
Loss for  2078 th iteration => 0.421751360723
Loss for  2079 th iteration => 0.421524540652
Loss for  2080 th iteration => 0.421297165727
Loss for  2081 th iteration => 0.421075887636
Loss for  2082 th iteration => 0.420849957514
Loss for  2083 th iteration => 0.420630419291
Loss for  2084 th iteration => 0.420401826515
Loss for  2085 th iteration => 0.420178367641
Loss for  2086 th iteration => 0.419958086125
Loss for  2087 th iteration => 0.419729758086
Loss for  2088 th iteration => 0.419509847097
Loss for  2089 th iteration => 0.419283544005
Loss for  2090 th iteration => 0.419066584936
Loss for  2091 th iteration => 0.41883666411
Loss for  2092 th iteration => 0.418617820382
Loss for  2093 th iteration => 0.418393941741
Loss for  2094 th iteration => 0.418170422129
Loss for  2095 th iteration => 0.417946962768
Loss for  2096 th iteration => 0.417725765448
Loss for  2097 th iteration => 0.417503753331
Loss for  2098 th iteration => 0.41727799997
Loss for  2099 th iteration => 0.417061569222
Loss for  2100 th iteration => 0.416834369439
Loss for  2101 th iteration => 0.416614962053
Loss for  2102 th iteration => 0.416392029256
Loss for  2103 th iteration => 0.416171987157
Loss for  2104 th iteration => 0.415947612845
Loss for  2105 th iteration => 0.415730406106
Loss for  2106 th iteration => 0.41550431024
Loss for  2107 th iteration => 0.415286810977
Loss for  2108 th iteration => 0.415063808389
Loss for  2109 th iteration => 0.414841796627
Loss for  2110 th iteration => 0.414624445595
Loss for  2111 th iteration => 0.414400937196
Loss for  2112 th iteration => 0.414180720181
Loss for  2113 th iteration => 0.413960145308
Loss for  2114 th iteration => 0.413740505992
Loss for  2115 th iteration => 0.413517592565
Loss for  2116 th iteration => 0.41330224717
Loss for  2117 th iteration => 0.413073408168
Loss for  2118 th iteration => 0.412864663188
Loss for  2119 th iteration => 0.412635208875
Loss for  2120 th iteration => 0.412421334368
Loss for  2121 th iteration => 0.412198864515
Loss for  2122 th iteration => 0.411978103847
Loss for  2123 th iteration => 0.411763034482
Loss for  2124 th iteration => 0.411536479731
Loss for  2125 th iteration => 0.411327721648
Loss for  2126 th iteration => 0.411099281385
Loss for  2127 th iteration => 0.410885414384
Loss for  2128 th iteration => 0.410664948527
Loss for  2129 th iteration => 0.410443441562
Loss for  2130 th iteration => 0.410231023282
Loss for  2131 th iteration => 0.410003614864
Loss for  2132 th iteration => 0.409794315898
Loss for  2133 th iteration => 0.409570721825
Loss for  2134 th iteration => 0.409353734018
Loss for  2135 th iteration => 0.40913798079
Loss for  2136 th iteration => 0.408916776246
Loss for  2137 th iteration => 0.408700417542
Loss for  2138 th iteration => 0.408484267407
Loss for  2139 th iteration => 0.408260825472
Loss for  2140 th iteration => 0.408052040338
Loss for  2141 th iteration => 0.407829394491
Loss for  2142 th iteration => 0.407610310908
Loss for  2143 th iteration => 0.407398776809
Loss for  2144 th iteration => 0.407176240644
Loss for  2145 th iteration => 0.406961493296
Loss for  2146 th iteration => 0.406746755326
Loss for  2147 th iteration => 0.406526265894
Loss for  2148 th iteration => 0.406310935547
Loss for  2149 th iteration => 0.406097078162
Loss for  2150 th iteration => 0.405876567281
Loss for  2151 th iteration => 0.405662202402
Loss for  2152 th iteration => 0.405449199546
Loss for  2153 th iteration => 0.405227916446
Loss for  2154 th iteration => 0.405014464845
Loss for  2155 th iteration => 0.404800512019
Loss for  2156 th iteration => 0.404601216021
Loss for  2157 th iteration => 0.404367233712
Loss for  2158 th iteration => 0.404159515274
Loss for  2159 th iteration => 0.403954819725
Loss for  2160 th iteration => 0.403726241249
Loss for  2161 th iteration => 0.403514558542
Loss for  2162 th iteration => 0.403312404856
Loss for  2163 th iteration => 0.403082862661
Loss for  2164 th iteration => 0.402872406644
Loss for  2165 th iteration => 0.402671599746
Loss for  2166 th iteration => 0.402437641155
Loss for  2167 th iteration => 0.402236971618
Loss for  2168 th iteration => 0.402027015029
Loss for  2169 th iteration => 0.401793846378
Loss for  2170 th iteration => 0.401604741888
Loss for  2171 th iteration => 0.401383219414
Loss for  2172 th iteration => 0.401153561893
Loss for  2173 th iteration => 0.400970011774
Loss for  2174 th iteration => 0.40074087481
Loss for  2175 th iteration => 0.400518082057
Loss for  2176 th iteration => 0.40033100107
Loss for  2177 th iteration => 0.400098589578
Loss for  2178 th iteration => 0.399891555426
Loss for  2179 th iteration => 0.399687799282
Loss for  2180 th iteration => 0.399456886655
Loss for  2181 th iteration => 0.399266445957
Loss for  2182 th iteration => 0.399045416746
Loss for  2183 th iteration => 0.398818730295
Loss for  2184 th iteration => 0.398613297594
Loss for  2185 th iteration => 0.39841359074
Loss for  2186 th iteration => 0.398204820838
Loss for  2187 th iteration => 0.397975058581
Loss for  2188 th iteration => 0.397767133257
Loss for  2189 th iteration => 0.397572022462
Loss for  2190 th iteration => 0.397360124423
Loss for  2191 th iteration => 0.397131815554
Loss for  2192 th iteration => 0.396925691822
Loss for  2193 th iteration => 0.396733485771
Loss for  2194 th iteration => 0.396515656744
Loss for  2195 th iteration => 0.396293997641
Loss for  2196 th iteration => 0.396083766343
Loss for  2197 th iteration => 0.395895757505
Loss for  2198 th iteration => 0.395673124472
Loss for  2199 th iteration => 0.395460957502
Loss for  2200 th iteration => 0.395266346382
Loss for  2201 th iteration => 0.395034802446
Loss for  2202 th iteration => 0.394830745124
Loss for  2203 th iteration => 0.394633554967
Loss for  2204 th iteration => 0.394422897076
Loss for  2205 th iteration => 0.394202571289
Loss for  2206 th iteration => 0.39399412726
Loss for  2207 th iteration => 0.393788018771
Loss for  2208 th iteration => 0.393589029987
Loss for  2209 th iteration => 0.393378194505
Loss for  2210 th iteration => 0.3931652141
Loss for  2211 th iteration => 0.392956214868
Loss for  2212 th iteration => 0.392745377446
Loss for  2213 th iteration => 0.392541685809
Loss for  2214 th iteration => 0.392337246961
Loss for  2215 th iteration => 0.39213003745
Loss for  2216 th iteration => 0.391919758211
Loss for  2217 th iteration => 0.391710193302
Loss for  2218 th iteration => 0.391505160618
Loss for  2219 th iteration => 0.391306630603
Loss for  2220 th iteration => 0.391096770255
Loss for  2221 th iteration => 0.390887622604
Loss for  2222 th iteration => 0.390679113097
Loss for  2223 th iteration => 0.390472411598
Loss for  2224 th iteration => 0.390271424654
Loss for  2225 th iteration => 0.390068505291
Loss for  2226 th iteration => 0.389860192112
Loss for  2227 th iteration => 0.389652477163
Loss for  2228 th iteration => 0.389445335645
Loss for  2229 th iteration => 0.389242445464
Loss for  2230 th iteration => 0.389035641695
Loss for  2231 th iteration => 0.38883405103
Loss for  2232 th iteration => 0.388630345818
Loss for  2233 th iteration => 0.388423693373
Loss for  2234 th iteration => 0.388217523383
Loss for  2235 th iteration => 0.388013980823
Loss for  2236 th iteration => 0.387809810828
Loss for  2237 th iteration => 0.387604343002
Loss for  2238 th iteration => 0.387399262997
Loss for  2239 th iteration => 0.387196250946
Loss for  2240 th iteration => 0.38699586821
Loss for  2241 th iteration => 0.386790801685
Loss for  2242 th iteration => 0.386586746946
Loss for  2243 th iteration => 0.386384909718
Loss for  2244 th iteration => 0.386180450457
Loss for  2245 th iteration => 0.385976343074
Loss for  2246 th iteration => 0.38577255729
Loss for  2247 th iteration => 0.385569066561
Loss for  2248 th iteration => 0.385365847611
Loss for  2249 th iteration => 0.385162880026
Loss for  2250 th iteration => 0.384960145903
Loss for  2251 th iteration => 0.384757742714
Loss for  2252 th iteration => 0.384559104065
Loss for  2253 th iteration => 0.38435715199
Loss for  2254 th iteration => 0.384156217033
Loss for  2255 th iteration => 0.383953516709
Loss for  2256 th iteration => 0.383751091313
Loss for  2257 th iteration => 0.383548920057
Loss for  2258 th iteration => 0.383346984715
Loss for  2259 th iteration => 0.383145269305
Loss for  2260 th iteration => 0.382943759812
Loss for  2261 th iteration => 0.38274244394
Loss for  2262 th iteration => 0.382541310902
Loss for  2263 th iteration => 0.382340351229
Loss for  2264 th iteration => 0.382139556612
Loss for  2265 th iteration => 0.381938919753
Loss for  2266 th iteration => 0.381738434244
Loss for  2267 th iteration => 0.381538094456
Loss for  2268 th iteration => 0.381337895441
Loss for  2269 th iteration => 0.381137832851
Loss for  2270 th iteration => 0.380937902864
Loss for  2271 th iteration => 0.380738102118
Loss for  2272 th iteration => 0.380538513171
Loss for  2273 th iteration => 0.380340021727
Loss for  2274 th iteration => 0.380140437239
Loss for  2275 th iteration => 0.379940989417
Loss for  2276 th iteration => 0.379741674453
Loss for  2277 th iteration => 0.379542488997
Loss for  2278 th iteration => 0.379343430104
Loss for  2279 th iteration => 0.379144495182
Loss for  2280 th iteration => 0.378945681949
Loss for  2281 th iteration => 0.3787469884
Loss for  2282 th iteration => 0.378548412766
Loss for  2283 th iteration => 0.37834995349
Loss for  2284 th iteration => 0.378151609203
Loss for  2285 th iteration => 0.377953378694
Loss for  2286 th iteration => 0.377755260901
Loss for  2287 th iteration => 0.377557254886
Loss for  2288 th iteration => 0.377359359821
Loss for  2289 th iteration => 0.377161574979
Loss for  2290 th iteration => 0.376963899718
Loss for  2291 th iteration => 0.376766333474
Loss for  2292 th iteration => 0.376568875748
Loss for  2293 th iteration => 0.376371526104
Loss for  2294 th iteration => 0.376174284155
Loss for  2295 th iteration => 0.375977149564
Loss for  2296 th iteration => 0.375780122033
Loss for  2297 th iteration => 0.375583201303
Loss for  2298 th iteration => 0.375386387146
Loss for  2299 th iteration => 0.375189679363
Loss for  2300 th iteration => 0.374993077781
Loss for  2301 th iteration => 0.37479658225
Loss for  2302 th iteration => 0.374600192639
Loss for  2303 th iteration => 0.374403908836
Loss for  2304 th iteration => 0.374207730746
Loss for  2305 th iteration => 0.374011658287
Loss for  2306 th iteration => 0.37381569139
Loss for  2307 th iteration => 0.373619829999
Loss for  2308 th iteration => 0.373424074065
Loss for  2309 th iteration => 0.373228423552
Loss for  2310 th iteration => 0.373032878429
Loss for  2311 th iteration => 0.372837438675
Loss for  2312 th iteration => 0.372642104273
Loss for  2313 th iteration => 0.372446875214
Loss for  2314 th iteration => 0.372252364044
Loss for  2315 th iteration => 0.37205704426
Loss for  2316 th iteration => 0.371862161369
Loss for  2317 th iteration => 0.371667383149
Loss for  2318 th iteration => 0.371472709464
Loss for  2319 th iteration => 0.371278140219
Loss for  2320 th iteration => 0.371083675349
Loss for  2321 th iteration => 0.370889314817
Loss for  2322 th iteration => 0.37069680098
Loss for  2323 th iteration => 0.370501736091
Loss for  2324 th iteration => 0.370307638176
Loss for  2325 th iteration => 0.370113656846
Loss for  2326 th iteration => 0.369919790144
Loss for  2327 th iteration => 0.369726624993
Loss for  2328 th iteration => 0.369533490588
Loss for  2329 th iteration => 0.369339883916
Loss for  2330 th iteration => 0.36914639983
Loss for  2331 th iteration => 0.368953035524
Loss for  2332 th iteration => 0.368760119451
Loss for  2333 th iteration => 0.368568497997
Loss for  2334 th iteration => 0.368375293721
Loss for  2335 th iteration => 0.368182231872
Loss for  2336 th iteration => 0.36798994342
Loss for  2337 th iteration => 0.367798122611
Loss for  2338 th iteration => 0.367605313321
Loss for  2339 th iteration => 0.367412650779
Loss for  2340 th iteration => 0.367220375972
Loss for  2341 th iteration => 0.36703032645
Loss for  2342 th iteration => 0.366837783121
Loss for  2343 th iteration => 0.366645410244
Loss for  2344 th iteration => 0.366455559589
Loss for  2345 th iteration => 0.366263081389
Loss for  2346 th iteration => 0.366070977958
Loss for  2347 th iteration => 0.365879858131
Loss for  2348 th iteration => 0.365690459551
Loss for  2349 th iteration => 0.365498464194
Loss for  2350 th iteration => 0.36530748843
Loss for  2351 th iteration => 0.36511886652
Loss for  2352 th iteration => 0.364929013668
Loss for  2353 th iteration => 0.36473526322
Loss for  2354 th iteration => 0.364547646211
Loss for  2355 th iteration => 0.36435601816
Loss for  2356 th iteration => 0.364164628582
Loss for  2357 th iteration => 0.363975652137
Loss for  2358 th iteration => 0.363787266229
Loss for  2359 th iteration => 0.363602086217
Loss for  2360 th iteration => 0.363408399329
Loss for  2361 th iteration => 0.363217181756
Loss for  2362 th iteration => 0.363026235649
Loss for  2363 th iteration => 0.36283729557
Loss for  2364 th iteration => 0.36265416404
Loss for  2365 th iteration => 0.362466266088
Loss for  2366 th iteration => 0.362272446498
Loss for  2367 th iteration => 0.362091746069
Loss for  2368 th iteration => 0.361892159858
Loss for  2369 th iteration => 0.361705376641
Loss for  2370 th iteration => 0.361527538315
Loss for  2371 th iteration => 0.361333344875
Loss for  2372 th iteration => 0.361142389326
Loss for  2373 th iteration => 0.360960491853
Loss for  2374 th iteration => 0.360766908644
Loss for  2375 th iteration => 0.36057768207
Loss for  2376 th iteration => 0.360397598673
Loss for  2377 th iteration => 0.360205581093
Loss for  2378 th iteration => 0.360019206601
Loss for  2379 th iteration => 0.359838806861
Loss for  2380 th iteration => 0.359639017682
Loss for  2381 th iteration => 0.359452858789
Loss for  2382 th iteration => 0.359274934053
Loss for  2383 th iteration => 0.359087662469
Loss for  2384 th iteration => 0.358893304975
Loss for  2385 th iteration => 0.358717294739
Loss for  2386 th iteration => 0.358521198866
Loss for  2387 th iteration => 0.358345503742
Loss for  2388 th iteration => 0.358153197284
Loss for  2389 th iteration => 0.357961026706
Loss for  2390 th iteration => 0.357790691277
Loss for  2391 th iteration => 0.357590484357
Loss for  2392 th iteration => 0.357413567502
Loss for  2393 th iteration => 0.357223333788
Loss for  2394 th iteration => 0.357037488371
Loss for  2395 th iteration => 0.356860261341
Loss for  2396 th iteration => 0.356661439675
Loss for  2397 th iteration => 0.356487794842
Loss for  2398 th iteration => 0.356300737187
Loss for  2399 th iteration => 0.356106517911
Loss for  2400 th iteration => 0.355935653775
Loss for  2401 th iteration => 0.355742648722
Loss for  2402 th iteration => 0.355560257827
Loss for  2403 th iteration => 0.35537599521
Loss for  2404 th iteration => 0.355187220279
Loss for  2405 th iteration => 0.355001988521
Loss for  2406 th iteration => 0.354824309576
Loss for  2407 th iteration => 0.35464266508
Loss for  2408 th iteration => 0.35445018173
Loss for  2409 th iteration => 0.354263438965
Loss for  2410 th iteration => 0.35408510328
Loss for  2411 th iteration => 0.353907771715
Loss for  2412 th iteration => 0.353723146717
Loss for  2413 th iteration => 0.353532624683
Loss for  2414 th iteration => 0.353352232655
Loss for  2415 th iteration => 0.353172190052
Loss for  2416 th iteration => 0.352993468575
Loss for  2417 th iteration => 0.352802703912
Loss for  2418 th iteration => 0.352618049445
Loss for  2419 th iteration => 0.352443640925
Loss for  2420 th iteration => 0.352263221483
Loss for  2421 th iteration => 0.352069571195
Loss for  2422 th iteration => 0.351893414724
Loss for  2423 th iteration => 0.3517135726
Loss for  2424 th iteration => 0.351531902141
Loss for  2425 th iteration => 0.351347621561
Loss for  2426 th iteration => 0.351164394338
Loss for  2427 th iteration => 0.350981044741
Loss for  2428 th iteration => 0.350811576007
Loss for  2429 th iteration => 0.350617432366
Loss for  2430 th iteration => 0.350441228377
Loss for  2431 th iteration => 0.350265301988
Loss for  2432 th iteration => 0.350072739857
Loss for  2433 th iteration => 0.349899163034
Loss for  2434 th iteration => 0.349715688348
Loss for  2435 th iteration => 0.349532071515
Loss for  2436 th iteration => 0.349355292784
Loss for  2437 th iteration => 0.349173317028
Loss for  2438 th iteration => 0.348995567399
Loss for  2439 th iteration => 0.348813460704
Loss for  2440 th iteration => 0.348630649348
Loss for  2441 th iteration => 0.34845761668
Loss for  2442 th iteration => 0.348271880644
Loss for  2443 th iteration => 0.348095960603
Loss for  2444 th iteration => 0.347915807392
Loss for  2445 th iteration => 0.347734784303
Loss for  2446 th iteration => 0.347557857446
Loss for  2447 th iteration => 0.34737387045
Loss for  2448 th iteration => 0.34720280185
Loss for  2449 th iteration => 0.347018172295
Loss for  2450 th iteration => 0.346839938632
Loss for  2451 th iteration => 0.346664141811
Loss for  2452 th iteration => 0.346481172454
Loss for  2453 th iteration => 0.346306231859
Loss for  2454 th iteration => 0.346124475371
Loss for  2455 th iteration => 0.345952145096
Loss for  2456 th iteration => 0.345768037145
Loss for  2457 th iteration => 0.345593469178
Loss for  2458 th iteration => 0.345415239426
Loss for  2459 th iteration => 0.345234742346
Loss for  2460 th iteration => 0.345059930078
Loss for  2461 th iteration => 0.344879103856
Loss for  2462 th iteration => 0.34470557797
Loss for  2463 th iteration => 0.344523075309
Loss for  2464 th iteration => 0.344353861266
Loss for  2465 th iteration => 0.344169065289
Loss for  2466 th iteration => 0.343997703769
Loss for  2467 th iteration => 0.343817775078
Loss for  2468 th iteration => 0.343641480168
Loss for  2469 th iteration => 0.343464680628
Loss for  2470 th iteration => 0.343288324755
Loss for  2471 th iteration => 0.34311091931
Loss for  2472 th iteration => 0.342935534652
Loss for  2473 th iteration => 0.342760394211
Loss for  2474 th iteration => 0.342579764387
Loss for  2475 th iteration => 0.342410231211
Loss for  2476 th iteration => 0.342228497133
Loss for  2477 th iteration => 0.342056194845
Loss for  2478 th iteration => 0.341878035333
Loss for  2479 th iteration => 0.3417049066
Loss for  2480 th iteration => 0.341527124254
Loss for  2481 th iteration => 0.341355412139
Loss for  2482 th iteration => 0.341174292528
Loss for  2483 th iteration => 0.341006054968
Loss for  2484 th iteration => 0.340825331524
Loss for  2485 th iteration => 0.340654422552
Loss for  2486 th iteration => 0.340477011894
Loss for  2487 th iteration => 0.340302724064
Loss for  2488 th iteration => 0.340130271791
Loss for  2489 th iteration => 0.339954508141
Loss for  2490 th iteration => 0.339779842433
Loss for  2491 th iteration => 0.33960645948
Loss for  2492 th iteration => 0.339430286066
Loss for  2493 th iteration => 0.339258599917
Loss for  2494 th iteration => 0.339084002185
Loss for  2495 th iteration => 0.338925756011
Loss for  2496 th iteration => 0.33874265972
Loss for  2497 th iteration => 0.338560362647
Loss for  2498 th iteration => 0.33839886028
Loss for  2499 th iteration => 0.338228222896
Loss for  2500 th iteration => 0.338049953252
Loss for  2501 th iteration => 0.337866050761
Loss for  2502 th iteration => 0.337701106706
Loss for  2503 th iteration => 0.337522318197
Loss for  2504 th iteration => 0.337352682958
Loss for  2505 th iteration => 0.337180847232
Loss for  2506 th iteration => 0.337022371666
Loss for  2507 th iteration => 0.336841967281
Loss for  2508 th iteration => 0.336658138317
Loss for  2509 th iteration => 0.336505341142
Loss for  2510 th iteration => 0.336329484247
Loss for  2511 th iteration => 0.336151097041
Loss for  2512 th iteration => 0.335972839765
Loss for  2513 th iteration => 0.335822500396
Loss for  2514 th iteration => 0.335638310408
Loss for  2515 th iteration => 0.335459158893
Loss for  2516 th iteration => 0.335302377406
Loss for  2517 th iteration => 0.335131323236
Loss for  2518 th iteration => 0.334948954224
Loss for  2519 th iteration => 0.334778330848
Loss for  2520 th iteration => 0.334623136565
Loss for  2521 th iteration => 0.334441210665
Loss for  2522 th iteration => 0.334259582829
Loss for  2523 th iteration => 0.334113200023
Loss for  2524 th iteration => 0.333934591381
Loss for  2525 th iteration => 0.333753051889
Loss for  2526 th iteration => 0.333590990668
Loss for  2527 th iteration => 0.333427795396
Loss for  2528 th iteration => 0.333246214601
Loss for  2529 th iteration => 0.333073880099
Loss for  2530 th iteration => 0.332921911116
Loss for  2531 th iteration => 0.33274080438
Loss for  2532 th iteration => 0.332559773291
Loss for  2533 th iteration => 0.332399110535
Loss for  2534 th iteration => 0.332226898098
Loss for  2535 th iteration => 0.332075733151
Loss for  2536 th iteration => 0.33189171785
Loss for  2537 th iteration => 0.331719196982
Loss for  2538 th iteration => 0.331550181044
Loss for  2539 th iteration => 0.331389738825
Loss for  2540 th iteration => 0.331228035315
Loss for  2541 th iteration => 0.331049306426
Loss for  2542 th iteration => 0.33087734082
Loss for  2543 th iteration => 0.330726785016
Loss for  2544 th iteration => 0.330544793567
Loss for  2545 th iteration => 0.33037122829
Loss for  2546 th iteration => 0.330205223448
Loss for  2547 th iteration => 0.330044289517
Loss for  2548 th iteration => 0.329884615611
Loss for  2549 th iteration => 0.329700770049
Loss for  2550 th iteration => 0.329541480833
Loss for  2551 th iteration => 0.32938085138
Loss for  2552 th iteration => 0.329200790927
Loss for  2553 th iteration => 0.32903880009
Loss for  2554 th iteration => 0.328879908911
Loss for  2555 th iteration => 0.328701098202
Loss for  2556 th iteration => 0.328535302417
Loss for  2557 th iteration => 0.328381752074
Loss for  2558 th iteration => 0.328198784377
Loss for  2559 th iteration => 0.328037628568
Loss for  2560 th iteration => 0.327880660729
Loss for  2561 th iteration => 0.32769755664
Loss for  2562 th iteration => 0.327543079047
Loss for  2563 th iteration => 0.327380054215
Loss for  2564 th iteration => 0.32720041928
Loss for  2565 th iteration => 0.32704545556
Loss for  2566 th iteration => 0.326881139869
Loss for  2567 th iteration => 0.326703742045
Loss for  2568 th iteration => 0.326542974711
Loss for  2569 th iteration => 0.326375624511
Loss for  2570 th iteration => 0.326206977592
Loss for  2571 th iteration => 0.326046613256
Loss for  2572 th iteration => 0.325883417781
Loss for  2573 th iteration => 0.325711412705
Loss for  2574 th iteration => 0.325553303319
Loss for  2575 th iteration => 0.325388760417
Loss for  2576 th iteration => 0.325217425252
Loss for  2577 th iteration => 0.325060156366
Loss for  2578 th iteration => 0.324895606512
Loss for  2579 th iteration => 0.324726096245
Loss for  2580 th iteration => 0.324566665606
Loss for  2581 th iteration => 0.324403383802
Loss for  2582 th iteration => 0.324236327915
Loss for  2583 th iteration => 0.324072912171
Loss for  2584 th iteration => 0.323912530045
Loss for  2585 th iteration => 0.323747547451
Loss for  2586 th iteration => 0.323580226337
Loss for  2587 th iteration => 0.323421516212
Loss for  2588 th iteration => 0.323260920221
Loss for  2589 th iteration => 0.323091886431
Loss for  2590 th iteration => 0.322930070506
Loss for  2591 th iteration => 0.322773240348
Loss for  2592 th iteration => 0.3226046113
Loss for  2593 th iteration => 0.322439301576
Loss for  2594 th iteration => 0.322286610196
Loss for  2595 th iteration => 0.322118367145
Loss for  2596 th iteration => 0.321953505935
Loss for  2597 th iteration => 0.321794668537
Loss for  2598 th iteration => 0.321633932176
Loss for  2599 th iteration => 0.321469209864
Loss for  2600 th iteration => 0.321306140267
Loss for  2601 th iteration => 0.321147342219
Loss for  2602 th iteration => 0.320985872514
Loss for  2603 th iteration => 0.320823808804
Loss for  2604 th iteration => 0.320657856566
Loss for  2605 th iteration => 0.32050341168
Loss for  2606 th iteration => 0.320342197091
Loss for  2607 th iteration => 0.320176455923
Loss for  2608 th iteration => 0.320013424142
Loss for  2609 th iteration => 0.319863040272
Loss for  2610 th iteration => 0.319706453762
Loss for  2611 th iteration => 0.319536586556
Loss for  2612 th iteration => 0.319372119127
Loss for  2613 th iteration => 0.319217290902
Loss for  2614 th iteration => 0.319056685131
Loss for  2615 th iteration => 0.318894005248
Loss for  2616 th iteration => 0.318730224745
Loss for  2617 th iteration => 0.318575202728
Loss for  2618 th iteration => 0.318423245772
Loss for  2619 th iteration => 0.318265145334
Loss for  2620 th iteration => 0.318096061546
Loss for  2621 th iteration => 0.317934758443
Loss for  2622 th iteration => 0.317792925007
Loss for  2623 th iteration => 0.317624148435
Loss for  2624 th iteration => 0.317457664244
Loss for  2625 th iteration => 0.317297360313
Loss for  2626 th iteration => 0.317149695066
Loss for  2627 th iteration => 0.316995158184
Loss for  2628 th iteration => 0.316826854976
Loss for  2629 th iteration => 0.316670269227
Loss for  2630 th iteration => 0.3165178197
Loss for  2631 th iteration => 0.316351148851
Loss for  2632 th iteration => 0.316197255915
Loss for  2633 th iteration => 0.316048325311
Loss for  2634 th iteration => 0.315880378841
Loss for  2635 th iteration => 0.315716105872
Loss for  2636 th iteration => 0.315573137038
Loss for  2637 th iteration => 0.315408494311
Loss for  2638 th iteration => 0.315247357357
Loss for  2639 th iteration => 0.315104909782
Loss for  2640 th iteration => 0.314937356253
Loss for  2641 th iteration => 0.314770618775
Loss for  2642 th iteration => 0.31463160848
Loss for  2643 th iteration => 0.31446546689
Loss for  2644 th iteration => 0.314305214492
Loss for  2645 th iteration => 0.314164716203
Loss for  2646 th iteration => 0.313997606117
Loss for  2647 th iteration => 0.313832301607
Loss for  2648 th iteration => 0.313681481991
Loss for  2649 th iteration => 0.31352248474
Loss for  2650 th iteration => 0.313380323397
Loss for  2651 th iteration => 0.313216256302
Loss for  2652 th iteration => 0.313060015705
Loss for  2653 th iteration => 0.312912675023
Loss for  2654 th iteration => 0.312746085844
Loss for  2655 th iteration => 0.312591844299
Loss for  2656 th iteration => 0.312443464858
Loss for  2657 th iteration => 0.312276829858
Loss for  2658 th iteration => 0.312129907032
Loss for  2659 th iteration => 0.311979601498
Loss for  2660 th iteration => 0.31181363977
Loss for  2661 th iteration => 0.311667974554
Loss for  2662 th iteration => 0.311512111372
Loss for  2663 th iteration => 0.31134610405
Loss for  2664 th iteration => 0.31120499022
Loss for  2665 th iteration => 0.311045377921
Loss for  2666 th iteration => 0.310881870144
Loss for  2667 th iteration => 0.310735236414
Loss for  2668 th iteration => 0.310581872487
Loss for  2669 th iteration => 0.310436580381
Loss for  2670 th iteration => 0.310271058627
Loss for  2671 th iteration => 0.310122225388
Loss for  2672 th iteration => 0.309971703478
Loss for  2673 th iteration => 0.309806188517
Loss for  2674 th iteration => 0.309665251462
Loss for  2675 th iteration => 0.309507486855
Loss for  2676 th iteration => 0.309346659022
Loss for  2677 th iteration => 0.309196217577
Loss for  2678 th iteration => 0.309045052457
Loss for  2679 th iteration => 0.30890135446
Loss for  2680 th iteration => 0.308736565537
Loss for  2681 th iteration => 0.308594936483
Loss for  2682 th iteration => 0.308439131174
Loss for  2683 th iteration => 0.308278983581
Loss for  2684 th iteration => 0.308129175297
Loss for  2685 th iteration => 0.307977225727
Loss for  2686 th iteration => 0.30781989658
Loss for  2687 th iteration => 0.30767474879
Loss for  2688 th iteration => 0.307519317048
Loss for  2689 th iteration => 0.307365383377
Loss for  2690 th iteration => 0.30722244858
Loss for  2691 th iteration => 0.307069630719
Loss for  2692 th iteration => 0.306911145264
Loss for  2693 th iteration => 0.306760554167
Loss for  2694 th iteration => 0.306613635032
Loss for  2695 th iteration => 0.306465423137
Loss for  2696 th iteration => 0.306305705486
Loss for  2697 th iteration => 0.306156685944
Loss for  2698 th iteration => 0.306008398115
Loss for  2699 th iteration => 0.305862156189
Loss for  2700 th iteration => 0.305702627623
Loss for  2701 th iteration => 0.30555382161
Loss for  2702 th iteration => 0.305406193899
Loss for  2703 th iteration => 0.305259895937
Loss for  2704 th iteration => 0.305101660224
Loss for  2705 th iteration => 0.304952027042
Loss for  2706 th iteration => 0.304806628733
Loss for  2707 th iteration => 0.304658718474
Loss for  2708 th iteration => 0.304502629486
Loss for  2709 th iteration => 0.304351372239
Loss for  2710 th iteration => 0.304209414356
Loss for  2711 th iteration => 0.304058697132
Loss for  2712 th iteration => 0.303905413228
Loss for  2713 th iteration => 0.303752680878
Loss for  2714 th iteration => 0.303609473833
Loss for  2715 th iteration => 0.303458851555
Loss for  2716 th iteration => 0.303306453676
Loss for  2717 th iteration => 0.303160993128
Loss for  2718 th iteration => 0.303013202592
Loss for  2719 th iteration => 0.302861130776
Loss for  2720 th iteration => 0.30271327014
Loss for  2721 th iteration => 0.302568874473
Loss for  2722 th iteration => 0.302422257158
Loss for  2723 th iteration => 0.302268887862
Loss for  2724 th iteration => 0.302117524929
Loss for  2725 th iteration => 0.301975366454
Loss for  2726 th iteration => 0.301825909772
Loss for  2727 th iteration => 0.301674828142
Loss for  2728 th iteration => 0.301529334643
Loss for  2729 th iteration => 0.301383750621
Loss for  2730 th iteration => 0.301232951426
Loss for  2731 th iteration => 0.301084157011
Loss for  2732 th iteration => 0.300944024497
Loss for  2733 th iteration => 0.300794948273
Loss for  2734 th iteration => 0.300645169109
Loss for  2735 th iteration => 0.300495015702
Loss for  2736 th iteration => 0.300350704419
Loss for  2737 th iteration => 0.300205486247
Loss for  2738 th iteration => 0.300055590163
Loss for  2739 th iteration => 0.29990738481
Loss for  2740 th iteration => 0.299768042214
Loss for  2741 th iteration => 0.299619491013
Loss for  2742 th iteration => 0.29947099688
Loss for  2743 th iteration => 0.299321709049
Loss for  2744 th iteration => 0.299177177834
Loss for  2745 th iteration => 0.299033680562
Loss for  2746 th iteration => 0.298884635679
Loss for  2747 th iteration => 0.298735851037
Loss for  2748 th iteration => 0.298596099466
Loss for  2749 th iteration => 0.29844861136
Loss for  2750 th iteration => 0.298300052253
Loss for  2751 th iteration => 0.298155543344
Loss for  2752 th iteration => 0.298013316588
Loss for  2753 th iteration => 0.297864992486
Loss for  2754 th iteration => 0.297716923171
Loss for  2755 th iteration => 0.297576822353
Loss for  2756 th iteration => 0.29743096562
Loss for  2757 th iteration => 0.297283116031
Loss for  2758 th iteration => 0.297138070838
Loss for  2759 th iteration => 0.296998352105
Loss for  2760 th iteration => 0.296853358374
Loss for  2761 th iteration => 0.296705882155
Loss for  2762 th iteration => 0.296558582048
Loss for  2763 th iteration => 0.296415745757
Loss for  2764 th iteration => 0.296274059952
Loss for  2765 th iteration => 0.296126979953
Loss for  2766 th iteration => 0.295980144559
Loss for  2767 th iteration => 0.295840996011
Loss for  2768 th iteration => 0.295696381546
Loss for  2769 th iteration => 0.295549754889
Loss for  2770 th iteration => 0.295405286862
Loss for  2771 th iteration => 0.295267330882
Loss for  2772 th iteration => 0.295122779986
Loss for  2773 th iteration => 0.29497741694
Loss for  2774 th iteration => 0.294831756397
Loss for  2775 th iteration => 0.294687858842
Loss for  2776 th iteration => 0.294550759513
Loss for  2777 th iteration => 0.294405149589
Loss for  2778 th iteration => 0.294261621724
Loss for  2779 th iteration => 0.294116353978
Loss for  2780 th iteration => 0.29397127493
Loss for  2781 th iteration => 0.293833132357
Loss for  2782 th iteration => 0.293690149763
Loss for  2783 th iteration => 0.293545222182
Loss for  2784 th iteration => 0.29340048825
Loss for  2785 th iteration => 0.293262908791
Loss for  2786 th iteration => 0.29312109146
Loss for  2787 th iteration => 0.292976868063
Loss for  2788 th iteration => 0.292832798929
Loss for  2789 th iteration => 0.292693920095
Loss for  2790 th iteration => 0.292553062536
Loss for  2791 th iteration => 0.292409124506
Loss for  2792 th iteration => 0.292265351276
Loss for  2793 th iteration => 0.292124808411
Loss for  2794 th iteration => 0.291986812044
Loss for  2795 th iteration => 0.291844852848
Loss for  2796 th iteration => 0.291704290016
Loss for  2797 th iteration => 0.291560245354
Loss for  2798 th iteration => 0.291417263129
Loss for  2799 th iteration => 0.291279077217
Loss for  2800 th iteration => 0.291139198133
Loss for  2801 th iteration => 0.290996347016
Loss for  2802 th iteration => 0.290853643213
Loss for  2803 th iteration => 0.290712186165
Loss for  2804 th iteration => 0.290578161901
Loss for  2805 th iteration => 0.290436057686
Loss for  2806 th iteration => 0.290295832756
Loss for  2807 th iteration => 0.290153703165
Loss for  2808 th iteration => 0.290011701899
Loss for  2809 th iteration => 0.289870088508
Loss for  2810 th iteration => 0.28973769545
Loss for  2811 th iteration => 0.289591739843
Loss for  2812 th iteration => 0.289454178422
Loss for  2813 th iteration => 0.289315420313
Loss for  2814 th iteration => 0.289173521122
Loss for  2815 th iteration => 0.289032273362
Loss for  2816 th iteration => 0.288895234451
Loss for  2817 th iteration => 0.28875692791
Loss for  2818 th iteration => 0.288615834852
Loss for  2819 th iteration => 0.288474868827
Loss for  2820 th iteration => 0.288336626655
Loss for  2821 th iteration => 0.288200689902
Loss for  2822 th iteration => 0.288062713766
Loss for  2823 th iteration => 0.287922742411
Loss for  2824 th iteration => 0.287783526619
Loss for  2825 th iteration => 0.287643145929
Loss for  2826 th iteration => 0.287503559009
Loss for  2827 th iteration => 0.287366162319
Loss for  2828 th iteration => 0.287229447536
Loss for  2829 th iteration => 0.287093322613
Loss for  2830 th iteration => 0.286952409887
Loss for  2831 th iteration => 0.286814911228
Loss for  2832 th iteration => 0.286676087001
Loss for  2833 th iteration => 0.286539122947
Loss for  2834 th iteration => 0.286399677417
Loss for  2835 th iteration => 0.28626089995
Loss for  2836 th iteration => 0.286129643736
Loss for  2837 th iteration => 0.285985000781
Loss for  2838 th iteration => 0.285854773199
Loss for  2839 th iteration => 0.285713893193
Loss for  2840 th iteration => 0.285578880525
Loss for  2841 th iteration => 0.285437622132
Loss for  2842 th iteration => 0.28530270946
Loss for  2843 th iteration => 0.285162518529
Loss for  2844 th iteration => 0.28503109372
Loss for  2845 th iteration => 0.284892622772
Loss for  2846 th iteration => 0.284754262665
Loss for  2847 th iteration => 0.284616012917
Loss for  2848 th iteration => 0.284477873084
Loss for  2849 th iteration => 0.284345530399
Loss for  2850 th iteration => 0.284212219996
Loss for  2851 th iteration => 0.28407066061
Loss for  2852 th iteration => 0.283938334787
Loss for  2853 th iteration => 0.283796090218
Loss for  2854 th iteration => 0.283669380719
Loss for  2855 th iteration => 0.283527200276
Loss for  2856 th iteration => 0.283393454378
Loss for  2857 th iteration => 0.283257385546
Loss for  2858 th iteration => 0.283120655837
Loss for  2859 th iteration => 0.282987633093
Loss for  2860 th iteration => 0.282850467119
Loss for  2861 th iteration => 0.282713423132
Loss for  2862 th iteration => 0.282576498942
Loss for  2863 th iteration => 0.282445919255
Loss for  2864 th iteration => 0.282312642469
Loss for  2865 th iteration => 0.282175929785
Loss for  2866 th iteration => 0.282039335874
Loss for  2867 th iteration => 0.281903510306
Loss for  2868 th iteration => 0.281771696988
Loss for  2869 th iteration => 0.281635258561
Loss for  2870 th iteration => 0.281498954551
Loss for  2871 th iteration => 0.281366134706
Loss for  2872 th iteration => 0.28123763965
Loss for  2873 th iteration => 0.281100428016
Loss for  2874 th iteration => 0.280964470635
Loss for  2875 th iteration => 0.280828640456
Loss for  2876 th iteration => 0.280697180927
Loss for  2877 th iteration => 0.28056285666
Loss for  2878 th iteration => 0.280427157713
Loss for  2879 th iteration => 0.280293878631
Loss for  2880 th iteration => 0.280166190809
Loss for  2881 th iteration => 0.280027942757
Loss for  2882 th iteration => 0.279895271633
Loss for  2883 th iteration => 0.279761985358
Loss for  2884 th iteration => 0.279628142679
Loss for  2885 th iteration => 0.279497440412
Loss for  2886 th iteration => 0.279363428747
Loss for  2887 th iteration => 0.279225509576
Loss for  2888 th iteration => 0.279099197906
Loss for  2889 th iteration => 0.278965748302
Loss for  2890 th iteration => 0.278831089927
Loss for  2891 th iteration => 0.278696740513
Loss for  2892 th iteration => 0.278568244137
Loss for  2893 th iteration => 0.278433692543
Loss for  2894 th iteration => 0.278299305103
Loss for  2895 th iteration => 0.278167779559
Loss for  2896 th iteration => 0.278037367856
Loss for  2897 th iteration => 0.27790308086
Loss for  2898 th iteration => 0.277769752977
Loss for  2899 th iteration => 0.277647422897
Loss for  2900 th iteration => 0.277508761214
Loss for  2901 th iteration => 0.277374806184
Loss for  2902 th iteration => 0.277244950919
Loss for  2903 th iteration => 0.277118675333
Loss for  2904 th iteration => 0.276981355612
Loss for  2905 th iteration => 0.276847759023
Loss for  2906 th iteration => 0.276722698354
Loss for  2907 th iteration => 0.276592135218
Loss for  2908 th iteration => 0.276455336312
Loss for  2909 th iteration => 0.276325739678
Loss for  2910 th iteration => 0.276197220489
Loss for  2911 th iteration => 0.276067518232
Loss for  2912 th iteration => 0.275931653093
Loss for  2913 th iteration => 0.275804878292
Loss for  2914 th iteration => 0.275671741816
Loss for  2915 th iteration => 0.275539777687
Loss for  2916 th iteration => 0.275418870252
Loss for  2917 th iteration => 0.275281550159
Loss for  2918 th iteration => 0.275148821328
Loss for  2919 th iteration => 0.275022197494
Loss for  2920 th iteration => 0.274892480787
Loss for  2921 th iteration => 0.27476310218
Loss for  2922 th iteration => 0.274631532815
Loss for  2923 th iteration => 0.274502511523
Loss for  2924 th iteration => 0.274370115863
Loss for  2925 th iteration => 0.274241777156
Loss for  2926 th iteration => 0.274114744643
Loss for  2927 th iteration => 0.27398576401
Loss for  2928 th iteration => 0.273853448969
Loss for  2929 th iteration => 0.273726178088
Loss for  2930 th iteration => 0.273594162152
Loss for  2931 th iteration => 0.27346592671
Loss for  2932 th iteration => 0.273338743239
Loss for  2933 th iteration => 0.273207199392
Loss for  2934 th iteration => 0.273084512641
Loss for  2935 th iteration => 0.272952521856
Loss for  2936 th iteration => 0.272820930631
Loss for  2937 th iteration => 0.272694266695
Loss for  2938 th iteration => 0.272566505268
Loss for  2939 th iteration => 0.272435088832
Loss for  2940 th iteration => 0.272309839632
Loss for  2941 th iteration => 0.272181163871
Loss for  2942 th iteration => 0.27204993525
Loss for  2943 th iteration => 0.271927137634
Loss for  2944 th iteration => 0.271800767413
Loss for  2945 th iteration => 0.271666043637
Loss for  2946 th iteration => 0.271543877006
Loss for  2947 th iteration => 0.271413099284
Loss for  2948 th iteration => 0.271283152036
Loss for  2949 th iteration => 0.271160435451
Loss for  2950 th iteration => 0.271029612344
Loss for  2951 th iteration => 0.270902831712
Loss for  2952 th iteration => 0.270777439876
Loss for  2953 th iteration => 0.270646858955
Loss for  2954 th iteration => 0.270523314459
Loss for  2955 th iteration => 0.270395177497
Loss for  2956 th iteration => 0.270264842511
Loss for  2957 th iteration => 0.270144582135
Loss for  2958 th iteration => 0.270013651699
Loss for  2959 th iteration => 0.269886583169
Loss for  2960 th iteration => 0.26976270074
Loss for  2961 th iteration => 0.269632550542
Loss for  2962 th iteration => 0.26950986855
Loss for  2963 th iteration => 0.269382102739
Loss for  2964 th iteration => 0.269253231618
Loss for  2965 th iteration => 0.269131877739
Loss for  2966 th iteration => 0.26900196644
Loss for  2967 th iteration => 0.268878283352
Loss for  2968 th iteration => 0.268752260776
Loss for  2969 th iteration => 0.268622846485
Loss for  2970 th iteration => 0.268502768865
Loss for  2971 th iteration => 0.268373140848
Loss for  2972 th iteration => 0.268249491145
Loss for  2973 th iteration => 0.268124183735
Loss for  2974 th iteration => 0.267995140558
Loss for  2975 th iteration => 0.267875435571
Loss for  2976 th iteration => 0.267746125809
Loss for  2977 th iteration => 0.267623256487
Loss for  2978 th iteration => 0.267497925514
Loss for  2979 th iteration => 0.26736991118
Loss for  2980 th iteration => 0.267249931601
Loss for  2981 th iteration => 0.26712096691
Loss for  2982 th iteration => 0.266999411782
Loss for  2983 th iteration => 0.266873531162
Loss for  2984 th iteration => 0.266747013426
Loss for  2985 th iteration => 0.266626300391
Loss for  2986 th iteration => 0.266497700915
Loss for  2987 th iteration => 0.266377836244
Loss for  2988 th iteration => 0.266251036009
Loss for  2989 th iteration => 0.266126342396
Loss for  2990 th iteration => 0.266004575264
Loss for  2991 th iteration => 0.265876355876
Loss for  2992 th iteration => 0.265758441834
Loss for  2993 th iteration => 0.265630466489
Loss for  2994 th iteration => 0.265507821305
Loss for  2995 th iteration => 0.265385221339
Loss for  2996 th iteration => 0.265260879988
Loss for  2997 th iteration => 0.265146418426
Loss for  2998 th iteration => 0.265011450709
Loss for  2999 th iteration => 0.264892312315
Loss for  3000 th iteration => 0.264766452147
Loss for  3001 th iteration => 0.264643471384
Loss for  3002 th iteration => 0.264521661927
Loss for  3003 th iteration => 0.264395202352
Loss for  3004 th iteration => 0.264277084953
Loss for  3005 th iteration => 0.264150145567
Loss for  3006 th iteration => 0.264034491286
Loss for  3007 th iteration => 0.263912968612
Loss for  3008 th iteration => 0.263784156668
Loss for  3009 th iteration => 0.263662030377
Loss for  3010 th iteration => 0.263537398915
Loss for  3011 th iteration => 0.263418380124
Loss for  3012 th iteration => 0.263291233426
Loss for  3013 th iteration => 0.263176027463
Loss for  3014 th iteration => 0.263051962116
Loss for  3015 th iteration => 0.262937429546
Loss for  3016 th iteration => 0.262805765658
Loss for  3017 th iteration => 0.262684384083
Loss for  3018 th iteration => 0.262563087204
Loss for  3019 th iteration => 0.262439420201
Loss for  3020 th iteration => 0.262320653403
Loss for  3021 th iteration => 0.262196059424
Loss for  3022 th iteration => 0.262080417015
Loss for  3023 th iteration => 0.261960804636
Loss for  3024 th iteration => 0.261836742036
Loss for  3025 th iteration => 0.261710711559
Loss for  3026 th iteration => 0.261592893259
Loss for  3027 th iteration => 0.26146991037
Loss for  3028 th iteration => 0.261351162493
Loss for  3029 th iteration => 0.261236341571
Loss for  3030 th iteration => 0.26110693329
Loss for  3031 th iteration => 0.260987765223
Loss for  3032 th iteration => 0.260864592666
Loss for  3033 th iteration => 0.260747862808
Loss for  3034 th iteration => 0.260623547767
Loss for  3035 th iteration => 0.260507487048
Loss for  3036 th iteration => 0.260382373042
Loss for  3037 th iteration => 0.260266876826
Loss for  3038 th iteration => 0.260141475091
Loss for  3039 th iteration => 0.260026610951
Loss for  3040 th iteration => 0.259900858272
Loss for  3041 th iteration => 0.259788247304
Loss for  3042 th iteration => 0.259662167553
Loss for  3043 th iteration => 0.259548834389
Loss for  3044 th iteration => 0.259422791888
Loss for  3045 th iteration => 0.259308831961
Loss for  3046 th iteration => 0.259183745119
Loss for  3047 th iteration => 0.259069123298
Loss for  3048 th iteration => 0.258945703869
Loss for  3049 th iteration => 0.258831880053
Loss for  3050 th iteration => 0.258715378992
Loss for  3051 th iteration => 0.25859132025
Loss for  3052 th iteration => 0.258469082852
Loss for  3053 th iteration => 0.258354223727
Loss for  3054 th iteration => 0.258232589823
Loss for  3055 th iteration => 0.258115995475
Loss for  3056 th iteration => 0.257994446956
Loss for  3057 th iteration => 0.257878379021
Loss for  3058 th iteration => 0.257756712055
Loss for  3059 th iteration => 0.257641372709
Loss for  3060 th iteration => 0.257522167462
Loss for  3061 th iteration => 0.257413294526
Loss for  3062 th iteration => 0.257283144035
Loss for  3063 th iteration => 0.257168190682
Loss for  3064 th iteration => 0.257048483046
Loss for  3065 th iteration => 0.256940918829
Loss for  3066 th iteration => 0.256810588932
Loss for  3067 th iteration => 0.256696735394
Loss for  3068 th iteration => 0.256576012065
Loss for  3069 th iteration => 0.256462466793
Loss for  3070 th iteration => 0.256339522192
Loss for  3071 th iteration => 0.256227290765
Loss for  3072 th iteration => 0.256103509575
Loss for  3073 th iteration => 0.255993109685
Loss for  3074 th iteration => 0.255869928457
Loss for  3075 th iteration => 0.255758152864
Loss for  3076 th iteration => 0.255636028197
Loss for  3077 th iteration => 0.255522109613
Loss for  3078 th iteration => 0.255402413836
Loss for  3079 th iteration => 0.255288755188
Loss for  3080 th iteration => 0.255170770192
Loss for  3081 th iteration => 0.255053189967
Loss for  3082 th iteration => 0.254937660223
Loss for  3083 th iteration => 0.25481821085
Loss for  3084 th iteration => 0.254707187277
Loss for  3085 th iteration => 0.254585860294
Loss for  3086 th iteration => 0.25447398149
Loss for  3087 th iteration => 0.254351336353
Loss for  3088 th iteration => 0.254240577661
Loss for  3089 th iteration => 0.254122039771
Loss for  3090 th iteration => 0.254008577812
Loss for  3091 th iteration => 0.25389005856
Loss for  3092 th iteration => 0.253774205257
Loss for  3093 th iteration => 0.253659356712
Loss for  3094 th iteration => 0.253542838468
Loss for  3095 th iteration => 0.253429941708
Loss for  3096 th iteration => 0.253309072129
Loss for  3097 th iteration => 0.25319922762
Loss for  3098 th iteration => 0.253078990246
Loss for  3099 th iteration => 0.252968504652
Loss for  3100 th iteration => 0.252849880793
Loss for  3101 th iteration => 0.252734946966
Loss for  3102 th iteration => 0.252620464208
Loss for  3103 th iteration => 0.252504960041
Loss for  3104 th iteration => 0.252401104459
Loss for  3105 th iteration => 0.25227167481
Loss for  3106 th iteration => 0.252164780236
Loss for  3107 th iteration => 0.252044191723
Loss for  3108 th iteration => 0.251931962735
Loss for  3109 th iteration => 0.251815621497
Loss for  3110 th iteration => 0.251702450949
Loss for  3111 th iteration => 0.251589584493
Loss for  3112 th iteration => 0.251470475643
Loss for  3113 th iteration => 0.251361500255
Loss for  3114 th iteration => 0.251243447838
Loss for  3115 th iteration => 0.251132836276
Loss for  3116 th iteration => 0.251016544195
Loss for  3117 th iteration => 0.250900983305
Loss for  3118 th iteration => 0.250791354979
Loss for  3119 th iteration => 0.250672722964
Loss for  3120 th iteration => 0.250564814181
Loss for  3121 th iteration => 0.250445469166
Loss for  3122 th iteration => 0.2503350643
Loss for  3123 th iteration => 0.250221705112
Loss for  3124 th iteration => 0.250105203001
Loss for  3125 th iteration => 0.249995551146
Loss for  3126 th iteration => 0.24987899599
Loss for  3127 th iteration => 0.249769590488
Loss for  3128 th iteration => 0.249653883597
Loss for  3129 th iteration => 0.249538885447
Loss for  3130 th iteration => 0.249431780798
Loss for  3131 th iteration => 0.249312950851
Loss for  3132 th iteration => 0.249203978133
Loss for  3133 th iteration => 0.249088963938
Loss for  3134 th iteration => 0.248977113863
Loss for  3135 th iteration => 0.248866430446
Loss for  3136 th iteration => 0.24874859163
Loss for  3137 th iteration => 0.248641730443
Loss for  3138 th iteration => 0.248527458637
Loss for  3139 th iteration => 0.248413172529
Loss for  3140 th iteration => 0.248303940939
Loss for  3141 th iteration => 0.248189274255
Loss for  3142 th iteration => 0.248079806593
Loss for  3143 th iteration => 0.247965915299
Loss for  3144 th iteration => 0.247852958157
Loss for  3145 th iteration => 0.247745747298
Loss for  3146 th iteration => 0.247628809829
Loss for  3147 th iteration => 0.247518761247
Loss for  3148 th iteration => 0.247409314956
Loss for  3149 th iteration => 0.247292989527
Loss for  3150 th iteration => 0.247186887519
Loss for  3151 th iteration => 0.247073165445
Loss for  3152 th iteration => 0.246961692813
Loss for  3153 th iteration => 0.246851517827
Loss for  3154 th iteration => 0.246736831292
Loss for  3155 th iteration => 0.246630506341
Loss for  3156 th iteration => 0.246517139082
Loss for  3157 th iteration => 0.246402985096
Loss for  3158 th iteration => 0.24629913432
Loss for  3159 th iteration => 0.246183343964
Loss for  3160 th iteration => 0.246071997251
Loss for  3161 th iteration => 0.245966385699
Loss for  3162 th iteration => 0.245850369322
Loss for  3163 th iteration => 0.245742156069
Loss for  3164 th iteration => 0.245633127458
Loss for  3165 th iteration => 0.24551876254
Loss for  3166 th iteration => 0.245413254153
Loss for  3167 th iteration => 0.245300352271
Loss for  3168 th iteration => 0.245189940665
Loss for  3169 th iteration => 0.245081853006
Loss for  3170 th iteration => 0.244968818878
Loss for  3171 th iteration => 0.244861345196
Loss for  3172 th iteration => 0.244751296609
Loss for  3173 th iteration => 0.24463841458
Loss for  3174 th iteration => 0.244533030634
Loss for  3175 th iteration => 0.244421537592
Loss for  3176 th iteration => 0.244309048758
Loss for  3177 th iteration => 0.244205046414
Loss for  3178 th iteration => 0.244092539305
Loss for  3179 th iteration => 0.243980904788
Loss for  3180 th iteration => 0.243877949269
Loss for  3181 th iteration => 0.243763983633
Loss for  3182 th iteration => 0.243654303511
Loss for  3183 th iteration => 0.243549882053
Loss for  3184 th iteration => 0.243436191292
Loss for  3185 th iteration => 0.243328282887
Loss for  3186 th iteration => 0.243222560291
Loss for  3187 th iteration => 0.24310913221
Loss for  3188 th iteration => 0.24300284756
Loss for  3189 th iteration => 0.242895957991
Loss for  3190 th iteration => 0.242782782855
Loss for  3191 th iteration => 0.242678002909
Loss for  3192 th iteration => 0.242570054654
Loss for  3193 th iteration => 0.242457222609
Loss for  3194 th iteration => 0.242353301391
Loss for  3195 th iteration => 0.242244940774
Loss for  3196 th iteration => 0.242133543587
Loss for  3197 th iteration => 0.24202837052
Loss for  3198 th iteration => 0.241920473622
Loss for  3199 th iteration => 0.241810572124
Loss for  3200 th iteration => 0.241703997593
Loss for  3201 th iteration => 0.241596647093
Loss for  3202 th iteration => 0.241488293361
Loss for  3203 th iteration => 0.241380193869
Loss for  3204 th iteration => 0.241274131625
Loss for  3205 th iteration => 0.241165652368
Loss for  3206 th iteration => 0.241056622918
Loss for  3207 th iteration => 0.240953322771
Loss for  3208 th iteration => 0.2408433631
Loss for  3209 th iteration => 0.240733698067
Loss for  3210 th iteration => 0.240633123242
Loss for  3211 th iteration => 0.240521684892
Loss for  3212 th iteration => 0.240412172249
Loss for  3213 th iteration => 0.240311990178
Loss for  3214 th iteration => 0.240200674857
Loss for  3215 th iteration => 0.240092250917
Loss for  3216 th iteration => 0.239990486512
Loss for  3217 th iteration => 0.239880659357
Loss for  3218 th iteration => 0.239774026049
Loss for  3219 th iteration => 0.239668335612
Loss for  3220 th iteration => 0.239563633256
Loss for  3221 th iteration => 0.239454596896
Loss for  3222 th iteration => 0.239347051335
Loss for  3223 th iteration => 0.239247040053
Loss for  3224 th iteration => 0.23913574171
Loss for  3225 th iteration => 0.239028856011
Loss for  3226 th iteration => 0.238927849121
Loss for  3227 th iteration => 0.238817512588
Loss for  3228 th iteration => 0.238712119742
Loss for  3229 th iteration => 0.238608437407
Loss for  3230 th iteration => 0.23850264178
Loss for  3231 th iteration => 0.238394681736
Loss for  3232 th iteration => 0.238288556943
Loss for  3233 th iteration => 0.238187759695
Loss for  3234 th iteration => 0.238077977392
Loss for  3235 th iteration => 0.237972243348
Loss for  3236 th iteration => 0.237871437522
Loss for  3237 th iteration => 0.237762858525
Loss for  3238 th iteration => 0.23765716651
Loss for  3239 th iteration => 0.237552583835
Loss for  3240 th iteration => 0.237450999012
Loss for  3241 th iteration => 0.237341885941
Loss for  3242 th iteration => 0.237236545866
Loss for  3243 th iteration => 0.237136285547
Loss for  3244 th iteration => 0.237027565078
Loss for  3245 th iteration => 0.236923179196
Loss for  3246 th iteration => 0.236819067478
Loss for  3247 th iteration => 0.236717541437
Loss for  3248 th iteration => 0.236609314192
Loss for  3249 th iteration => 0.236503911551
Loss for  3250 th iteration => 0.23640463118
Loss for  3251 th iteration => 0.236296600599
Loss for  3252 th iteration => 0.236192597556
Loss for  3253 th iteration => 0.236087879218
Loss for  3254 th iteration => 0.235988253054
Loss for  3255 th iteration => 0.235880119667
Loss for  3256 th iteration => 0.235774925535
Loss for  3257 th iteration => 0.235675013194
Loss for  3258 th iteration => 0.235570170203
Loss for  3259 th iteration => 0.235465453775
Loss for  3260 th iteration => 0.235358932094
Loss for  3261 th iteration => 0.23526217683
Loss for  3262 th iteration => 0.235154364649
Loss for  3263 th iteration => 0.235051316317
Loss for  3264 th iteration => 0.234947160381
Loss for  3265 th iteration => 0.234847512003
Loss for  3266 th iteration => 0.234741528685
Loss for  3267 th iteration => 0.23463625862
Loss for  3268 th iteration => 0.23453631419
Loss for  3269 th iteration => 0.234433234287
Loss for  3270 th iteration => 0.234329886586
Loss for  3271 th iteration => 0.234222933977
Loss for  3272 th iteration => 0.23412580068
Loss for  3273 th iteration => 0.234021059557
Loss for  3274 th iteration => 0.233919350029
Loss for  3275 th iteration => 0.233812525479
Loss for  3276 th iteration => 0.23371491454
Loss for  3277 th iteration => 0.233611250159
Loss for  3278 th iteration => 0.233508077075
Loss for  3279 th iteration => 0.233403365099
Loss for  3280 th iteration => 0.2333045297
Loss for  3281 th iteration => 0.233202560771
Loss for  3282 th iteration => 0.233098326062
Loss for  3283 th iteration => 0.232995287613
Loss for  3284 th iteration => 0.232895078118
Loss for  3285 th iteration => 0.232794955677
Loss for  3286 th iteration => 0.232689990508
Loss for  3287 th iteration => 0.232588267134
Loss for  3288 th iteration => 0.232486592188
Loss for  3289 th iteration => 0.2323884103
Loss for  3290 th iteration => 0.232282989003
Loss for  3291 th iteration => 0.232182286033
Loss for  3292 th iteration => 0.232079099561
Loss for  3293 th iteration => 0.231982907354
Loss for  3294 th iteration => 0.231877259932
Loss for  3295 th iteration => 0.23177774864
Loss for  3296 th iteration => 0.231673185814
Loss for  3297 th iteration => 0.231578819487
Loss for  3298 th iteration => 0.23147304722
Loss for  3299 th iteration => 0.231373032954
Loss for  3300 th iteration => 0.231268615492
Loss for  3301 th iteration => 0.231174706168
Loss for  3302 th iteration => 0.231069870775
Loss for  3303 th iteration => 0.2309695963
Loss for  3304 th iteration => 0.230865602399
Loss for  3305 th iteration => 0.230769742844
Loss for  3306 th iteration => 0.230667785117
Loss for  3307 th iteration => 0.230567827348
Loss for  3308 th iteration => 0.230464024043
Loss for  3309 th iteration => 0.230366371408
Loss for  3310 th iteration => 0.23026669948
Loss for  3311 th iteration => 0.230167145127
Loss for  3312 th iteration => 0.230063442801
Loss for  3313 th iteration => 0.22996417252
Loss for  3314 th iteration => 0.22986563473
Loss for  3315 th iteration => 0.229768025542
Loss for  3316 th iteration => 0.2296639168
Loss for  3317 th iteration => 0.229565518544
Loss for  3318 th iteration => 0.229463674969
Loss for  3319 th iteration => 0.229369866635
Loss for  3320 th iteration => 0.229265377769
Loss for  3321 th iteration => 0.229167836308
Loss for  3322 th iteration => 0.229063905917
Loss for  3323 th iteration => 0.228969935217
Loss for  3324 th iteration => 0.22886915614
Loss for  3325 th iteration => 0.228770115507
Loss for  3326 th iteration => 0.228668321957
Loss for  3327 th iteration => 0.228569079785
Loss for  3328 th iteration => 0.22847266716
Loss for  3329 th iteration => 0.228373808021
Loss for  3330 th iteration => 0.22827396843
Loss for  3331 th iteration => 0.228173205957
Loss for  3332 th iteration => 0.228074487014
Loss for  3333 th iteration => 0.22797849153
Loss for  3334 th iteration => 0.227880395135
Loss for  3335 th iteration => 0.227778331552
Loss for  3336 th iteration => 0.227681077966
Loss for  3337 th iteration => 0.227579783992
Loss for  3338 th iteration => 0.227488106148
Loss for  3339 th iteration => 0.227385288183
Loss for  3340 th iteration => 0.227287880504
Loss for  3341 th iteration => 0.227187035265
Loss for  3342 th iteration => 0.227090055629
Loss for  3343 th iteration => 0.226994848029
Loss for  3344 th iteration => 0.226895180009
Loss for  3345 th iteration => 0.226796841404
Loss for  3346 th iteration => 0.226696579637
Loss for  3347 th iteration => 0.226600202377
Loss for  3348 th iteration => 0.226503512381
Loss for  3349 th iteration => 0.226407752593
Loss for  3350 th iteration => 0.226306074854
Loss for  3351 th iteration => 0.226209570029
Loss for  3352 th iteration => 0.226109741069
Loss for  3353 th iteration => 0.226015979432
Loss for  3354 th iteration => 0.225919079232
Loss for  3355 th iteration => 0.22581945329
Loss for  3356 th iteration => 0.225722897152
Loss for  3357 th iteration => 0.225622296795
Loss for  3358 th iteration => 0.225529428401
Loss for  3359 th iteration => 0.225431873146
Loss for  3360 th iteration => 0.225335133287
Loss for  3361 th iteration => 0.225236518253
Loss for  3362 th iteration => 0.225138591629
Loss for  3363 th iteration => 0.225041846432
Loss for  3364 th iteration => 0.224947211569
Loss for  3365 th iteration => 0.224852140504
Loss for  3366 th iteration => 0.224751820513
Loss for  3367 th iteration => 0.224656405433
Loss for  3368 th iteration => 0.224557877228
Loss for  3369 th iteration => 0.22446259578
Loss for  3370 th iteration => 0.224369302541
Loss for  3371 th iteration => 0.224270015632
Loss for  3372 th iteration => 0.224175446999
Loss for  3373 th iteration => 0.224075885307
Loss for  3374 th iteration => 0.223980700367
Loss for  3375 th iteration => 0.2238855853
Loss for  3376 th iteration => 0.223790444151
Loss for  3377 th iteration => 0.223695156392
Loss for  3378 th iteration => 0.223595932327
Loss for  3379 th iteration => 0.223502006021
Loss for  3380 th iteration => 0.22340386803
Loss for  3381 th iteration => 0.223310141105
Loss for  3382 th iteration => 0.223216854019
Loss for  3383 th iteration => 0.223118299797
Loss for  3384 th iteration => 0.223024817854
Loss for  3385 th iteration => 0.222926232428
Loss for  3386 th iteration => 0.222831246305
Loss for  3387 th iteration => 0.222736872314
Loss for  3388 th iteration => 0.222642247769
Loss for  3389 th iteration => 0.222548751516
Loss for  3390 th iteration => 0.222450450274
Loss for  3391 th iteration => 0.222356037792
Loss for  3392 th iteration => 0.222260044192
Loss for  3393 th iteration => 0.222163982327
Loss for  3394 th iteration => 0.222074523146
Loss for  3395 th iteration => 0.22197647564
Loss for  3396 th iteration => 0.221882267333
Loss for  3397 th iteration => 0.221786597606
Loss for  3398 th iteration => 0.221690130355
Loss for  3399 th iteration => 0.221597183414
Loss for  3400 th iteration => 0.221502422561
Loss for  3401 th iteration => 0.221409899886
Loss for  3402 th iteration => 0.22131520149
Loss for  3403 th iteration => 0.22121832419
Loss for  3404 th iteration => 0.221126253505
Loss for  3405 th iteration => 0.221029341618
Loss for  3406 th iteration => 0.220935009837
Loss for  3407 th iteration => 0.220845452428
Loss for  3408 th iteration => 0.220748466169
Loss for  3409 th iteration => 0.220656023397
Loss for  3410 th iteration => 0.220560447369
Loss for  3411 th iteration => 0.220465397494
Loss for  3412 th iteration => 0.220372846901
Loss for  3413 th iteration => 0.22027698279
Loss for  3414 th iteration => 0.220187296743
Loss for  3415 th iteration => 0.220093435353
Loss for  3416 th iteration => 0.219997237082
Loss for  3417 th iteration => 0.219906243075
Loss for  3418 th iteration => 0.219810268645
Loss for  3419 th iteration => 0.219716190541
Loss for  3420 th iteration => 0.219623907736
Loss for  3421 th iteration => 0.219531126735
Loss for  3422 th iteration => 0.219439275992
Loss for  3423 th iteration => 0.219346149046
Loss for  3424 th iteration => 0.21925058585
Loss for  3425 th iteration => 0.219159232901
Loss for  3426 th iteration => 0.219064978752
Loss for  3427 th iteration => 0.21897043816
Loss for  3428 th iteration => 0.218880326604
Loss for  3429 th iteration => 0.218788481157
Loss for  3430 th iteration => 0.218694850525
Loss for  3431 th iteration => 0.218603193286
Loss for  3432 th iteration => 0.21850824421
Loss for  3433 th iteration => 0.21841622164
Loss for  3434 th iteration => 0.218323721998
Loss for  3435 th iteration => 0.218229232297
Loss for  3436 th iteration => 0.21813929294
Loss for  3437 th iteration => 0.218048943147
Loss for  3438 th iteration => 0.21795432334
Loss for  3439 th iteration => 0.217863851201
Loss for  3440 th iteration => 0.217770458696
Loss for  3441 th iteration => 0.217676625641
Loss for  3442 th iteration => 0.217586922397
Loss for  3443 th iteration => 0.217492950586
Loss for  3444 th iteration => 0.2174003447
Loss for  3445 th iteration => 0.217313587601
Loss for  3446 th iteration => 0.217219483505
Loss for  3447 th iteration => 0.217126977437
Loss for  3448 th iteration => 0.217036588088
Loss for  3449 th iteration => 0.216942942584
Loss for  3450 th iteration => 0.216851270065
Loss for  3451 th iteration => 0.21676070723
Loss for  3452 th iteration => 0.216667459406
Loss for  3453 th iteration => 0.216577401492
Loss for  3454 th iteration => 0.216489124426
Loss for  3455 th iteration => 0.216395730375
Loss for  3456 th iteration => 0.216304406759
Loss for  3457 th iteration => 0.216214065165
Loss for  3458 th iteration => 0.216121092169
Loss for  3459 th iteration => 0.216030200746
Loss for  3460 th iteration => 0.21594003863
Loss for  3461 th iteration => 0.215847434208
Loss for  3462 th iteration => 0.215757245903
Loss for  3463 th iteration => 0.215670119945
Loss for  3464 th iteration => 0.215577362618
Loss for  3465 th iteration => 0.215486193574
Loss for  3466 th iteration => 0.215396841011
Loss for  3467 th iteration => 0.215304482227
Loss for  3468 th iteration => 0.215213481949
Loss for  3469 th iteration => 0.215124543525
Loss for  3470 th iteration => 0.215032535694
Loss for  3471 th iteration => 0.214941240366
Loss for  3472 th iteration => 0.214854268789
Loss for  3473 th iteration => 0.214764339882
Loss for  3474 th iteration => 0.214672385556
Loss for  3475 th iteration => 0.214584291579
Loss for  3476 th iteration => 0.214493228741
Loss for  3477 th iteration => 0.21440162838
Loss for  3478 th iteration => 0.21431316899
Loss for  3479 th iteration => 0.214222999047
Loss for  3480 th iteration => 0.214131714849
Loss for  3481 th iteration => 0.214042500743
Loss for  3482 th iteration => 0.213953567755
Loss for  3483 th iteration => 0.213864202294
Loss for  3484 th iteration => 0.213774863009
Loss for  3485 th iteration => 0.213687390853
Loss for  3486 th iteration => 0.213596252462
Loss for  3487 th iteration => 0.213505322118
Loss for  3488 th iteration => 0.213418034771
Loss for  3489 th iteration => 0.213327905005
Loss for  3490 th iteration => 0.213237273228
Loss for  3491 th iteration => 0.213148959944
Loss for  3492 th iteration => 0.213060317361
Loss for  3493 th iteration => 0.212969958188
Loss for  3494 th iteration => 0.212880775664
Loss for  3495 th iteration => 0.21279615652
Loss for  3496 th iteration => 0.212705628082
Loss for  3497 th iteration => 0.212615304212
Loss for  3498 th iteration => 0.21252807463
Loss for  3499 th iteration => 0.212439031601
Loss for  3500 th iteration => 0.2123490004
Loss for  3501 th iteration => 0.212260617032
Loss for  3502 th iteration => 0.212173181629
Loss for  3503 th iteration => 0.212083418111
Loss for  3504 th iteration => 0.211993821085
Loss for  3505 th iteration => 0.211907311033
Loss for  3506 th iteration => 0.21181863491
Loss for  3507 th iteration => 0.21173006618
Loss for  3508 th iteration => 0.211643015963
Loss for  3509 th iteration => 0.211556676203
Loss for  3510 th iteration => 0.211467159621
Loss for  3511 th iteration => 0.211377825537
Loss for  3512 th iteration => 0.211291306181
Loss for  3513 th iteration => 0.211203285176
Loss for  3514 th iteration => 0.211114206031
Loss for  3515 th iteration => 0.211026004604
Loss for  3516 th iteration => 0.210940076674
Loss for  3517 th iteration => 0.210851235079
Loss for  3518 th iteration => 0.210762546632
Loss for  3519 th iteration => 0.210675603142
Loss for  3520 th iteration => 0.210588999544
Loss for  3521 th iteration => 0.210500877784
Loss for  3522 th iteration => 0.210414272636
Loss for  3523 th iteration => 0.210327987027
Loss for  3524 th iteration => 0.210240920312
Loss for  3525 th iteration => 0.210152450896
Loss for  3526 th iteration => 0.210064456923
Loss for  3527 th iteration => 0.209979512557
Loss for  3528 th iteration => 0.209891283166
Loss for  3529 th iteration => 0.209803207748
Loss for  3530 th iteration => 0.209716407091
Loss for  3531 th iteration => 0.209630854259
Loss for  3532 th iteration => 0.209542987056
Loss for  3533 th iteration => 0.209455259478
Loss for  3534 th iteration => 0.209369034232
Loss for  3535 th iteration => 0.209283440869
Loss for  3536 th iteration => 0.209195899128
Loss for  3537 th iteration => 0.209108486319
Loss for  3538 th iteration => 0.209023048418
Loss for  3539 th iteration => 0.208939225587
Loss for  3540 th iteration => 0.208851644796
Loss for  3541 th iteration => 0.208764220506
Loss for  3542 th iteration => 0.208678497198
Loss for  3543 th iteration => 0.208593146222
Loss for  3544 th iteration => 0.208505934176
Loss for  3545 th iteration => 0.208418863383
Loss for  3546 th iteration => 0.208333662166
Loss for  3547 th iteration => 0.20824832856
Loss for  3548 th iteration => 0.208161446
Loss for  3549 th iteration => 0.208074693358
Loss for  3550 th iteration => 0.20798950553
Loss for  3551 th iteration => 0.207904661705
Loss for  3552 th iteration => 0.207818080224
Loss for  3553 th iteration => 0.207731620129
Loss for  3554 th iteration => 0.207646042881
Loss for  3555 th iteration => 0.207562067125
Loss for  3556 th iteration => 0.207475765877
Loss for  3557 th iteration => 0.207389579515
Loss for  3558 th iteration => 0.207303502443
Loss for  3559 th iteration => 0.207220111237
Loss for  3560 th iteration => 0.207136301887
Loss for  3561 th iteration => 0.207050067552
Loss for  3562 th iteration => 0.206963970315
Loss for  3563 th iteration => 0.206879738561
Loss for  3564 th iteration => 0.206795276997
Loss for  3565 th iteration => 0.206709361556
Loss for  3566 th iteration => 0.206623572632
Loss for  3567 th iteration => 0.206539061832
Loss for  3568 th iteration => 0.206455370875
Loss for  3569 th iteration => 0.206369747967
Loss for  3570 th iteration => 0.206284243589
Loss for  3571 th iteration => 0.206199078095
Loss for  3572 th iteration => 0.206116510969
Loss for  3573 th iteration => 0.20603116119
Loss for  3574 th iteration => 0.205945923842
Loss for  3575 th iteration => 0.205860793485
Loss for  3576 th iteration => 0.20577715296
Loss for  3577 th iteration => 0.205693648591
Loss for  3578 th iteration => 0.205608657162
Loss for  3579 th iteration => 0.205523769252
Loss for  3580 th iteration => 0.205438980302
Loss for  3581 th iteration => 0.205356236819
Loss for  3582 th iteration => 0.20527239535
Loss for  3583 th iteration => 0.205187735219
Loss for  3584 th iteration => 0.205103172288
Loss for  3585 th iteration => 0.20501870261
Loss for  3586 th iteration => 0.204936375351
Loss for  3587 th iteration => 0.20485266088
Loss for  3588 th iteration => 0.204768313329
Loss for  3589 th iteration => 0.204684058406
Loss for  3590 th iteration => 0.204599892602
Loss for  3591 th iteration => 0.204517611341
Loss for  3592 th iteration => 0.204434383707
Loss for  3593 th iteration => 0.204350335466
Loss for  3594 th iteration => 0.204266376483
Loss for  3595 th iteration => 0.204182503567
Loss for  3596 th iteration => 0.204099981437
Loss for  3597 th iteration => 0.204017520744
Loss for  3598 th iteration => 0.203933762175
Loss for  3599 th iteration => 0.203850195788
Loss for  3600 th iteration => 0.203776543903
Loss for  3601 th iteration => 0.203688711191
Loss for  3602 th iteration => 0.20360941084
Loss for  3603 th iteration => 0.203527843384
Loss for  3604 th iteration => 0.203439520614
Loss for  3605 th iteration => 0.203365567311
Loss for  3606 th iteration => 0.203271503925
Loss for  3607 th iteration => 0.203190241636
Loss for  3608 th iteration => 0.203108095068
Loss for  3609 th iteration => 0.203039500493
Loss for  3610 th iteration => 0.202942195238
Loss for  3611 th iteration => 0.202859186546
Loss for  3612 th iteration => 0.202777046079
Loss for  3613 th iteration => 0.202695850207
Loss for  3614 th iteration => 0.202613394356
Loss for  3615 th iteration => 0.202548519789
Loss for  3616 th iteration => 0.202448294874
Loss for  3617 th iteration => 0.202365658438
Loss for  3618 th iteration => 0.202284408773
Loss for  3619 th iteration => 0.202223204152
Loss for  3620 th iteration => 0.202121294095
Loss for  3621 th iteration => 0.202038857806
Loss for  3622 th iteration => 0.201956843909
Loss for  3623 th iteration => 0.201896811751
Loss for  3624 th iteration => 0.201793465637
Loss for  3625 th iteration => 0.201737798335
Loss for  3626 th iteration => 0.201635013315
Loss for  3627 th iteration => 0.201570936983
Loss for  3628 th iteration => 0.201474842643
Loss for  3629 th iteration => 0.201405165938
Loss for  3630 th iteration => 0.20131515061
Loss for  3631 th iteration => 0.201242712897
Loss for  3632 th iteration => 0.201156187285
Loss for  3633 th iteration => 0.201077690602
Loss for  3634 th iteration => 0.200996080284
Loss for  3635 th iteration => 0.200913596192
Loss for  3636 th iteration => 0.200835838336
Loss for  3637 th iteration => 0.200751269878
Loss for  3638 th iteration => 0.200677185758
Loss for  3639 th iteration => 0.200589378634
Loss for  3640 th iteration => 0.200517294544
Loss for  3641 th iteration => 0.200426570137
Loss for  3642 th iteration => 0.200357292385
Loss for  3643 th iteration => 0.200264506012
Loss for  3644 th iteration => 0.200197525497
Loss for  3645 th iteration => 0.200105355666
Loss for  3646 th iteration => 0.20002485388
Loss for  3647 th iteration => 0.199944500695
Loss for  3648 th iteration => 0.199864285579
Loss for  3649 th iteration => 0.199785996893
Loss for  3650 th iteration => 0.199718062865
Loss for  3651 th iteration => 0.199627045297
Loss for  3652 th iteration => 0.199557306827
Loss for  3653 th iteration => 0.199469616549
Loss for  3654 th iteration => 0.199397425595
Loss for  3655 th iteration => 0.199311058381
Loss for  3656 th iteration => 0.199236393982
Loss for  3657 th iteration => 0.199152403416
Loss for  3658 th iteration => 0.199076059901
Loss for  3659 th iteration => 0.198993674438
Loss for  3660 th iteration => 0.19891685819
Loss for  3661 th iteration => 0.198836339411
Loss for  3662 th iteration => 0.198758562951
Loss for  3663 th iteration => 0.198678122365
Loss for  3664 th iteration => 0.198599043441
Loss for  3665 th iteration => 0.198519842404
Loss for  3666 th iteration => 0.198440127958
Loss for  3667 th iteration => 0.198361521625
Loss for  3668 th iteration => 0.19828176489
Loss for  3669 th iteration => 0.198203690847
Loss for  3670 th iteration => 0.19812565219
Loss for  3671 th iteration => 0.198046424418
Loss for  3672 th iteration => 0.197967299015
Loss for  3673 th iteration => 0.197888694413
Loss for  3674 th iteration => 0.197809499065
Loss for  3675 th iteration => 0.197730943584
Loss for  3676 th iteration => 0.197652248136
Loss for  3677 th iteration => 0.197573648651
Loss for  3678 th iteration => 0.197495154078
Loss for  3679 th iteration => 0.197417267734
Loss for  3680 th iteration => 0.197339915745
Loss for  3681 th iteration => 0.197261548734
Loss for  3682 th iteration => 0.197183282122
Loss for  3683 th iteration => 0.197105110405
Loss for  3684 th iteration => 0.19702702866
Loss for  3685 th iteration => 0.196949032481
Loss for  3686 th iteration => 0.196871117927
Loss for  3687 th iteration => 0.196793281467
Loss for  3688 th iteration => 0.196715519938
Loss for  3689 th iteration => 0.196637830509
Loss for  3690 th iteration => 0.196560210641
Loss for  3691 th iteration => 0.196482658056
Loss for  3692 th iteration => 0.196405758627
Loss for  3693 th iteration => 0.196330936302
Loss for  3694 th iteration => 0.196253191286
Loss for  3695 th iteration => 0.196175545454
Loss for  3696 th iteration => 0.19609799339
Loss for  3697 th iteration => 0.196020530247
Loss for  3698 th iteration => 0.195943151688
Loss for  3699 th iteration => 0.19586585383
Loss for  3700 th iteration => 0.195788633199
Loss for  3701 th iteration => 0.195711486678
Loss for  3702 th iteration => 0.19563441148
Loss for  3703 th iteration => 0.195557405101
Loss for  3704 th iteration => 0.1954804653
Loss for  3705 th iteration => 0.195403590064
Loss for  3706 th iteration => 0.195326975638
Loss for  3707 th iteration => 0.195250970602
Loss for  3708 th iteration => 0.195174182294
Loss for  3709 th iteration => 0.195097461922
Loss for  3710 th iteration => 0.195020807316
Loss for  3711 th iteration => 0.194944216527
Loss for  3712 th iteration => 0.194867687806
Loss for  3713 th iteration => 0.194791219582
Loss for  3714 th iteration => 0.194714810443
Loss for  3715 th iteration => 0.194638459118
Loss for  3716 th iteration => 0.194562164465
Loss for  3717 th iteration => 0.194485925453
Loss for  3718 th iteration => 0.194409741155
Loss for  3719 th iteration => 0.194333610732
Loss for  3720 th iteration => 0.19425753343
Loss for  3721 th iteration => 0.194181508564
Loss for  3722 th iteration => 0.194105535516
Loss for  3723 th iteration => 0.194029613727
Loss for  3724 th iteration => 0.19395374269
Loss for  3725 th iteration => 0.193877921942
Loss for  3726 th iteration => 0.193802151067
Loss for  3727 th iteration => 0.193726429682
Loss for  3728 th iteration => 0.193650757441
Loss for  3729 th iteration => 0.193575134027
Loss for  3730 th iteration => 0.19349955915
Loss for  3731 th iteration => 0.193424032544
Loss for  3732 th iteration => 0.193348553968
Loss for  3733 th iteration => 0.193273123197
Loss for  3734 th iteration => 0.193197740026
Loss for  3735 th iteration => 0.193122404265
Loss for  3736 th iteration => 0.19304711574
Loss for  3737 th iteration => 0.192971874289
Loss for  3738 th iteration => 0.19289667976
Loss for  3739 th iteration => 0.192821532015
Loss for  3740 th iteration => 0.192746430923
Loss for  3741 th iteration => 0.192671376363
Loss for  3742 th iteration => 0.192596368221
Loss for  3743 th iteration => 0.19252140639
Loss for  3744 th iteration => 0.192446490771
Loss for  3745 th iteration => 0.192371621268
Loss for  3746 th iteration => 0.192296797794
Loss for  3747 th iteration => 0.192222020262
Loss for  3748 th iteration => 0.192147288595
Loss for  3749 th iteration => 0.192072602714
Loss for  3750 th iteration => 0.191997962548
Loss for  3751 th iteration => 0.191923368028
Loss for  3752 th iteration => 0.191848819088
Loss for  3753 th iteration => 0.191774315663
Loss for  3754 th iteration => 0.191699857694
Loss for  3755 th iteration => 0.19162544512
Loss for  3756 th iteration => 0.191551077886
Loss for  3757 th iteration => 0.191476755937
Loss for  3758 th iteration => 0.191402479219
Loss for  3759 th iteration => 0.191328247682
Loss for  3760 th iteration => 0.191254061275
Loss for  3761 th iteration => 0.19117991995
Loss for  3762 th iteration => 0.191105823659
Loss for  3763 th iteration => 0.191031772356
Loss for  3764 th iteration => 0.190957765996
Loss for  3765 th iteration => 0.190883804534
Loss for  3766 th iteration => 0.190809887928
Loss for  3767 th iteration => 0.190736016135
Loss for  3768 th iteration => 0.190662189113
Loss for  3769 th iteration => 0.19058840682
Loss for  3770 th iteration => 0.190514669218
Loss for  3771 th iteration => 0.190440976265
Loss for  3772 th iteration => 0.190367327923
Loss for  3773 th iteration => 0.190293724154
Loss for  3774 th iteration => 0.190220164918
Loss for  3775 th iteration => 0.190146650178
Loss for  3776 th iteration => 0.190073179897
Loss for  3777 th iteration => 0.189999754038
Loss for  3778 th iteration => 0.189926372565
Loss for  3779 th iteration => 0.189853035441
Loss for  3780 th iteration => 0.18977974263
Loss for  3781 th iteration => 0.189706494097
Loss for  3782 th iteration => 0.189633289806
Loss for  3783 th iteration => 0.189560129723
Loss for  3784 th iteration => 0.189487013813
Loss for  3785 th iteration => 0.189413942041
Loss for  3786 th iteration => 0.189340914373
Loss for  3787 th iteration => 0.189267930774
Loss for  3788 th iteration => 0.189194991211
Loss for  3789 th iteration => 0.189122095651
Loss for  3790 th iteration => 0.189049244059
Loss for  3791 th iteration => 0.188976436402
Loss for  3792 th iteration => 0.188903672647
Loss for  3793 th iteration => 0.18883095276
Loss for  3794 th iteration => 0.18875827671
Loss for  3795 th iteration => 0.188685644462
Loss for  3796 th iteration => 0.188613055985
Loss for  3797 th iteration => 0.188540511245
Loss for  3798 th iteration => 0.18846801021
Loss for  3799 th iteration => 0.188395552848
Loss for  3800 th iteration => 0.188323139127
Loss for  3801 th iteration => 0.188250769013
Loss for  3802 th iteration => 0.188178442476
Loss for  3803 th iteration => 0.188106159483
Loss for  3804 th iteration => 0.188033920002
Loss for  3805 th iteration => 0.187961724002
Loss for  3806 th iteration => 0.18788957145
Loss for  3807 th iteration => 0.187817462315
Loss for  3808 th iteration => 0.187745396565
Loss for  3809 th iteration => 0.187673374169
Loss for  3810 th iteration => 0.187601395094
Loss for  3811 th iteration => 0.187529459311
Loss for  3812 th iteration => 0.187457566786
Loss for  3813 th iteration => 0.18738571749
Loss for  3814 th iteration => 0.18731391139
Loss for  3815 th iteration => 0.187242148455
Loss for  3816 th iteration => 0.187170428655
Loss for  3817 th iteration => 0.187098751957
Loss for  3818 th iteration => 0.187027118332
Loss for  3819 th iteration => 0.186955527747
Loss for  3820 th iteration => 0.186883980173
Loss for  3821 th iteration => 0.186812475577
Loss for  3822 th iteration => 0.186741013929
Loss for  3823 th iteration => 0.186669595199
Loss for  3824 th iteration => 0.186598219354
Loss for  3825 th iteration => 0.186526886365
Loss for  3826 th iteration => 0.186455596201
Loss for  3827 th iteration => 0.18638434883
Loss for  3828 th iteration => 0.186313144223
Loss for  3829 th iteration => 0.186241982349
Loss for  3830 th iteration => 0.186170863176
Loss for  3831 th iteration => 0.186099786675
Loss for  3832 th iteration => 0.186028752814
Loss for  3833 th iteration => 0.185957761564
Loss for  3834 th iteration => 0.185886812893
Loss for  3835 th iteration => 0.185815906771
Loss for  3836 th iteration => 0.185745043169
Loss for  3837 th iteration => 0.185674222054
Loss for  3838 th iteration => 0.185603443398
Loss for  3839 th iteration => 0.185532707169
Loss for  3840 th iteration => 0.185462252823
Loss for  3841 th iteration => 0.185391338741
Loss for  3842 th iteration => 0.185320728367
Loss for  3843 th iteration => 0.185250162163
Loss for  3844 th iteration => 0.185179639795
Loss for  3845 th iteration => 0.185109160967
Loss for  3846 th iteration => 0.185038725418
Loss for  3847 th iteration => 0.184968332916
Loss for  3848 th iteration => 0.184897983257
Loss for  3849 th iteration => 0.184827676263
Loss for  3850 th iteration => 0.184757411772
Loss for  3851 th iteration => 0.184687189645
Loss for  3852 th iteration => 0.184617009755
Loss for  3853 th iteration => 0.184546871992
Loss for  3854 th iteration => 0.184476776256
Loss for  3855 th iteration => 0.18440672246
Loss for  3856 th iteration => 0.184336710524
Loss for  3857 th iteration => 0.184266740379
Loss for  3858 th iteration => 0.184196811961
Loss for  3859 th iteration => 0.184126925212
Loss for  3860 th iteration => 0.184057080081
Loss for  3861 th iteration => 0.18398727652
Loss for  3862 th iteration => 0.183917514487
Loss for  3863 th iteration => 0.183847793941
Loss for  3864 th iteration => 0.183778114846
Loss for  3865 th iteration => 0.183708477168
Loss for  3866 th iteration => 0.183638880875
Loss for  3867 th iteration => 0.183569325936
Loss for  3868 th iteration => 0.183499812323
Loss for  3869 th iteration => 0.183430340009
Loss for  3870 th iteration => 0.183360908967
Loss for  3871 th iteration => 0.183291767588
Loss for  3872 th iteration => 0.18322224554
Loss for  3873 th iteration => 0.183152925894
Loss for  3874 th iteration => 0.183083650455
Loss for  3875 th iteration => 0.18301441877
Loss for  3876 th iteration => 0.182945230438
Loss for  3877 th iteration => 0.182876085105
Loss for  3878 th iteration => 0.18280698246
Loss for  3879 th iteration => 0.182737922227
Loss for  3880 th iteration => 0.182668904161
Loss for  3881 th iteration => 0.182599928049
Loss for  3882 th iteration => 0.1825309937
Loss for  3883 th iteration => 0.182462100945
Loss for  3884 th iteration => 0.182393249635
Loss for  3885 th iteration => 0.182324439639
Loss for  3886 th iteration => 0.182255670839
Loss for  3887 th iteration => 0.182186943132
Loss for  3888 th iteration => 0.182118256423
Loss for  3889 th iteration => 0.182049610632
Loss for  3890 th iteration => 0.181981005682
Loss for  3891 th iteration => 0.18191244151
Loss for  3892 th iteration => 0.181843918053
Loss for  3893 th iteration => 0.181775435259
Loss for  3894 th iteration => 0.181706993079
Loss for  3895 th iteration => 0.181638591467
Loss for  3896 th iteration => 0.181570230382
Loss for  3897 th iteration => 0.181501909788
Loss for  3898 th iteration => 0.181433669712
Loss for  3899 th iteration => 0.181365506761
Loss for  3900 th iteration => 0.181297290394
Loss for  3901 th iteration => 0.181229117906
Loss for  3902 th iteration => 0.18116098879
Loss for  3903 th iteration => 0.181092902596
Loss for  3904 th iteration => 0.181024858928
Loss for  3905 th iteration => 0.180956857437
Loss for  3906 th iteration => 0.180888897813
Loss for  3907 th iteration => 0.180820979783
Loss for  3908 th iteration => 0.180753103105
Loss for  3909 th iteration => 0.180685267567
Loss for  3910 th iteration => 0.180617472981
Loss for  3911 th iteration => 0.180549719178
Loss for  3912 th iteration => 0.180482006011
Loss for  3913 th iteration => 0.18041433335
Loss for  3914 th iteration => 0.180346701079
Loss for  3915 th iteration => 0.180279109093
Loss for  3916 th iteration => 0.1802115573
Loss for  3917 th iteration => 0.18014404562
Loss for  3918 th iteration => 0.180076573978
Loss for  3919 th iteration => 0.180009142308
Loss for  3920 th iteration => 0.179941750551
Loss for  3921 th iteration => 0.179874398654
Loss for  3922 th iteration => 0.179807086568
Loss for  3923 th iteration => 0.179739814248
Loss for  3924 th iteration => 0.179672583171
Loss for  3925 th iteration => 0.179605518754
Loss for  3926 th iteration => 0.179538346867
Loss for  3927 th iteration => 0.179471218254
Loss for  3928 th iteration => 0.179404132393
Loss for  3929 th iteration => 0.179337088822
Loss for  3930 th iteration => 0.179270087133
Loss for  3931 th iteration => 0.179203126966
Loss for  3932 th iteration => 0.179136208001
Loss for  3933 th iteration => 0.179069329959
Loss for  3934 th iteration => 0.179002492589
Loss for  3935 th iteration => 0.178935695673
Loss for  3936 th iteration => 0.178868939016
Loss for  3937 th iteration => 0.178802222445
Loss for  3938 th iteration => 0.178735545809
Loss for  3939 th iteration => 0.178668908972
Loss for  3940 th iteration => 0.178602311815
Loss for  3941 th iteration => 0.178535754232
Loss for  3942 th iteration => 0.178469236127
Loss for  3943 th iteration => 0.178402757416
Loss for  3944 th iteration => 0.178336318024
Loss for  3945 th iteration => 0.178269917882
Loss for  3946 th iteration => 0.178203556931
Loss for  3947 th iteration => 0.178137235114
Loss for  3948 th iteration => 0.178070952384
Loss for  3949 th iteration => 0.178004708693
Loss for  3950 th iteration => 0.177938666788
Loss for  3951 th iteration => 0.177872466456
Loss for  3952 th iteration => 0.177806508523
Loss for  3953 th iteration => 0.177740540312
Loss for  3954 th iteration => 0.177674441014
Loss for  3955 th iteration => 0.177608388437
Loss for  3956 th iteration => 0.177542381553
Loss for  3957 th iteration => 0.177476419453
Loss for  3958 th iteration => 0.177410501334
Loss for  3959 th iteration => 0.177344626487
Loss for  3960 th iteration => 0.177278794282
Loss for  3961 th iteration => 0.177213004164
Loss for  3962 th iteration => 0.177147255642
Loss for  3963 th iteration => 0.177081548284
Loss for  3964 th iteration => 0.177015881704
Loss for  3965 th iteration => 0.176950255565
Loss for  3966 th iteration => 0.176884669566
Loss for  3967 th iteration => 0.176819123444
Loss for  3968 th iteration => 0.17675377549
Loss for  3969 th iteration => 0.176688514449
Loss for  3970 th iteration => 0.176623486749
Loss for  3971 th iteration => 0.176558068726
Loss for  3972 th iteration => 0.176492633843
Loss for  3973 th iteration => 0.176427249431
Loss for  3974 th iteration => 0.176361913969
Loss for  3975 th iteration => 0.17629662611
Loss for  3976 th iteration => 0.17623138466
Loss for  3977 th iteration => 0.176166188566
Loss for  3978 th iteration => 0.176101036892
Loss for  3979 th iteration => 0.176035928813
Loss for  3980 th iteration => 0.175970863597
Loss for  3981 th iteration => 0.175905840598
Loss for  3982 th iteration => 0.175840859244
Loss for  3983 th iteration => 0.175776447246
Loss for  3984 th iteration => 0.175711555475
Loss for  3985 th iteration => 0.175647491574
Loss for  3986 th iteration => 0.175582307105
Loss for  3987 th iteration => 0.175517417105
Loss for  3988 th iteration => 0.175452580841
Loss for  3989 th iteration => 0.175387796371
Loss for  3990 th iteration => 0.175323061971
Loss for  3991 th iteration => 0.175258376117
Loss for  3992 th iteration => 0.175193737456
Loss for  3993 th iteration => 0.175129144794
Loss for  3994 th iteration => 0.175064597071
Loss for  3995 th iteration => 0.175000093351
Loss for  3996 th iteration => 0.174936422214
Loss for  3997 th iteration => 0.174872105661
Loss for  3998 th iteration => 0.174808116751
Loss for  3999 th iteration => 0.174743635804
Loss for  4000 th iteration => 0.174679213858
Loss for  4001 th iteration => 0.174614848328
Loss for  4002 th iteration => 0.174550536924
Loss for  4003 th iteration => 0.174486277619
Loss for  4004 th iteration => 0.174422068612
Loss for  4005 th iteration => 0.174357908313
Loss for  4006 th iteration => 0.174293795308
Loss for  4007 th iteration => 0.174230167201
Loss for  4008 th iteration => 0.174166583546
Loss for  4009 th iteration => 0.17410323311
Loss for  4010 th iteration => 0.174039127673
Loss for  4011 th iteration => 0.173975085509
Loss for  4012 th iteration => 0.173911103516
Loss for  4013 th iteration => 0.173847178945
Loss for  4014 th iteration => 0.173783309356
Loss for  4015 th iteration => 0.173719492589
Loss for  4016 th iteration => 0.173655726728
Loss for  4017 th iteration => 0.17359215151
Loss for  4018 th iteration => 0.173529312789
Loss for  4019 th iteration => 0.173466621718
Loss for  4020 th iteration => 0.173402695614
Loss for  4021 th iteration => 0.173338987947
Loss for  4022 th iteration => 0.173275344768
Loss for  4023 th iteration => 0.173211762806
Loss for  4024 th iteration => 0.173148239162
Loss for  4025 th iteration => 0.173084771265
Loss for  4026 th iteration => 0.173021356836
Loss for  4027 th iteration => 0.17295874499
Loss for  4028 th iteration => 0.172896045093
Loss for  4029 th iteration => 0.172833245707
Loss for  4030 th iteration => 0.172769821285
Loss for  4031 th iteration => 0.172706467021
Loss for  4032 th iteration => 0.172643178979
Loss for  4033 th iteration => 0.172579953664
Loss for  4034 th iteration => 0.172516787979
Loss for  4035 th iteration => 0.172453679177
Loss for  4036 th iteration => 0.172391440945
Loss for  4037 th iteration => 0.172329154228
Loss for  4038 th iteration => 0.172266650278
Loss for  4039 th iteration => 0.172203525302
Loss for  4040 th iteration => 0.172140473963
Loss for  4041 th iteration => 0.172077491909
Loss for  4042 th iteration => 0.172014575274
Loss for  4043 th iteration => 0.171951720633
Loss for  4044 th iteration => 0.171888953369
Loss for  4045 th iteration => 0.171827475365
Loss for  4046 th iteration => 0.171765902754
Loss for  4047 th iteration => 0.171702850517
Loss for  4048 th iteration => 0.171640045505
Loss for  4049 th iteration => 0.171577314951
Loss for  4050 th iteration => 0.171514654387
Loss for  4051 th iteration => 0.171452059847
Loss for  4052 th iteration => 0.171389527811
Loss for  4053 th iteration => 0.171328727831
Loss for  4054 th iteration => 0.171266888024
Loss for  4055 th iteration => 0.171204471392
Loss for  4056 th iteration => 0.17114191631
Loss for  4057 th iteration => 0.171079440216
Loss for  4058 th iteration => 0.17101703811
Loss for  4059 th iteration => 0.170954705554
Loss for  4060 th iteration => 0.170892584582
Loss for  4061 th iteration => 0.170831732159
Loss for  4062 th iteration => 0.170770520001
Loss for  4063 th iteration => 0.170708159558
Loss for  4064 th iteration => 0.170645884666
Loss for  4065 th iteration => 0.17058368957
Loss for  4066 th iteration => 0.170521569159
Loss for  4067 th iteration => 0.170459518899
Loss for  4068 th iteration => 0.170399085587
Loss for  4069 th iteration => 0.17033799006
Loss for  4070 th iteration => 0.17027619878
Loss for  4071 th iteration => 0.170214120598
Loss for  4072 th iteration => 0.17015212836
Loss for  4073 th iteration => 0.170090216247
Loss for  4074 th iteration => 0.170028379094
Loss for  4075 th iteration => 0.169967762286
Loss for  4076 th iteration => 0.169907155554
Loss for  4077 th iteration => 0.169845865948
Loss for  4078 th iteration => 0.169783999568
Loss for  4079 th iteration => 0.169722223368
Loss for  4080 th iteration => 0.169660531037
Loss for  4081 th iteration => 0.169598916974
Loss for  4082 th iteration => 0.169538751134
Loss for  4083 th iteration => 0.169478357979
Loss for  4084 th iteration => 0.169417181348
Loss for  4085 th iteration => 0.16935553791
Loss for  4086 th iteration => 0.16929398745
Loss for  4087 th iteration => 0.169232523328
Loss for  4088 th iteration => 0.169171139649
Loss for  4089 th iteration => 0.169111850854
Loss for  4090 th iteration => 0.169051487507
Loss for  4091 th iteration => 0.168990162538
Loss for  4092 th iteration => 0.16892874979
Loss for  4093 th iteration => 0.16886743178
Loss for  4094 th iteration => 0.168806201656
Loss for  4095 th iteration => 0.168745821506
Loss for  4096 th iteration => 0.168686254075
Loss for  4097 th iteration => 0.168625915559
Loss for  4098 th iteration => 0.168564657654
Loss for  4099 th iteration => 0.168503500783
Loss for  4100 th iteration => 0.168442437377
Loss for  4101 th iteration => 0.168381694754
Loss for  4102 th iteration => 0.168322608845
Loss for  4103 th iteration => 0.168262813465
Loss for  4104 th iteration => 0.168201724402
Loss for  4105 th iteration => 0.168140740948
Loss for  4106 th iteration => 0.168079855009
Loss for  4107 th iteration => 0.168019272605
Loss for  4108 th iteration => 0.167960444652
Loss for  4109 th iteration => 0.16790087533
Loss for  4110 th iteration => 0.167839965914
Loss for  4111 th iteration => 0.167779165335
Loss for  4112 th iteration => 0.167718465128
Loss for  4113 th iteration => 0.167658416261
Loss for  4114 th iteration => 0.167599687602
Loss for  4115 th iteration => 0.167540117863
Loss for  4116 th iteration => 0.167479396115
Loss for  4117 th iteration => 0.167418785413
Loss for  4118 th iteration => 0.167358277031
Loss for  4119 th iteration => 0.167299027124
Loss for  4120 th iteration => 0.16724028551
Loss for  4121 th iteration => 0.167180553901
Loss for  4122 th iteration => 0.167120025594
Loss for  4123 th iteration => 0.167059609779
Loss for  4124 th iteration => 0.166999297558
Loss for  4125 th iteration => 0.16694103426
Loss for  4126 th iteration => 0.166882200654
Loss for  4127 th iteration => 0.166822192377
Loss for  4128 th iteration => 0.166761861542
Loss for  4129 th iteration => 0.166701644082
Loss for  4130 th iteration => 0.166641837854
Loss for  4131 th iteration => 0.166583901891
Loss for  4132 th iteration => 0.166525077644
Loss for  4133 th iteration => 0.166464848646
Loss for  4134 th iteration => 0.166404740793
Loss for  4135 th iteration => 0.166344744203
Loss for  4136 th iteration => 0.166286823885
Loss for  4137 th iteration => 0.166228419348
Loss for  4138 th iteration => 0.166168779753
Loss for  4139 th iteration => 0.166108770723
Loss for  4140 th iteration => 0.166048880983
Loss for  4141 th iteration => 0.165989976617
Loss for  4142 th iteration => 0.165932232043
Loss for  4143 th iteration => 0.165873436922
Loss for  4144 th iteration => 0.165813541554
Loss for  4145 th iteration => 0.165753771733
Loss for  4146 th iteration => 0.165694402152
Loss for  4147 th iteration => 0.165637098404
Loss for  4148 th iteration => 0.165578836565
Loss for  4149 th iteration => 0.165519067167
Loss for  4150 th iteration => 0.165459428102
Loss for  4151 th iteration => 0.165399988138
Loss for  4152 th iteration => 0.165342959769
Loss for  4153 th iteration => 0.165284994974
Loss for  4154 th iteration => 0.165225361442
Loss for  4155 th iteration => 0.165165861835
Loss for  4156 th iteration => 0.165106651162
Loss for  4157 th iteration => 0.165049773414
Loss for  4158 th iteration => 0.164991926465
Loss for  4159 th iteration => 0.164932436573
Loss for  4160 th iteration => 0.164873083249
Loss for  4161 th iteration => 0.164814328675
Loss for  4162 th iteration => 0.164757507646
Loss for  4163 th iteration => 0.164699642743
Loss for  4164 th iteration => 0.164640302488
Loss for  4165 th iteration => 0.164581100696
Loss for  4166 th iteration => 0.164522973337
Loss for  4167 th iteration => 0.164466138534
Loss for  4168 th iteration => 0.164408152897
Loss for  4169 th iteration => 0.164348966828
Loss for  4170 th iteration => 0.164289920535
Loss for  4171 th iteration => 0.164232548985
Loss for  4172 th iteration => 0.164175647686
Loss for  4173 th iteration => 0.164117463652
Loss for  4174 th iteration => 0.164058435162
Loss for  4175 th iteration => 0.163999547311
Loss for  4176 th iteration => 0.163943027735
Loss for  4177 th iteration => 0.163886020708
Loss for  4178 th iteration => 0.163827579725
Loss for  4179 th iteration => 0.163768711295
Loss for  4180 th iteration => 0.163709984022
Loss for  4181 th iteration => 0.163654387874
Loss for  4182 th iteration => 0.163597246128
Loss for  4183 th iteration => 0.163538504182
Loss for  4184 th iteration => 0.163479797583
Loss for  4185 th iteration => 0.163421985014
Loss for  4186 th iteration => 0.163365925782
Loss for  4187 th iteration => 0.163308685289
Loss for  4188 th iteration => 0.163250015987
Loss for  4189 th iteration => 0.163191498587
Loss for  4190 th iteration => 0.163135402178
Loss for  4191 th iteration => 0.16307888466
Loss for  4192 th iteration => 0.16302087715
Loss for  4193 th iteration => 0.162962393933
Loss for  4194 th iteration => 0.162904580505
Loss for  4195 th iteration => 0.162848936225
Loss for  4196 th iteration => 0.162792130858
Loss for  4197 th iteration => 0.162733697297
Loss for  4198 th iteration => 0.162675422038
Loss for  4199 th iteration => 0.162619804812
Loss for  4200 th iteration => 0.162563508785
Loss for  4201 th iteration => 0.162505654543
Loss for  4202 th iteration => 0.162447422548
Loss for  4203 th iteration => 0.162390397343
Loss for  4204 th iteration => 0.162334846546
Loss for  4205 th iteration => 0.162278008921
Loss for  4206 th iteration => 0.162219834603
Loss for  4207 th iteration => 0.162161844381
Loss for  4208 th iteration => 0.162106855593
Loss for  4209 th iteration => 0.162050773212
Loss for  4210 th iteration => 0.161992669176
Loss for  4211 th iteration => 0.161934733511
Loss for  4212 th iteration => 0.161879663376
Loss for  4213 th iteration => 0.161823734098
Loss for  4214 th iteration => 0.161766180778
Loss for  4215 th iteration => 0.1617083032
Loss for  4216 th iteration => 0.16165228924
Loss for  4217 th iteration => 0.161596914382
Loss for  4218 th iteration => 0.161540100581
Loss for  4219 th iteration => 0.161482293493
Loss for  4220 th iteration => 0.161425687701
Loss for  4221 th iteration => 0.161370723025
Loss for  4222 th iteration => 0.161314441866
Loss for  4223 th iteration => 0.161256715867
Loss for  4224 th iteration => 0.161199802565
Loss for  4225 th iteration => 0.161145132425
Loss for  4226 th iteration => 0.161089217213
Loss for  4227 th iteration => 0.161031581212
Loss for  4228 th iteration => 0.160974589692
Loss for  4229 th iteration => 0.160920121425
Loss for  4230 th iteration => 0.16086443789
Loss for  4231 th iteration => 0.160806899286
Loss for  4232 th iteration => 0.160750014164
Loss for  4233 th iteration => 0.160695673589
Loss for  4234 th iteration => 0.160640113624
Loss for  4235 th iteration => 0.160582678495
Loss for  4236 th iteration => 0.160526048168
Loss for  4237 th iteration => 0.160471775976
Loss for  4238 th iteration => 0.16041625258
Loss for  4239 th iteration => 0.160358925878
Loss for  4240 th iteration => 0.160302669408
Loss for  4241 th iteration => 0.160248418257
Loss for  4242 th iteration => 0.160192861458
Loss for  4243 th iteration => 0.160135647183
Loss for  4244 th iteration => 0.160079859896
Loss for  4245 th iteration => 0.160025592079
Loss for  4246 th iteration => 0.159969945642
Loss for  4247 th iteration => 0.159912847001
Loss for  4248 th iteration => 0.159857605028
Loss for  4249 th iteration => 0.159803290597
Loss for  4250 th iteration => 0.159747509367
Loss for  4251 th iteration => 0.159690528909
Loss for  4252 th iteration => 0.159635892868
Loss for  4253 th iteration => 0.159581508123
Loss for  4254 th iteration => 0.159525555886
Loss for  4255 th iteration => 0.159468695617
Loss for  4256 th iteration => 0.159414713597
Loss for  4257 th iteration => 0.159360239867
Loss for  4258 th iteration => 0.159304087621
Loss for  4259 th iteration => 0.159247349103
Loss for  4260 th iteration => 0.159194059074
Loss for  4261 th iteration => 0.15913948174
Loss for  4262 th iteration => 0.159083106302
Loss for  4263 th iteration => 0.159026594324
Loss for  4264 th iteration => 0.158973395151
Loss for  4265 th iteration => 0.158919012014
Loss for  4266 th iteration => 0.158862346561
Loss for  4267 th iteration => 0.15880729414
Loss for  4268 th iteration => 0.158753688606
Loss for  4269 th iteration => 0.158698642606
Loss for  4270 th iteration => 0.158642129315
Loss for  4271 th iteration => 0.15858841664
Loss for  4272 th iteration => 0.15853445242
Loss for  4273 th iteration => 0.158478810056
Loss for  4274 th iteration => 0.158422443956
Loss for  4275 th iteration => 0.158369973614
Loss for  4276 th iteration => 0.158315689631
Loss for  4277 th iteration => 0.1582595048
Loss for  4278 th iteration => 0.158204176714
Loss for  4279 th iteration => 0.158151121629
Loss for  4280 th iteration => 0.158096711229
Loss for  4281 th iteration => 0.158040448467
Loss for  4282 th iteration => 0.157986949497
Loss for  4283 th iteration => 0.157933345757
Loss for  4284 th iteration => 0.157878058262
Loss for  4285 th iteration => 0.157822065572
Loss for  4286 th iteration => 0.15776955773
Loss for  4287 th iteration => 0.157716142471
Loss for  4288 th iteration => 0.157659798744
Loss for  4289 th iteration => 0.157605748821
Loss for  4290 th iteration => 0.157552648837
Loss for  4291 th iteration => 0.157497965019
Loss for  4292 th iteration => 0.15744201192
Loss for  4293 th iteration => 0.157389972684
Loss for  4294 th iteration => 0.157336255056
Loss for  4295 th iteration => 0.15728058811
Loss for  4296 th iteration => 0.15722619066
Loss for  4297 th iteration => 0.157173498929
Loss for  4298 th iteration => 0.15711931042
Loss for  4299 th iteration => 0.157063506129
Loss for  4300 th iteration => 0.157011455757
Loss for  4301 th iteration => 0.156958017473
Loss for  4302 th iteration => 0.156902654635
Loss for  4303 th iteration => 0.156848462061
Loss for  4304 th iteration => 0.156795962606
Loss for  4305 th iteration => 0.156741951158
Loss for  4306 th iteration => 0.156686316768
Loss for  4307 th iteration => 0.156634698493
Loss for  4308 th iteration => 0.156581366325
Loss for  4309 th iteration => 0.15652605112
Loss for  4310 th iteration => 0.15647244224
Loss for  4311 th iteration => 0.156419988454
Loss for  4312 th iteration => 0.156365935186
Loss for  4313 th iteration => 0.156310706469
Loss for  4314 th iteration => 0.156258981926
Loss for  4315 th iteration => 0.156205976682
Loss for  4316 th iteration => 0.156150526737
Loss for  4317 th iteration => 0.156098448585
Loss for  4318 th iteration => 0.156045673575
Loss for  4319 th iteration => 0.15599104075
Loss for  4320 th iteration => 0.15593733613
Loss for  4321 th iteration => 0.155885294192
Loss for  4322 th iteration => 0.155831711228
Loss for  4323 th iteration => 0.155776698884
Loss for  4324 th iteration => 0.155725263093
Loss for  4325 th iteration => 0.155672547082
Loss for  4326 th iteration => 0.15561733151
Loss for  4327 th iteration => 0.155565798418
Loss for  4328 th iteration => 0.155513189995
Loss for  4329 th iteration => 0.155458661418
Loss for  4330 th iteration => 0.155405720116
Loss for  4331 th iteration => 0.155353759024
Loss for  4332 th iteration => 0.155300152719
Loss for  4333 th iteration => 0.155246090071
Loss for  4334 th iteration => 0.155194662781
Loss for  4335 th iteration => 0.155141814624
Loss for  4336 th iteration => 0.155086876941
Loss for  4337 th iteration => 0.155036000407
Loss for  4338 th iteration => 0.154983892036
Loss for  4339 th iteration => 0.154928851586
Loss for  4340 th iteration => 0.154877571733
Loss for  4341 th iteration => 0.154825363482
Loss for  4342 th iteration => 0.154771206146
Loss for  4343 th iteration => 0.154718858496
Loss for  4344 th iteration => 0.154667164389
Loss for  4345 th iteration => 0.154613732072
Loss for  4346 th iteration => 0.154560554039
Loss for  4347 th iteration => 0.154509281042
Loss for  4348 th iteration => 0.154456438523
Loss for  4349 th iteration => 0.154402633748
Loss for  4350 th iteration => 0.154351702634
Loss for  4351 th iteration => 0.15429933427
Loss for  4352 th iteration => 0.154245077266
Loss for  4353 th iteration => 0.15419442058
Loss for  4354 th iteration => 0.154142427485
Loss for  4355 th iteration => 0.154087885301
Loss for  4356 th iteration => 0.154038003003
Loss for  4357 th iteration => 0.153985846064
Loss for  4358 th iteration => 0.153931540442
Loss for  4359 th iteration => 0.153880847243
Loss for  4360 th iteration => 0.153829105786
Loss for  4361 th iteration => 0.153775375322
Loss for  4362 th iteration => 0.1537240704
Loss for  4363 th iteration => 0.15367266698
Loss for  4364 th iteration => 0.153619398565
Loss for  4365 th iteration => 0.153567652637
Loss for  4366 th iteration => 0.153516521297
Loss for  4367 th iteration => 0.153463618192
Loss for  4368 th iteration => 0.153411577454
Loss for  4369 th iteration => 0.153360662038
Loss for  4370 th iteration => 0.153308041514
Loss for  4371 th iteration => 0.153255831061
Loss for  4372 th iteration => 0.153205083776
Loss for  4373 th iteration => 0.153152675097
Loss for  4374 th iteration => 0.153100401881
Loss for  4375 th iteration => 0.153049782073
Loss for  4376 th iteration => 0.152997524764
Loss for  4377 th iteration => 0.152945280153
Loss for  4378 th iteration => 0.152894753259
Loss for  4379 th iteration => 0.152842595626
Loss for  4380 th iteration => 0.152790457607
Loss for  4381 th iteration => 0.152739994269
Loss for  4382 th iteration => 0.152687892133
Loss for  4383 th iteration => 0.15263592721
Loss for  4384 th iteration => 0.152585502515
Loss for  4385 th iteration => 0.152533418125
Loss for  4386 th iteration => 0.152481682956
Loss for  4387 th iteration => 0.152431275784
Loss for  4388 th iteration => 0.152379176894
Loss for  4389 th iteration => 0.152327719694
Loss for  4390 th iteration => 0.15227731217
Loss for  4391 th iteration => 0.152225171239
Loss for  4392 th iteration => 0.152174032987
Loss for  4393 th iteration => 0.152123610004
Loss for  4394 th iteration => 0.152071403522
Loss for  4395 th iteration => 0.152020618997
Loss for  4396 th iteration => 0.151970167818
Loss for  4397 th iteration => 0.15191787572
Loss for  4398 th iteration => 0.151867474392
Loss for  4399 th iteration => 0.1518169843
Loss for  4400 th iteration => 0.151764589473
Loss for  4401 th iteration => 0.151714596264
Loss for  4402 th iteration => 0.151664058269
Loss for  4403 th iteration => 0.151611546123
Loss for  4404 th iteration => 0.151561982061
Loss for  4405 th iteration => 0.151511388653
Loss for  4406 th iteration => 0.151458746758
Loss for  4407 th iteration => 0.151409629537
Loss for  4408 th iteration => 0.151358974464
Loss for  4409 th iteration => 0.151306192239
Loss for  4410 th iteration => 0.151257536702
Loss for  4411 th iteration => 0.151206814791
Loss for  4412 th iteration => 0.151153994428
Loss for  4413 th iteration => 0.151105130871
Loss for  4414 th iteration => 0.151054974584
Loss for  4415 th iteration => 0.151002522828
Loss for  4416 th iteration => 0.15095334175
Loss for  4417 th iteration => 0.150902636984
Loss for  4418 th iteration => 0.150851419062
Loss for  4419 th iteration => 0.150801922038
Loss for  4420 th iteration => 0.150750698956
Loss for  4421 th iteration => 0.150700526789
Loss for  4422 th iteration => 0.150650745771
Loss for  4423 th iteration => 0.150599049874
Loss for  4424 th iteration => 0.15054985058
Loss for  4425 th iteration => 0.150499813106
Loss for  4426 th iteration => 0.1504476834
Loss for  4427 th iteration => 0.150399394363
Loss for  4428 th iteration => 0.150349124186
Loss for  4429 th iteration => 0.150297007067
Loss for  4430 th iteration => 0.150248371781
Loss for  4431 th iteration => 0.150198300686
Loss for  4432 th iteration => 0.150147237798
Loss for  4433 th iteration => 0.150098144101
Loss for  4434 th iteration => 0.150047337603
Loss for  4435 th iteration => 0.149997654824
Loss for  4436 th iteration => 0.149948155766
Loss for  4437 th iteration => 0.149896690729
Loss for  4438 th iteration => 0.149848265349
Loss for  4439 th iteration => 0.149798406556
Loss for  4440 th iteration => 0.149746636589
Loss for  4441 th iteration => 0.14969832501
Loss for  4442 th iteration => 0.149648952258
Loss for  4443 th iteration => 0.149597748784
Loss for  4444 th iteration => 0.149549005157
Loss for  4445 th iteration => 0.149498534349
Loss for  4446 th iteration => 0.149449181031
Loss for  4447 th iteration => 0.149399936887
Loss for  4448 th iteration => 0.149348661633
Loss for  4449 th iteration => 0.149300790592
Loss for  4450 th iteration => 0.149251105647
Loss for  4451 th iteration => 0.149199864939
Loss for  4452 th iteration => 0.149151666123
Loss for  4453 th iteration => 0.149101913546
Loss for  4454 th iteration => 0.14905207491
Loss for  4455 th iteration => 0.149003263487
Loss for  4456 th iteration => 0.14895253619
Loss for  4457 th iteration => 0.148904445827
Loss for  4458 th iteration => 0.148855097819
Loss for  4459 th iteration => 0.148804064653
Loss for  4460 th iteration => 0.148756136965
Loss for  4461 th iteration => 0.148706690276
Loss for  4462 th iteration => 0.148657000633
Loss for  4463 th iteration => 0.148608388579
Loss for  4464 th iteration => 0.148557856382
Loss for  4465 th iteration => 0.148510087474
Loss for  4466 th iteration => 0.148460877604
Loss for  4467 th iteration => 0.148410212035
Loss for  4468 th iteration => 0.148362377025
Loss for  4469 th iteration => 0.148312962788
Loss for  4470 th iteration => 0.148263840583
Loss for  4471 th iteration => 0.148215276102
Loss for  4472 th iteration => 0.148164705102
Loss for  4473 th iteration => 0.148117613528
Loss for  4474 th iteration => 0.148068412844
Loss for  4475 th iteration => 0.148018218037
Loss for  4476 th iteration => 0.147970360234
Loss for  4477 th iteration => 0.147920797359
Loss for  4478 th iteration => 0.147872516701
Loss for  4479 th iteration => 0.147823901945
Loss for  4480 th iteration => 0.147773512877
Loss for  4481 th iteration => 0.147726164147
Loss for  4482 th iteration => 0.147677315833
Loss for  4483 th iteration => 0.147628301308
Loss for  4484 th iteration => 0.147580103471
Loss for  4485 th iteration => 0.147529912946
Loss for  4486 th iteration => 0.147483220079
Loss for  4487 th iteration => 0.147434283807
Loss for  4488 th iteration => 0.147384636531
Loss for  4489 th iteration => 0.14733696787
Loss for  4490 th iteration => 0.147287514262
Loss for  4491 th iteration => 0.147240045305
Loss for  4492 th iteration => 0.147191544053
Loss for  4493 th iteration => 0.147141823072
Loss for  4494 th iteration => 0.147094528252
Loss for  4495 th iteration => 0.147045579172
Loss for  4496 th iteration => 0.146997695105
Loss for  4497 th iteration => 0.146949496046
Loss for  4498 th iteration => 0.146899810488
Loss for  4499 th iteration => 0.146852769933
Loss for  4500 th iteration => 0.146804144989
Loss for  4501 th iteration => 0.146756126066
Loss for  4502 th iteration => 0.146708127018
Loss for  4503 th iteration => 0.146658561125
Loss for  4504 th iteration => 0.146611682908
Loss for  4505 th iteration => 0.146563242112
Loss for  4506 th iteration => 0.146515305458
Loss for  4507 th iteration => 0.146467427954
Loss for  4508 th iteration => 0.146418046464
Loss for  4509 th iteration => 0.146371260058
Loss for  4510 th iteration => 0.146322894807
Loss for  4511 th iteration => 0.146275208271
Loss for  4512 th iteration => 0.146227392196
Loss for  4513 th iteration => 0.146178244607
Loss for  4514 th iteration => 0.146131496051
Loss for  4515 th iteration => 0.146083122125
Loss for  4516 th iteration => 0.146035815134
Loss for  4517 th iteration => 0.145988014587
Loss for  4518 th iteration => 0.145939138503
Loss for  4519 th iteration => 0.145892386665
Loss for  4520 th iteration => 0.145843938786
Loss for  4521 th iteration => 0.145797110842
Loss for  4522 th iteration => 0.145749290933
Loss for  4523 th iteration => 0.145700714691
Loss for  4524 th iteration => 0.14565392838
Loss for  4525 th iteration => 0.145605355982
Loss for  4526 th iteration => 0.145559083298
Loss for  4527 th iteration => 0.145511217665
Loss for  4528 th iteration => 0.145462962393
Loss for  4529 th iteration => 0.145416118116
Loss for  4530 th iteration => 0.145367382066
Loss for  4531 th iteration => 0.14532172274
Loss for  4532 th iteration => 0.14527379163
Loss for  4533 th iteration => 0.145225872852
Loss for  4534 th iteration => 0.145178953082
Loss for  4535 th iteration => 0.145130275865
Loss for  4536 th iteration => 0.145084295831
Loss for  4537 th iteration => 0.145037034469
Loss for  4538 th iteration => 0.144989573828
Loss for  4539 th iteration => 0.144942448218
Loss for  4540 th iteration => 0.14489422364
Loss for  4541 th iteration => 0.14484801018
Loss for  4542 th iteration => 0.144800029986
Loss for  4543 th iteration => 0.144753999844
Loss for  4544 th iteration => 0.144706576036
Loss for  4545 th iteration => 0.144658885603
Loss for  4546 th iteration => 0.144612386994
Loss for  4547 th iteration => 0.14456400564
Loss for  4548 th iteration => 0.144518434928
Loss for  4549 th iteration => 0.14447136066
Loss for  4550 th iteration => 0.144424279036
Loss for  4551 th iteration => 0.144377414759
Loss for  4552 th iteration => 0.144329622455
Loss for  4553 th iteration => 0.144283609002
Loss for  4554 th iteration => 0.144235721716
Loss for  4555 th iteration => 0.144190353755
Loss for  4556 th iteration => 0.144143069292
Loss for  4557 th iteration => 0.144095914602
Loss for  4558 th iteration => 0.144049506373
Loss for  4559 th iteration => 0.14400169904
Loss for  4560 th iteration => 0.143956075864
Loss for  4561 th iteration => 0.143908693487
Loss for  4562 th iteration => 0.143863009351
Loss for  4563 th iteration => 0.14381604008
Loss for  4564 th iteration => 0.143768992707
Loss for  4565 th iteration => 0.143722847735
Loss for  4566 th iteration => 0.143675189166
Loss for  4567 th iteration => 0.143629782568
Loss for  4568 th iteration => 0.143582638534
Loss for  4569 th iteration => 0.143537056538
Loss for  4570 th iteration => 0.143490253849
Loss for  4571 th iteration => 0.143443439413
Loss for  4572 th iteration => 0.14339742393
Loss for  4573 th iteration => 0.14335002791
Loss for  4574 th iteration => 0.143304717527
Loss for  4575 th iteration => 0.143257624091
Loss for  4576 th iteration => 0.14321243732
Loss for  4577 th iteration => 0.143165698447
Loss for  4578 th iteration => 0.143119203593
Loss for  4579 th iteration => 0.143073225262
Loss for  4580 th iteration => 0.143026170027
Loss for  4581 th iteration => 0.142980872943
Loss for  4582 th iteration => 0.142933697964
Loss for  4583 th iteration => 0.14288911075
Loss for  4584 th iteration => 0.142842365127
Loss for  4585 th iteration => 0.142796248855
Loss for  4586 th iteration => 0.142750244443
Loss for  4587 th iteration => 0.142703583034
Loss for  4588 th iteration => 0.142658242641
Loss for  4589 th iteration => 0.142611099952
Loss for  4590 th iteration => 0.142566364183
Loss for  4591 th iteration => 0.142520255965
Loss for  4592 th iteration => 0.142474664286
Loss for  4593 th iteration => 0.142428480436
Loss for  4594 th iteration => 0.142382343257
Loss for  4595 th iteration => 0.142336823349
Loss for  4596 th iteration => 0.142290204291
Loss for  4597 th iteration => 0.142245278809
Loss for  4598 th iteration => 0.142198400071
Loss for  4599 th iteration => 0.142154371643
Loss for  4600 th iteration => 0.142107908801
Loss for  4601 th iteration => 0.142062384019
Loss for  4602 th iteration => 0.142016592948
Loss for  4603 th iteration => 0.141970579085
Loss for  4604 th iteration => 0.141925389766
Loss for  4605 th iteration => 0.141878945326
Loss for  4606 th iteration => 0.141834294465
Loss for  4607 th iteration => 0.141787709424
Loss for  4608 th iteration => 0.141743798733
Loss for  4609 th iteration => 0.141697550932
Loss for  4610 th iteration => 0.141652304461
Loss for  4611 th iteration => 0.141606683773
Loss for  4612 th iteration => 0.141560984722
Loss for  4613 th iteration => 0.141515925679
Loss for  4614 th iteration => 0.141469829108
Loss for  4615 th iteration => 0.141425272519
Loss for  4616 th iteration => 0.141378828702
Loss for  4617 th iteration => 0.141334740891
Loss for  4618 th iteration => 0.141289164736
Loss for  4619 th iteration => 0.141244432866
Loss for  4620 th iteration => 0.141198736701
Loss for  4621 th iteration => 0.141153564587
Loss for  4622 th iteration => 0.141108416171
Loss for  4623 th iteration => 0.141062857498
Loss for  4624 th iteration => 0.141018199239
Loss for  4625 th iteration => 0.14097230302
Loss for  4626 th iteration => 0.140928082835
Loss for  4627 th iteration => 0.140881893777
Loss for  4628 th iteration => 0.140838097925
Loss for  4629 th iteration => 0.140792723989
Loss for  4630 th iteration => 0.140748312257
Loss for  4631 th iteration => 0.140702834372
Loss for  4632 th iteration => 0.140658025051
Loss for  4633 th iteration => 0.14061304847
Loss for  4634 th iteration => 0.140567891208
Loss for  4635 th iteration => 0.140523363069
Loss for  4636 th iteration => 0.140477903164
Loss for  4637 th iteration => 0.140433775653
Loss for  4638 th iteration => 0.140388054404
Loss for  4639 th iteration => 0.140344284268
Loss for  4640 th iteration => 0.140298499385
Loss for  4641 th iteration => 0.14025535263
Loss for  4642 th iteration => 0.140209797669
Loss for  4643 th iteration => 0.140165611525
Loss for  4644 th iteration => 0.140120538501
Loss for  4645 th iteration => 0.140076020008
Loss for  4646 th iteration => 0.140031378225
Loss for  4647 th iteration => 0.139986570952
Loss for  4648 th iteration => 0.139942314545
Loss for  4649 th iteration => 0.13989725821
Loss for  4650 th iteration => 0.139853345687
Loss for  4651 th iteration => 0.139808076465
Loss for  4652 th iteration => 0.139764470287
Loss for  4653 th iteration => 0.139719021109
Loss for  4654 th iteration => 0.139675687314
Loss for  4655 th iteration => 0.139630368338
Loss for  4656 th iteration => 0.139587403287
Loss for  4657 th iteration => 0.139542195086
Loss for  4658 th iteration => 0.139498444376
Loss for  4659 th iteration => 0.139453649912
Loss for  4660 th iteration => 0.139409624421
Loss for  4661 th iteration => 0.139365199856
Loss for  4662 th iteration => 0.139320937647
Loss for  4663 th iteration => 0.139276843315
Loss for  4664 th iteration => 0.139232379056
Loss for  4665 th iteration => 0.139188579064
Loss for  4666 th iteration => 0.13914394431
Loss for  4667 th iteration => 0.139100406179
Loss for  4668 th iteration => 0.139055629632
Loss for  4669 th iteration => 0.139012323974
Loss for  4670 th iteration => 0.138967431726
Loss for  4671 th iteration => 0.138924331952
Loss for  4672 th iteration => 0.138879347707
Loss for  4673 th iteration => 0.13883642976
Loss for  4674 th iteration => 0.13879148879
Loss for  4675 th iteration => 0.138749071761
Loss for  4676 th iteration => 0.138704187995
Loss for  4677 th iteration => 0.138661074852
Loss for  4678 th iteration => 0.138616531733
Loss for  4679 th iteration => 0.138573206026
Loss for  4680 th iteration => 0.138528967098
Loss for  4681 th iteration => 0.138485460832
Loss for  4682 th iteration => 0.1384414931
Loss for  4683 th iteration => 0.138397835399
Loss for  4684 th iteration => 0.138354109001
Loss for  4685 th iteration => 0.138310326352
Loss for  4686 th iteration => 0.138266814261
Loss for  4687 th iteration => 0.138222930736
Loss for  4688 th iteration => 0.138179608495
Loss for  4689 th iteration => 0.138135645959
Loss for  4690 th iteration => 0.138092491436
Loss for  4691 th iteration => 0.138048469741
Loss for  4692 th iteration => 0.138005462913
Loss for  4693 th iteration => 0.137961400069
Loss for  4694 th iteration => 0.137918522823
Loss for  4695 th iteration => 0.137874435164
Loss for  4696 th iteration => 0.137831671114
Loss for  4697 th iteration => 0.137787573444
Loss for  4698 th iteration => 0.137744907775
Loss for  4699 th iteration => 0.137700813501
Loss for  4700 th iteration => 0.137658232818
Loss for  4701 th iteration => 0.13761415408
Loss for  4702 th iteration => 0.137571646276
Loss for  4703 th iteration => 0.137527594054
Loss for  4704 th iteration => 0.13748514819
Loss for  4705 th iteration => 0.137441132412
Loss for  4706 th iteration => 0.137398738607
Loss for  4707 th iteration => 0.137354768244
Loss for  4708 th iteration => 0.137312417577
Loss for  4709 th iteration => 0.137268500727
Loss for  4710 th iteration => 0.137226185147
Loss for  4711 th iteration => 0.137182329114
Loss for  4712 th iteration => 0.137140041359
Loss for  4713 th iteration => 0.137096252728
Loss for  4714 th iteration => 0.137053986251
Loss for  4715 th iteration => 0.137010270949
Loss for  4716 th iteration => 0.136968019853
Loss for  4717 th iteration => 0.136924383214
Loss for  4718 th iteration => 0.136882142188
Loss for  4719 th iteration => 0.136838589003
Loss for  4720 th iteration => 0.136796353271
Loss for  4721 th iteration => 0.136752887838
Loss for  4722 th iteration => 0.136710653108
Loss for  4723 th iteration => 0.136667279281
Loss for  4724 th iteration => 0.136625041698
Loss for  4725 th iteration => 0.136581762923
Loss for  4726 th iteration => 0.13653951903
Loss for  4727 th iteration => 0.136496338386
Loss for  4728 th iteration => 0.136454085088
Loss for  4729 th iteration => 0.136411005318
Loss for  4730 th iteration => 0.136368739847
Loss for  4731 th iteration => 0.13632576339
Loss for  4732 th iteration => 0.136283483274
Loss for  4733 th iteration => 0.136240612295
Loss for  4734 th iteration => 0.13619831533
Loss for  4735 th iteration => 0.136155551744
Loss for  4736 th iteration => 0.136113235971
Loss for  4737 th iteration => 0.136070581463
Loss for  4738 th iteration => 0.136028245144
Loss for  4739 th iteration => 0.135985701195
Loss for  4740 th iteration => 0.135943342793
Loss for  4741 th iteration => 0.135900910697
Loss for  4742 th iteration => 0.135858528856
Loss for  4743 th iteration => 0.135816209738
Loss for  4744 th iteration => 0.135773803265
Loss for  4745 th iteration => 0.135731598095
Loss for  4746 th iteration => 0.13568916595
Loss for  4747 th iteration => 0.13564707556
Loss for  4748 th iteration => 0.135604616836
Loss for  4749 th iteration => 0.13556264193
Loss for  4750 th iteration => 0.135520155842
Loss for  4751 th iteration => 0.135478297011
Loss for  4752 th iteration => 0.135435782886
Loss for  4753 th iteration => 0.135394040617
Loss for  4754 th iteration => 0.135351497883
Loss for  4755 th iteration => 0.135309872569
Loss for  4756 th iteration => 0.135267300744
Loss for  4757 th iteration => 0.135225792692
Loss for  4758 th iteration => 0.135183191377
Loss for  4759 th iteration => 0.135141800818
Loss for  4760 th iteration => 0.13509916969
Loss for  4761 th iteration => 0.135057896784
Loss for  4762 th iteration => 0.135015235586
Loss for  4763 th iteration => 0.134974080431
Loss for  4764 th iteration => 0.134931388967
Loss for  4765 th iteration => 0.134890351605
Loss for  4766 th iteration => 0.134847629734
Loss for  4767 th iteration => 0.134806710152
Loss for  4768 th iteration => 0.134763957785
Loss for  4769 th iteration => 0.134723155927
Loss for  4770 th iteration => 0.134680373018
Loss for  4771 th iteration => 0.134639688784
Loss for  4772 th iteration => 0.134596875329
Loss for  4773 th iteration => 0.134556308582
Loss for  4774 th iteration => 0.134513469893
Loss for  4775 th iteration => 0.134472750137
Loss for  4776 th iteration => 0.134430652388
Loss for  4777 th iteration => 0.134389400681
Loss for  4778 th iteration => 0.134347209264
Loss for  4779 th iteration => 0.134306121314
Loss for  4780 th iteration => 0.134264228542
Loss for  4781 th iteration => 0.13422293774
Loss for  4782 th iteration => 0.134181326521
Loss for  4783 th iteration => 0.134139848769
Loss for  4784 th iteration => 0.134098503566
Loss for  4785 th iteration => 0.134056853353
Loss for  4786 th iteration => 0.134015760031
Loss for  4787 th iteration => 0.133973950562
Loss for  4788 th iteration => 0.133933096257
Loss for  4789 th iteration => 0.13389113957
Loss for  4790 th iteration => 0.133850512567
Loss for  4791 th iteration => 0.133808419639
Loss for  4792 th iteration => 0.133768009265
Loss for  4793 th iteration => 0.133725790106
Loss for  4794 th iteration => 0.133685586634
Loss for  4795 th iteration => 0.133643250376
Loss for  4796 th iteration => 0.133603244932
Loss for  4797 th iteration => 0.133561147074
Loss for  4798 th iteration => 0.133520618351
Loss for  4799 th iteration => 0.133478879387
Loss for  4800 th iteration => 0.133438196678
Loss for  4801 th iteration => 0.133396828765
Loss for  4802 th iteration => 0.133355875275
Loss for  4803 th iteration => 0.133314854068
Loss for  4804 th iteration => 0.133273652183
Loss for  4805 th iteration => 0.133232955584
Loss for  4806 th iteration => 0.133191525697
Loss for  4807 th iteration => 0.13315113364
Loss for  4808 th iteration => 0.133109494326
Loss for  4809 th iteration => 0.133069388581
Loss for  4810 th iteration => 0.133027556765
Loss for  4811 th iteration => 0.132987720762
Loss for  4812 th iteration => 0.132945769336
Loss for  4813 th iteration => 0.132905895444
Loss for  4814 th iteration => 0.13286442784
Loss for  4815 th iteration => 0.13282407244
Loss for  4816 th iteration => 0.132783031114
Loss for  4817 th iteration => 0.132742353875
Loss for  4818 th iteration => 0.13270170895
Loss for  4819 th iteration => 0.132660737082
Loss for  4820 th iteration => 0.132620461468
Loss for  4821 th iteration => 0.132579219757
Loss for  4822 th iteration => 0.132539288877
Loss for  4823 th iteration => 0.132497799899
Loss for  4824 th iteration => 0.132458191447
Loss for  4825 th iteration => 0.132416495801
Loss for  4826 th iteration => 0.132376964768
Loss for  4827 th iteration => 0.132335801765
Loss for  4828 th iteration => 0.132295658224
Loss for  4829 th iteration => 0.132254961683
Loss for  4830 th iteration => 0.132214459482
Loss for  4831 th iteration => 0.132174195451
Loss for  4832 th iteration => 0.132133365283
Loss for  4833 th iteration => 0.132093502997
Loss for  4834 th iteration => 0.132052372826
Loss for  4835 th iteration => 0.132012884389
Loss for  4836 th iteration => 0.131971479689
Loss for  4837 th iteration => 0.131932339797
Loss for  4838 th iteration => 0.131891300581
Loss for  4839 th iteration => 0.131851456695
Loss for  4840 th iteration => 0.131810807941
Loss for  4841 th iteration => 0.131770676282
Loss for  4842 th iteration => 0.131730512774
Loss for  4843 th iteration => 0.131690005176
Loss for  4844 th iteration => 0.131650290923
Loss for  4845 th iteration => 0.131609439793
Loss for  4846 th iteration => 0.131570142191
Loss for  4847 th iteration => 0.131528977056
Loss for  4848 th iteration => 0.131490066552
Loss for  4849 th iteration => 0.131449314106
Loss for  4850 th iteration => 0.13140963626
Loss for  4851 th iteration => 0.131369274759
Loss for  4852 th iteration => 0.131329286557
Loss for  4853 th iteration => 0.131289439447
Loss for  4854 th iteration => 0.131249048903
Loss for  4855 th iteration => 0.131209677208
Loss for  4856 th iteration => 0.131168919222
Loss for  4857 th iteration => 0.131129987649
Loss for  4858 th iteration => 0.131089125066
Loss for  4859 th iteration => 0.131050131674
Loss for  4860 th iteration => 0.131009821601
Loss for  4861 th iteration => 0.130970116917
Loss for  4862 th iteration => 0.130930362331
Loss for  4863 th iteration => 0.130890218525
Loss for  4864 th iteration => 0.130850976351
Loss for  4865 th iteration => 0.13081043171
Loss for  4866 th iteration => 0.130771662963
Loss for  4867 th iteration => 0.130730988423
Loss for  4868 th iteration => 0.130692191205
Loss for  4869 th iteration => 0.130652093229
Loss for  4870 th iteration => 0.130612520963
Loss for  4871 th iteration => 0.130573003709
Loss for  4872 th iteration => 0.130532969835
Loss for  4873 th iteration => 0.130493987646
Loss for  4874 th iteration => 0.130453532546
Loss for  4875 th iteration => 0.130415044113
Loss for  4876 th iteration => 0.130374752115
Loss for  4877 th iteration => 0.130335835095
Loss for  4878 th iteration => 0.130296051391
Loss for  4879 th iteration => 0.130256515492
Loss for  4880 th iteration => 0.130217327336
Loss for  4881 th iteration => 0.130177316616
Loss for  4882 th iteration => 0.130138676771
Loss for  4883 th iteration => 0.13009823288
Loss for  4884 th iteration => 0.130060146733
Loss for  4885 th iteration => 0.130020125184
Loss for  4886 th iteration => 0.129981035232
Loss for  4887 th iteration => 0.129941687688
Loss for  4888 th iteration => 0.129902075705
Loss for  4889 th iteration => 0.129863325024
Loss for  4890 th iteration => 0.129823236905
Loss for  4891 th iteration => 0.129785035579
Loss for  4892 th iteration => 0.129745112333
Loss for  4893 th iteration => 0.129706492612
Loss for  4894 th iteration => 0.129667079033
Loss for  4895 th iteration => 0.129627775919
Loss for  4896 th iteration => 0.129588999865
Loss for  4897 th iteration => 0.129549184953
Loss for  4898 th iteration => 0.129510994877
Loss for  4899 th iteration => 0.129471031068
Loss for  4900 th iteration => 0.129432850813
Loss for  4901 th iteration => 0.129393499201
Loss for  4902 th iteration => 0.129354385751
Loss for  4903 th iteration => 0.129315699855
Loss for  4904 th iteration => 0.129276050035
Loss for  4905 th iteration => 0.129237975468
Loss for  4906 th iteration => 0.129198140342
Loss for  4907 th iteration => 0.129160124783
Loss for  4908 th iteration => 0.129120926116
Loss for  4909 th iteration => 0.129081917896
Loss for  4910 th iteration => 0.129043404109
Loss for  4911 th iteration => 0.129003842886
Loss for  4912 th iteration => 0.128965957621
Loss for  4913 th iteration => 0.128926372787
Loss for  4914 th iteration => 0.128888325639
Loss for  4915 th iteration => 0.128849344091
Loss for  4916 th iteration => 0.128810381573
Loss for  4917 th iteration => 0.128772097551
Loss for  4918 th iteration => 0.128732571078
Loss for  4919 th iteration => 0.128694926889
Loss for  4920 th iteration => 0.128655679565
Loss for  4921 th iteration => 0.128617460961
Loss for  4922 th iteration => 0.128578741354
Loss for  4923 th iteration => 0.128539782853
Loss for  4924 th iteration => 0.128501768722
Loss for  4925 th iteration => 0.128462239381
Loss for  4926 th iteration => 0.128424872158
Loss for  4927 th iteration => 0.128386024755
Loss for  4928 th iteration => 0.128347535385
Loss for  4929 th iteration => 0.12830910859
Loss for  4930 th iteration => 0.128270125209
Loss for  4931 th iteration => 0.128232408469
Loss for  4932 th iteration => 0.128192948694
Loss for  4933 th iteration => 0.128155691899
Loss for  4934 th iteration => 0.128116995122
Loss for  4935 th iteration => 0.128078424771
Loss for  4936 th iteration => 0.128040487664
Loss for  4937 th iteration => 0.128001300153
Loss for  4938 th iteration => 0.127964058687
Loss for  4939 th iteration => 0.127925318887
Loss for  4940 th iteration => 0.12788727619
Loss for  4941 th iteration => 0.12784904523
Loss for  4942 th iteration => 0.127810292241
Loss for  4943 th iteration => 0.127772808405
Loss for  4944 th iteration => 0.127733631368
Loss for  4945 th iteration => 0.127696537219
Loss for  4946 th iteration => 0.127658129678
Loss for  4947 th iteration => 0.127619702821
Loss for  4948 th iteration => 0.127582082681
Loss for  4949 th iteration => 0.12754301518
Loss for  4950 th iteration => 0.12750611553
Loss for  4951 th iteration => 0.127467839153
Loss for  4952 th iteration => 0.12742967139
Loss for  4953 th iteration => 0.127391824151
Loss for  4954 th iteration => 0.127353128608
Loss for  4955 th iteration => 0.127316046514
Loss for  4956 th iteration => 0.127277433629
Loss for  4957 th iteration => 0.127240052451
Loss for  4958 th iteration => 0.127202080492
Loss for  4959 th iteration => 0.127163662695
Loss for  4960 th iteration => 0.12712649071
Loss for  4961 th iteration => 0.127087689499
Loss for  4962 th iteration => 0.127050855318
Loss for  4963 th iteration => 0.127012839629
Loss for  4964 th iteration => 0.126974625523
Loss for  4965 th iteration => 0.126937436784
Loss for  4966 th iteration => 0.126898564781
Loss for  4967 th iteration => 0.126862088338
Loss for  4968 th iteration => 0.126824092814
Loss for  4969 th iteration => 0.126786024238
Loss for  4970 th iteration => 0.12674887638
Loss for  4971 th iteration => 0.126710116707
Loss for  4972 th iteration => 0.126673942856
Loss for  4973 th iteration => 0.126635805774
Loss for  4974 th iteration => 0.126597951281
Loss for  4975 th iteration => 0.126560773727
Loss for  4976 th iteration => 0.126522206029
Loss for  4977 th iteration => 0.126485886197
Loss for  4978 th iteration => 0.126448007761
Loss for  4979 th iteration => 0.126410309887
Loss for  4980 th iteration => 0.126373159876
Loss for  4981 th iteration => 0.126334731934
Loss for  4982 th iteration => 0.126298406623
Loss for  4983 th iteration => 0.126260692693
Loss for  4984 th iteration => 0.12622310632
Loss for  4985 th iteration => 0.126186028866
Loss for  4986 th iteration => 0.126147699702
Loss for  4987 th iteration => 0.126111479818
Loss for  4988 th iteration => 0.126073855871
Loss for  4989 th iteration => 0.126036345633
Loss for  4990 th iteration => 0.125999376021
Loss for  4991 th iteration => 0.125961113537
Loss for  4992 th iteration => 0.125925086489
Loss for  4993 th iteration => 0.125887493534
Loss for  4994 th iteration => 0.12585003177
Loss for  4995 th iteration => 0.125813197553
Loss for  4996 th iteration => 0.125774976667
Loss for  4997 th iteration => 0.125739211221
Loss for  4998 th iteration => 0.125701602562
Loss for  4999 th iteration => 0.125664167698

Prediction

In [19]:
##Predictions


def prediction(X,W1,b1,W2,b2):
    Z1pred = np.dot(W1,X) + b1
    A1pred = relu(Z1pred,play = "forward")
    Z2pred = np.dot(W2,A1pred) + b2
    A2pred = sigmoid(Z2pred,play = "forward")
    prediction = []


    for i in range(A2pred.shape[1]):
        if (A2pred[0][i]>0.5):
            prediction.append(1)
        elif (A2pred[0][i] <= 0.5):
            prediction.append(0)
    N = len(prediction)
    prediction = np.array(prediction)
    prediction =prediction.reshape(1,N)
    
    return prediction

Prediction for Train data

In [20]:
predictionTrain = prediction(X,W1,b1,W2,b2)
In [21]:
Truelabel = y

Predicted Label and True label

In [13]:
print "prediction label for traindata:",predictionTrain
print "True label for train data:", Truelabel
prediction label for traindata: [[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
  0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
  1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1]]
True label for train data: [[ 0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.
   0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.
   0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  1.  1.  1.  1.
   1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
   1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
   1.  1.  1.  1.  1.  1.  1.  1.  1.  1.]]

(We should actually predict for unseen data which is not in our training set. Creation of test data and prediction is given below)

Creating Test data

In [22]:
Tc1 = [3.5,4]
Tc2 = [11.5,12]
TClass1 = np.matlib.repmat(Tc1, no,1) + np.random.randn(no,len(c1))
TClass2 = np.matlib.repmat(Tc2, no,1) + np.random.randn(no,len(c2))
TData = np.append(TClass1,TClass2,axis = 0)
Testlabel  = np.append(np.zeros((no,1)),np.ones((no,1)),axis = 0)
X2 = TData.T
y2 = Testlabel.T

Plotting the test data

In [23]:
import matplotlib.pyplot as plt
plt.plot(TClass1[:,0],TClass1[:,1],'ro')
plt.plot(TClass2[:,0],TClass2[:,1],'bo')

plt.ylabel('Data')
plt.show()

Prediction for Test data

In [24]:
predictionTest = prediction(X2,W1,b1,W2,b2)
print "prediction label for testndata:",predictionTest
print "True label for test data:",y2
prediction label for testndata: [[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
  0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
  1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1]]
True label for test data: [[ 0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.
   0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.
   0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  1.  1.  1.  1.
   1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
   1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
   1.  1.  1.  1.  1.  1.  1.  1.  1.  1.]]

Comments