Plot a training loss graph

Hi,
I used DeepSpeech, ran the training to the end, and got the WER result.
I saved the run log in a text file, like the one below.
I need to plot the train loss and dev loss. How can I do this with the text file that I saved after I finished my run?

Please don’t post images.

  1. Copy the values manually into Excel or something similar.

  2. Use TensorBoard for the TF logs.

Thanks,
next time I will be careful about images.
But how can I use TensorBoard? Is there a command?
Please guide me through the steps.

Please take the time to read the links we provide …

Which link?
The link to the policy?

The “How to ask questions here” link that you just clicked. It talks about searching before asking. Please do so, and ask again if anything is still unclear.

Help yourself:

$ git grep -i tensorboard
training/deepspeech_training/util/flags.py:    f.DEFINE_string('summary_dir', '', 'target directory for TensorBoard summaries - defaults to directory "deepspeech/summaries" within user\'s data home specified by the XDG Base Directory Specification')
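
For example, something like this (just a sketch, not checked against your setup: `--summary_dir` is the flag shown above, but the training entry point, the other flags and `/path/to/summaries` are placeholders you have to adapt to your own DeepSpeech version and paths):

$ # train as you did before, but also write TensorBoard summaries
$ python DeepSpeech.py [your usual training flags] --summary_dir /path/to/summaries
$ # then point TensorBoard at the same directory
$ tensorboard --logdir /path/to/summaries

TensorBoard then serves the summaries in your browser, by default at http://localhost:6006.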

OK, I will try it manually in Excel.
But because the number of steps in validation (I have 11 steps) is not equal to the number of steps in training (I have 190 steps), I can't draw it well in Excel.
So can I take only the last step from each epoch for training (step #190) and validation (step #11), to be able to draw them in Excel?
Is this right? Like this:
epoch    dev_loss    train_loss
1        287.76      281.13

and continue like this for all the epochs.

Please read more about deep learning in a book, a website, … This is totally normal: you have more training material than validation material. And read about overfitting.

OK, that's normal and I know this.
I'm just asking about drawing in Excel.
I want to draw the train loss and validation loss in the same graph.
So is it right to take the last step in each epoch for train and validation?
That is my question.

I have no idea what you mean. At the end of each epoch you get a loss for train and dev. Those two values should be almost the same; if not, you have other problems. I therefore see no problem at all in drawing them in one graph.

Please post all your losses as a list if you need more help.

I want to take the loss value of step 189 in train for each epoch and the dev loss at step 11 in each epoch, then draw a graph with these values for each epoch in Excel.

My losses:

I STARTING
Epoch 0 Training Steps: 0 Loss: 0
Epoch 0 Training Steps: 1 Loss: 321.413727
Epoch 0 Training Steps: 2 Loss: 274.250114
Epoch 0 Training Steps: 3 Loss: 228.34049
Epoch 0 Training Steps: 4 Loss: 214.206993
Epoch 0 Training Steps: 5 Loss: 208.963528
Epoch 0 Training Steps: 6 Loss: 196.165075
Epoch 0 Training Steps: 7 Loss: 188.474908
Epoch 0 Training Steps: 8 Loss: 184.570635
Epoch 0 Training Steps: 9 Loss: 180.292511
Epoch 0 Training Steps: 10 Loss: 176.097978
Epoch 0 Training Steps: 11 Loss: 172.2453
Epoch 0 Training Steps: 12 Loss: 172.627689
Epoch 0 Training Steps: 13 Loss: 171.214926
Epoch 0 Training Steps: 14 Loss: 169.700439
Epoch 0 Training Steps: 15 Loss: 167.763425
Epoch 0 Training Steps: 16 Loss: 168.02154
Epoch 0 Training Steps: 17 Loss: 166.868439
Epoch 0 Training Steps: 18 Loss: 166.02983
Epoch 0 Training Steps: 19 Loss: 165.570595
Epoch 0 Training Steps: 20 Loss: 166.347802
Epoch 0 Training Steps: 21 Loss: 165.931118
Epoch 0 Training Steps: 22 Loss: 166.198478
Epoch 0 Training Steps: 23 Loss: 165.929836
Epoch 0 Training Steps: 24 Loss: 166.504939
Epoch 0 Training Steps: 25 Loss: 166.984387
Epoch 0 Training Steps: 26 Loss: 167.228056
Epoch 0 Training Steps: 27 Loss: 167.864794
Epoch 0 Training Steps: 28 Loss: 167.890928
Epoch 0 Training Steps: 29 Loss: 168.119032
Epoch 0 Training Steps: 30 Loss: 168.249385
Epoch 0 Training Steps: 31 Loss: 168.954264
Epoch 0 Training Steps: 32 Loss: 169.742381
Epoch 0 Training Steps: 33 Loss: 171.271692
Epoch 0 Training Steps: 34 Loss: 171.657708
Epoch 0 Training Steps: 35 Loss: 172.394131
Epoch 0 Training Steps: 36 Loss: 173.379232
Epoch 0 Training Steps: 37 Loss: 173.88664
Epoch 0 Training Steps: 38 Loss: 174.682091
Epoch 0 Training Steps: 39 Loss: 175.717712
Epoch 0 Training Steps: 40 Loss: 176.827482
Epoch 0 Training Steps: 41 Loss: 177.491054
Epoch 0 Training Steps: 42 Loss: 178.152844
Epoch 0 Training Steps: 43 Loss: 180.227668
Epoch 0 Training Steps: 44 Loss: 180.763937
Epoch 0 Training Steps: 45 Loss: 181.684411
Epoch 0 Training Steps: 46 Loss: 182.522912
Epoch 0 Training Steps: 47 Loss: 182.713796
Epoch 0 Training Steps: 48 Loss: 183.554088
Epoch 0 Training Steps: 49 Loss: 184.815238
Epoch 0 Training Steps: 50 Loss: 185.667325
Epoch 0 Training Steps: 51 Loss: 186.256064
Epoch 0 Training Steps: 52 Loss: 187.207739
Epoch 0 Training Steps: 53 Loss: 187.980089
Epoch 0 Training Steps: 54 Loss: 189.253979
Epoch 0 Training Steps: 55 Loss: 189.837776
Epoch 0 Training Steps: 56 Loss: 190.47533
Epoch 0 Training Steps: 57 Loss: 191.604262
Epoch 0 Training Steps: 58 Loss: 192.498031
Epoch 0 Training Steps: 59 Loss: 193.232261
Epoch 0 Training Steps: 60 Loss: 193.997013
Epoch 0 Training Steps: 61 Loss: 195.137949
Epoch 0 Training Steps: 62 Loss: 195.657982
Epoch 0 Training Steps: 63 Loss: 196.571748
Epoch 0 Training Steps: 64 Loss: 197.321499
Epoch 0 Training Steps: 65 Loss: 198.016298
Epoch 0 Training Steps: 66 Loss: 199.045432
Epoch 0 Training Steps: 67 Loss: 199.629425
Epoch 0 Training Steps: 68 Loss: 200.057538
Epoch 0 Training Steps: 69 Loss: 200.296114
Epoch 0 Training Steps: 70 Loss: 201.072869
Epoch 0 Training Steps: 71 Loss: 202.161726
Epoch 0 Training Steps: 72 Loss: 203.318399
Epoch 0 Training Steps: 73 Loss: 203.950737
Epoch 0 Training Steps: 74 Loss: 204.757916
Epoch 0 Training Steps: 75 Loss: 205.641084
Epoch 0 Training Steps: 76 Loss: 206.276845
Epoch 0 Training Steps: 77 Loss: 206.804886
Epoch 0 Training Steps: 78 Loss: 207.704927
Epoch 0 Training Steps: 79 Loss: 208.565614
Epoch 0 Training Steps: 80 Loss: 209.202545
Epoch 0 Training Steps: 81 Loss: 209.647832
Epoch 0 Training Steps: 82 Loss: 210.59161
Epoch 0 Training Steps: 83 Loss: 211.53423
Epoch 0 Training Steps: 84 Loss: 212.042662
Epoch 0 Training Steps: 85 Loss: 212.994805
Epoch 0 Training Steps: 86 Loss: 213.555281
Epoch 0 Training Steps: 87 Loss: 214.340172
Epoch 0 Training Steps: 88 Loss: 214.901834
Epoch 0 Training Steps: 89 Loss: 215.380082
Epoch 0 Training Steps: 90 Loss: 216.007
Epoch 0 Training Steps: 91 Loss: 216.689602
Epoch 0 Training Steps: 92 Loss: 217.729824
Epoch 0 Training Steps: 93 Loss: 218.747647
Epoch 0 Training Steps: 94 Loss: 219.555463
Epoch 0 Training Steps: 95 Loss: 220.19487
Epoch 0 Training Steps: 96 Loss: 220.981938
Epoch 0 Training Steps: 97 Loss: 221.674749
Epoch 0 Training Steps: 98 Loss: 222.2959
Epoch 0 Training Steps: 99 Loss: 222.708379
Epoch 0 Training Steps: 100 Loss: 223.552934
Epoch 0 Training Steps: 101 Loss: 224.617271
Epoch 0 Training Steps: 102 Loss: 225.368554
Epoch 0 Training Steps: 103 Loss: 225.779093
Epoch 0 Training Steps: 104 Loss: 226.112573
Epoch 0 Training Steps: 105 Loss: 227.088227
Epoch 0 Training Steps: 106 Loss: 228.005511
Epoch 0 Training Steps: 107 Loss: 228.628655
Epoch 0 Training Steps: 108 Loss: 228.989545
Epoch 0 Training Steps: 109 Loss: 229.576564
Epoch 0 Training Steps: 110 Loss: 230.278917
Epoch 0 Training Steps: 111 Loss: 230.986006
Epoch 0 Training Steps: 112 Loss: 232.104331
Epoch 0 Training Steps: 113 Loss: 232.767508
Epoch 0 Training Steps: 114 Loss: 233.622867
Epoch 0 Training Steps: 115 Loss: 234.132088
Epoch 0 Training Steps: 116 Loss: 234.709264
Epoch 0 Training Steps: 117 Loss: 235.154217
Epoch 0 Training Steps: 118 Loss: 235.8605
Epoch 0 Training Steps: 119 Loss: 236.522191
Epoch 0 Training Steps: 120 Loss: 237.046127
Epoch 0 Training Steps: 121 Loss: 237.819763
Epoch 0 Training Steps: 122 Loss: 238.326473
Epoch 0 Training Steps: 123 Loss: 238.806476
Epoch 0 Training Steps: 124 Loss: 239.872133
Epoch 0 Training Steps: 125 Loss: 240.310268
Epoch 0 Training Steps: 126 Loss: 241.310326
Epoch 0 Training Steps: 127 Loss: 241.970896
Epoch 0 Training Steps: 128 Loss: 242.311593
Epoch 0 Training Steps: 129 Loss: 242.914341
Epoch 0 Training Steps: 130 Loss: 243.5522
Epoch 0 Training Steps: 131 Loss: 244.06975
Epoch 0 Training Steps: 132 Loss: 244.686553
Epoch 0 Training Steps: 133 Loss: 245.319197
Epoch 0 Training Steps: 134 Loss: 246.175137
Epoch 0 Training Steps: 135 Loss: 246.983793
Epoch 0 Training Steps: 136 Loss: 247.68559
Epoch 0 Training Steps: 137 Loss: 248.654289
Epoch 0 Training Steps: 138 Loss: 248.971996
Epoch 0 Training Steps: 139 Loss: 249.722913
Epoch 0 Training Steps: 140 Loss: 250.562798
Epoch 0 Training Steps: 141 Loss: 251.229299
Epoch 0 Training Steps: 142 Loss: 252.088471
Epoch 0 Training Steps: 143 Loss: 252.475349
Epoch 0 Training Steps: 144 Loss: 253.000376
Epoch 0 Training Steps: 145 Loss: 253.820551
Epoch 0 Training Steps: 146 Loss: 254.543652
Epoch 0 Training Steps: 147 Loss: 255.558965
Epoch 0 Training Steps: 148 Loss: 256.409514
Epoch 0 Training Steps: 149 Loss: 256.8467
Epoch 0 Training Steps: 150 Loss: 257.505127
Epoch 0 Training Steps: 151 Loss: 258.138038
Epoch 0 Training Steps: 152 Loss: 258.769062
Epoch 0 Training Steps: 153 Loss: 259.715915
Epoch 0 Training Steps: 154 Loss: 260.289005
Epoch 0 Training Steps: 155 Loss: 261.089299
Epoch 0 Training Steps: 156 Loss: 261.823436
Epoch 0 Training Steps: 157 Loss: 262.680497
Epoch 0 Training Steps: 158 Loss: 263.436903
Epoch 0 Training Steps: 159 Loss: 264.142053
Epoch 0 Training Steps: 160 Loss: 264.542494
Epoch 0 Training Steps: 161 Loss: 265.214357
Epoch 0 Training Steps: 162 Loss: 265.863622
Epoch 0 Training Steps: 163 Loss: 266.352972
Epoch 0 Training Steps: 164 Loss: 266.940236
Epoch 0 Training Steps: 165 Loss: 267.257496
Epoch 0 Training Steps: 166 Loss: 267.829717
Epoch 0 Training Steps: 167 Loss: 268.251283
Epoch 0 Training Steps: 168 Loss: 268.727684
Epoch 0 Training Steps: 169 Loss: 269.405363
Epoch 0 Training Steps: 170 Loss: 270.175061
Epoch 0 Training Steps: 171 Loss: 270.617402
Epoch 0 Training Steps: 172 Loss: 271.275992
Epoch 0 Training Steps: 173 Loss: 271.883822
Epoch 0 Training Steps: 174 Loss: 272.443641
Epoch 0 Training Steps: 175 Loss: 272.816036
Epoch 0 Training Steps: 176 Loss: 273.475419
Epoch 0 Training Steps: 177 Loss: 273.821474
Epoch 0 Training Steps: 178 Loss: 274.248569
Epoch 0 Training Steps: 179 Loss: 274.570182
Epoch 0 Training Steps: 180 Loss: 275.009499
Epoch 0 Training Steps: 181 Loss: 275.850796
Epoch 0 Training Steps: 182 Loss: 276.580054
Epoch 0 Training Steps: 183 Loss: 276.538092
Epoch 0 Training Steps: 184 Loss: 276.722951
Epoch 0 Training Steps: 185 Loss: 277.158755
Epoch 0 Training Steps: 186 Loss: 277.767248
Epoch 0 Training Steps: 187 Loss: 278.458153
Epoch 0 Training Steps: 188 Loss: 279.663077
Epoch 0 Training Steps: 189 Loss: 281.13091
Epoch 0 Training Steps: 189 Loss: 281.13091
Epoch 0 Validation Steps: 0 Loss: 0
Epoch 0 Validation Steps: 1 Loss: 121.765717
Epoch 0 Validation Steps: 2 Loss: 145.615875
Epoch 0 Validation Steps: 3 Loss: 168.556376
Epoch 0 Validation Steps: 4 Loss: 188.541576
Epoch 0 Validation Steps: 5 Loss: 203.227158
Epoch 0 Validation Steps: 6 Loss: 215.930758
Epoch 0 Validation Steps: 7 Loss: 231.736989
Epoch 0 Validation Steps: 8 Loss: 245.324053
Epoch 0 Validation Steps: 9 Loss: 260.627809
Epoch 0 Validation Steps: 10 Loss: 273.784914
Epoch 0 Validation Steps: 11 Loss: 287.678149
Epoch 0 Validation Steps: 11 Loss: 287.678149
I Saved best to: /media/suhad/Backup/Female/Female_Model/check/best_dev-189
Epoch 1 Training Steps: 0 Loss: 0
Epoch 1 Training Steps: 1 Loss: 93.885338
Epoch 1 Training Steps: 2 Loss: 105.46907
Epoch 1 Training Steps: 3 Loss: 107.689229
Epoch 1 Training Steps: 4 Loss: 108.453352
Epoch 1 Training Steps: 5 Loss: 110.235846
Epoch 1 Training Steps: 6 Loss: 110.105855
Epoch 1 Training Steps: 7 Loss: 111.845933
Epoch 1 Training Steps: 8 Loss: 113.319835
Epoch 1 Training Steps: 9 Loss: 113.653712
Epoch 1 Training Steps: 10 Loss: 114.393449
Epoch 1 Training Steps: 11 Loss: 114.510318
Epoch 1 Training Steps: 12 Loss: 117.430259
Epoch 1 Training Steps: 13 Loss: 118.39091
Epoch 1 Training Steps: 14 Loss: 119.480721
Epoch 1 Training Steps: 15 Loss: 119.997499
Epoch 1 Training Steps: 16 Loss: 122.1861
Epoch 1 Training Steps: 17 Loss: 122.610662
Epoch 1 Training Steps: 18 Loss: 123.250743
Epoch 1 Training Steps: 19 Loss: 124.257875
Epoch 1 Training Steps: 20 Loss: 126.287601
Epoch 1 Training Steps: 21 Loss: 127.017939
Epoch 1 Training Steps: 22 Loss: 128.295131
Epoch 1 Training Steps: 23 Loss: 128.994421
Epoch 1 Training Steps: 24 Loss: 130.484928
Epoch 1 Training Steps: 25 Loss: 131.705511
Epoch 1 Training Steps: 26 Loss: 132.590434
Epoch 1 Training Steps: 27 Loss: 133.795081
Epoch 1 Training Steps: 28 Loss: 134.48608
Epoch 1 Training Steps: 29 Loss: 135.280229
Epoch 1 Training Steps: 30 Loss: 135.958721
Epoch 1 Training Steps: 31 Loss: 137.018428
Epoch 1 Training Steps: 32 Loss: 138.153359
Epoch 1 Training Steps: 33 Loss: 140.187495
Epoch 1 Training Steps: 34 Loss: 140.887085
Epoch 1 Training Steps: 35 Loss: 141.804683
Epoch 1 Training Steps: 36 Loss: 143.052915
Epoch 1 Training Steps: 37 Loss: 143.802977
Epoch 1 Training Steps: 38 Loss: 144.887164
Epoch 1 Training Steps: 39 Loss: 146.170584
Epoch 1 Training Steps: 40 Loss: 147.38827
Epoch 1 Training Steps: 41 Loss: 148.224903
Epoch 1 Training Steps: 42 Loss: 149.100671
Epoch 1 Training Steps: 43 Loss: 151.412516
Epoch 1 Training Steps: 44 Loss: 151.983169
Epoch 1 Training Steps: 45 Loss: 152.98084
Epoch 1 Training Steps: 46 Loss: 153.905083
Epoch 1 Training Steps: 47 Loss: 154.180877
Epoch 1 Training Steps: 48 Loss: 155.030491
Epoch 1 Training Steps: 49 Loss: 156.210837
Epoch 1 Training Steps: 50 Loss: 157.13859
Epoch 1 Training Steps: 51 Loss: 157.853126
Epoch 1 Training Steps: 52 Loss: 158.792322
Epoch 1 Training Steps: 53 Loss: 159.630391
Epoch 1 Training Steps: 54 Loss: 160.958859
Epoch 1 Training Steps: 55 Loss: 161.536199
Epoch 1 Training Steps: 56 Loss: 162.181854
Epoch 1 Training Steps: 57 Loss: 163.210989
Epoch 1 Training Steps: 58 Loss: 164.012153
Epoch 1 Training Steps: 59 Loss: 164.711438
Epoch 1 Training Steps: 60 Loss: 165.398431
Epoch 1 Training Steps: 61 Loss: 166.497156
Epoch 1 Training Steps: 62 Loss: 166.949616
Epoch 1 Training Steps: 63 Loss: 167.791386
Epoch 1 Training Steps: 64 Loss: 168.424391
Epoch 1 Training Steps: 65 Loss: 169.032524
Epoch 1 Training Steps: 66 Loss: 169.969604
Epoch 1 Training Steps: 67 Loss: 170.476334
Epoch 1 Training Steps: 68 Loss: 170.872876
Epoch 1 Training Steps: 69 Loss: 171.016887
Epoch 1 Training Steps: 70 Loss: 171.71477
Epoch 1 Training Steps: 71 Loss: 172.731642
Epoch 1 Training Steps: 72 Loss: 173.824774
Epoch 1 Training Steps: 73 Loss: 174.48142
Epoch 1 Training Steps: 74 Loss: 175.198805
Epoch 1 Training Steps: 75 Loss: 175.902579
Epoch 1 Training Steps: 76 Loss: 176.417507
Epoch 1 Training Steps: 77 Loss: 176.848157
Epoch 1 Training Steps: 78 Loss: 177.638632
Epoch 1 Training Steps: 79 Loss: 178.379598
Epoch 1 Training Steps: 80 Loss: 178.950134
Epoch 1 Training Steps: 81 Loss: 179.240046
Epoch 1 Training Steps: 82 Loss: 180.042554
Epoch 1 Training Steps: 83 Loss: 180.740483
Epoch 1 Training Steps: 84 Loss: 181.106959
Epoch 1 Training Steps: 85 Loss: 181.832513
Epoch 1 Training Steps: 86 Loss: 182.245962
Epoch 1 Training Steps: 87 Loss: 182.959026
Epoch 1 Training Steps: 88 Loss: 183.329708
Epoch 1 Training Steps: 89 Loss: 183.716621
Epoch 1 Training Steps: 90 Loss: 184.30786
Epoch 1 Training Steps: 91 Loss: 184.786527
Epoch 1 Training Steps: 92 Loss: 185.534353
Epoch 1 Training Steps: 93 Loss: 186.373447
Epoch 1 Training Steps: 94 Loss: 187.028669
Epoch 1 Training Steps: 95 Loss: 187.432783
Epoch 1 Training Steps: 96 Loss: 188.049592
Epoch 1 Training Steps: 97 Loss: 188.550222
Epoch 1 Training Steps: 98 Loss: 188.9246
Epoch 1 Training Steps: 99 Loss: 189.203066
Epoch 1 Training Steps: 100 Loss: 189.891383
Epoch 1 Training Steps: 101 Loss: 190.601805
Epoch 1 Training Steps: 102 Loss: 191.126005
Epoch 1 Training Steps: 103 Loss: 191.316733
Epoch 1 Training Steps: 104 Loss: 191.414879
Epoch 1 Training Steps: 105 Loss: 192.088687
Epoch 1 Training Steps: 106 Loss: 192.7903
Epoch 1 Training Steps: 107 Loss: 193.233029
Epoch 1 Training Steps: 108 Loss: 193.340453
Epoch 1 Training Steps: 109 Loss: 193.653095
Epoch 1 Training Steps: 110 Loss: 194.087426
Epoch 1 Training Steps: 111 Loss: 194.498201
Epoch 1 Training Steps: 112 Loss: 195.194921
Epoch 1 Training Steps: 113 Loss: 195.575221
Epoch 1 Training Steps: 114 Loss: 196.097759
Epoch 1 Training Steps: 115 Loss: 196.351289
Epoch 1 Training Steps: 116 Loss: 196.627336
Epoch 1 Training Steps: 117 Loss: 196.79859
Epoch 1 Training Steps: 118 Loss: 197.204077
Epoch 1 Training Steps: 119 Loss: 197.498634
Epoch 1 Training Steps: 120 Loss: 197.694531
Epoch 1 Training Steps: 121 Loss: 198.156382
Epoch 1 Training Steps: 122 Loss: 198.331369
Epoch 1 Training Steps: 123 Loss: 198.567763
Epoch 1 Training Steps: 124 Loss: 199.285493
Epoch 1 Training Steps: 125 Loss: 199.465777
Epoch 1 Training Steps: 126 Loss: 200.146315
Epoch 1 Training Steps: 127 Loss: 200.504977
Epoch 1 Training Steps: 128 Loss: 200.566802
Epoch 1 Training Steps: 129 Loss: 200.813883
Epoch 1 Training Steps: 130 Loss: 201.146574
Epoch 1 Training Steps: 131 Loss: 201.42598
Epoch 1 Training Steps: 132 Loss: 201.742304
Epoch 1 Training Steps: 133 Loss: 202.07935
Epoch 1 Training Steps: 134 Loss: 202.608607
Epoch 1 Training Steps: 135 Loss: 203.133457
Epoch 1 Training Steps: 136 Loss: 203.460926
Epoch 1 Training Steps: 137 Loss: 204.152738
Epoch 1 Training Steps: 138 Loss: 204.194313
Epoch 1 Training Steps: 139 Loss: 204.552941
Epoch 1 Training Steps: 140 Loss: 205.011426
Epoch 1 Training Steps: 141 Loss: 205.396051
Epoch 1 Training Steps: 142 Loss: 205.939877
Epoch 1 Training Steps: 143 Loss: 206.044201
Epoch 1 Training Steps: 144 Loss: 206.254487
Epoch 1 Training Steps: 145 Loss: 206.820629
Epoch 1 Training Steps: 146 Loss: 207.20973
Epoch 1 Training Steps: 147 Loss: 207.938738
Epoch 1 Training Steps: 148 Loss: 208.487425
Epoch 1 Training Steps: 149 Loss: 208.734159
Epoch 1 Training Steps: 150 Loss: 208.999833
Epoch 1 Training Steps: 151 Loss: 209.346942
Epoch 1 Training Steps: 152 Loss: 209.746165
Epoch 1 Training Steps: 153 Loss: 210.366259
Epoch 1 Training Steps: 154 Loss: 210.627233
Epoch 1 Training Steps: 155 Loss: 211.114068
Epoch 1 Training Steps: 156 Loss: 211.54022
Epoch 1 Training Steps: 157 Loss: 212.003948
Epoch 1 Training Steps: 158 Loss: 212.538191
Epoch 1 Training Steps: 159 Loss: 212.992242
Epoch 1 Training Steps: 160 Loss: 213.149705
Epoch 1 Training Steps: 161 Loss: 213.459446
Epoch 1 Training Steps: 162 Loss: 213.8256
Epoch 1 Training Steps: 163 Loss: 214.084145
Epoch 1 Training Steps: 164 Loss: 214.363102
Epoch 1 Training Steps: 165 Loss: 214.455296
Epoch 1 Training Steps: 166 Loss: 214.709592
Epoch 1 Training Steps: 167 Loss: 214.927484
Epoch 1 Training Steps: 168 Loss: 215.082259
Epoch 1 Training Steps: 169 Loss: 215.497172
Epoch 1 Training Steps: 170 Loss: 215.870428
Epoch 1 Training Steps: 171 Loss: 216.07729
Epoch 1 Training Steps: 172 Loss: 216.439629
Epoch 1 Training Steps: 173 Loss: 216.653421
Epoch 1 Training Steps: 174 Loss: 216.89354
Epoch 1 Training Steps: 175 Loss: 216.931771
Epoch 1 Training Steps: 176 Loss: 217.252459
Epoch 1 Training Steps: 177 Loss: 217.30363
Epoch 1 Training Steps: 178 Loss: 217.359916
Epoch 1 Training Steps: 179 Loss: 217.33748
Epoch 1 Training Steps: 180 Loss: 217.492984
Epoch 1 Training Steps: 181 Loss: 217.845402
Epoch 1 Training Steps: 182 Loss: 218.099005
Epoch 1 Training Steps: 183 Loss: 217.969162
Epoch 1 Training Steps: 184 Loss: 218.02442
Epoch 1 Training Steps: 185 Loss: 218.33145
Epoch 1 Training Steps: 186 Loss: 218.777045
Epoch 1 Training Steps: 187 Loss: 219.238011
Epoch 1 Training Steps: 188 Loss: 220.083989
Epoch 1 Training Steps: 189 Loss: 221.144483
Epoch 1 Training Steps: 189 Loss: 221.144483
Epoch 1 Validation Steps: 0 Loss: 0
Epoch 1 Validation Steps: 1 Loss: 113.519081
Epoch 1 Validation Steps: 2 Loss: 129.466358
Epoch 1 Validation Steps: 3 Loss: 146.661507
Epoch 1 Validation Steps: 4 Loss: 162.482805
Epoch 1 Validation Steps: 5 Loss: 173.632558
Epoch 1 Validation Steps: 6 Loss: 182.39616
Epoch 1 Validation Steps: 7 Loss: 195.113823
Epoch 1 Validation Steps: 8 Loss: 206.321427
Epoch 1 Validation Steps: 9 Loss: 218.254684
Epoch 1 Validation Steps: 10 Loss: 228.056385
Epoch 1 Validation Steps: 11 Loss: 238.205143
Epoch 1 Validation Steps: 11 Loss: 238.205143
WARNING:tensorflow:From /home/suhad/anaconda3/envs/tf_gpu/lib/python3.6/site-packages/tensorflow_core/python/training/saver.py:963: ... is deprecated and will be removed in a future version.
Instructions for updating:
Use standard file APIs to delete files with this prefix.
I Saved best to: /media/suhad/Backup/Female/Female_Model/check/best_dev-378
Epoch 2 Training Steps: 0 Loss: 0
Epoch 2 Training Steps: 1 Loss: 84.537727
Epoch 2 Training Steps: 2 Loss: 89.887283
Epoch 2 Training Steps: 3 Loss: 89.940051
Epoch 2 Training Steps: 4 Loss: 88.482643
Epoch 2 Training Steps: 5 Loss: 89.026488
Epoch 2 Training Steps: 6 Loss: 88.198677
Epoch 2 Training Steps: 7 Loss: 89.369774
Epoch 2 Training Steps: 8 Loss: 90.209833
Epoch 2 Training Steps: 9 Loss: 90.084576
Epoch 2 Training Steps: 10 Loss: 89.908315
Epoch 2 Training Steps: 11 Loss: 89.419215
Epoch 2 Training Steps: 12 Loss: 91.501235
Epoch 2 Training Steps: 13 Loss: 91.74459
Epoch 2 Training Steps: 14 Loss: 92.346613
Epoch 2 Training Steps: 15 Loss: 92.259706
Epoch 2 Training Steps: 16 Loss: 93.669928
Epoch 2 Training Steps: 17 Loss: 93.670913
Epoch 2 Training Steps: 18 Loss: 94.025137
Epoch 2 Training Steps: 19 Loss: 94.202892
Epoch 2 Training Steps: 20 Loss: 95.735138
Epoch 2 Training Steps: 21 Loss: 95.701679
Epoch 2 Training Steps: 22 Loss: 97.086191
Epoch 2 Training Steps: 23 Loss: 97.11573
Epoch 2 Training Steps: 24 Loss: 97.891394
Epoch 2 Training Steps: 25 Loss: 98.39599
Epoch 2 Training Steps: 26 Loss: 98.826638
Epoch 2 Training Steps: 27 Loss: 99.453126
Epoch 2 Training Steps: 28 Loss: 99.573188
Epoch 2 Training Steps: 29 Loss: 99.940886
Epoch 2 Training Steps: 30 Loss: 100.200798
Epoch 2 Training Steps: 31 Loss: 100.711968
Epoch 2 Training Steps: 32 Loss: 101.451385
Epoch 2 Training Steps: 33 Loss: 103.628692
Epoch 2 Training Steps: 34 Loss: 103.946555
Epoch 2 Training Steps: 35 Loss: 104.452206
Epoch 2 Training Steps: 36 Loss: 105.102262
Epoch 2 Training Steps: 37 Loss: 105.348009
Epoch 2 Training Steps: 38 Loss: 106.173072
Epoch 2 Training Steps: 39 Loss: 107.217326
Epoch 2 Training Steps: 40 Loss: 108.002983
Epoch 2 Training Steps: 41 Loss: 108.347418
Epoch 2 Training Steps: 42 Loss: 108.7728
Epoch 2 Training Steps: 43 Loss: 111.414766
Epoch 2 Training Steps: 44 Loss: 111.557681
Epoch 2 Training Steps: 45 Loss: 112.318907
Epoch 2 Training Steps: 46 Loss: 112.791566
Epoch 2 Training Steps: 47 Loss: 112.681681
Epoch 2 Training Steps: 48 Loss: 113.14829
Epoch 2 Training Steps: 49 Loss: 113.811397
Epoch 2 Training Steps: 50 Loss: 114.327677
Epoch 2 Training Steps: 51 Loss: 114.606676
Epoch 2 Training Steps: 52 Loss: 115.00661
Epoch 2 Training Steps: 53 Loss: 115.560557
Epoch 2 Training Steps: 54 Loss: 116.476157
Epoch 2 Training Steps: 55 Loss: 116.783333
Epoch 2 Training Steps: 56 Loss: 117.076862
Epoch 2 Training Steps: 57 Loss: 117.684257
Epoch 2 Training Steps: 58 Loss: 118.073152
Epoch 2 Training Steps: 59 Loss: 118.549344
Epoch 2 Training Steps: 60 Loss: 118.89284
Epoch 2 Training Steps: 61 Loss: 119.817285
Epoch 2 Training Steps: 62 Loss: 119.927378
Epoch 2 Training Steps: 63 Loss: 120.486283
Epoch 2 Training Steps: 64 Loss: 120.81123
Epoch 2 Training Steps: 65 Loss: 121.160249
Epoch 2 Training Steps: 66 Loss: 121.820687
Epoch 2 Training Steps: 67 Loss: 121.981956
Epoch 2 Training Steps: 68 Loss: 122.163768
Epoch 2 Training Steps: 69 Loss: 122.053853
Epoch 2 Training Steps: 70 Loss: 122.437831
Epoch 2 Training Steps: 71 Loss: 123.172638
Epoch 2 Training Steps: 72 Loss: 123.937284
Epoch 2 Training Steps: 73 Loss: 124.441416
Epoch 2 Training Steps: 74 Loss: 125.042134
Epoch 2 Training Steps: 75 Loss: 125.393838
Epoch 2 Training Steps: 76 Loss: 125.65218
Epoch 2 Training Steps: 77 Loss: 125.837172
Epoch 2 Training Steps: 78 Loss: 126.41847
Epoch 2 Training Steps: 79 Loss: 126.938837
Epoch 2 Training Steps: 80 Loss: 127.183979
Epoch 2 Training Steps: 81 Loss: 127.250914
Epoch 2 Training Steps: 82 Loss: 127.799975
Epoch 2 Training Steps: 83 Loss: 128.195802
Epoch 2 Training Steps: 84 Loss: 128.384886
Epoch 2 Training Steps: 85 Loss: 128.899599
Epoch 2 Training Steps: 86 Loss: 129.139158
Epoch 2 Training Steps: 87 Loss: 129.587692
Epoch 2 Training Steps: 88 Loss: 129.838109
Epoch 2 Training Steps: 89 Loss: 129.940831
Epoch 2 Training Steps: 90 Loss: 130.499837
Epoch 2 Training Steps: 91 Loss: 130.83491
Epoch 2 Training Steps: 92 Loss: 131.381455
Epoch 2 Training Steps: 93 Loss: 131.892342
Epoch 2 Training Steps: 94 Loss: 132.48936
Epoch 2 Training Steps: 95 Loss: 132.767396
Epoch 2 Training Steps: 96 Loss: 133.271364
Epoch 2 Training Steps: 97 Loss: 133.658829
Epoch 2 Training Steps: 98 Loss: 133.900596
Epoch 2 Training Steps: 99 Loss: 134.173488
Epoch 2 Training Steps: 100 Loss: 134.868402
Epoch 2 Training Steps: 101 Loss: 135.402219
Epoch 2 Training Steps: 102 Loss: 135.795759
Epoch 2 Training Steps: 103 Loss: 135.93179
Epoch 2 Training Steps: 104 Loss: 135.95868
Epoch 2 Training Steps: 105 Loss: 136.493353
Epoch 2 Training Steps: 106 Loss: 137.098573
Epoch 2 Training Steps: 107 Loss: 137.55074
Epoch 2 Training Steps: 108 Loss: 137.610922
Epoch 2 Training Steps: 109 Loss: 137.837417
Epoch 2 Training Steps: 110 Loss: 138.146239
Epoch 2 Training Steps: 111 Loss: 138.472342
Epoch 2 Training Steps: 112 Loss: 138.98538
Epoch 2 Training Steps: 113 Loss: 139.269186
Epoch 2 Training Steps: 114 Loss: 139.697878
Epoch 2 Training Steps: 115 Loss: 139.986969
Epoch 2 Training Steps: 116 Loss: 140.149149
Epoch 2 Training Steps: 117 Loss: 140.321091
Epoch 2 Training Steps: 118 Loss: 140.635578
Epoch 2 Training Steps: 119 Loss: 140.864992
Epoch 2 Training Steps: 120 Loss: 141.021265
Epoch 2 Training Steps: 121 Loss: 141.427827
Epoch 2 Training Steps: 122 Loss: 141.51486
Epoch 2 Training Steps: 123 Loss: 141.737867
Epoch 2 Training Steps: 124 Loss: 142.372535
Epoch 2 Training Steps: 125 Loss: 142.469672
Epoch 2 Training Steps: 126 Loss: 143.115664
Epoch 2 Training Steps: 127 Loss: 143.392886
Epoch 2 Training Steps: 128 Loss: 143.439242
Epoch 2 Training Steps: 129 Loss: 143.637522
Epoch 2 Training Steps: 130 Loss: 143.877761
Epoch 2 Training Steps: 131 Loss: 144.155935
Epoch 2 Training Steps: 132 Loss: 144.415318
Epoch 2 Training Steps: 133 Loss: 144.692758
Epoch 2 Training Steps: 134 Loss: 145.158993
Epoch 2 Training Steps: 135 Loss: 145.647907
Epoch 2 Training Steps: 136 Loss: 145.943614
Epoch 2 Training Steps: 137 Loss: 146.564897
Epoch 2 Training Steps: 138 Loss: 146.611113
Epoch 2 Training Steps: 139 Loss: 146.898208
Epoch 2 Training Steps: 140 Loss: 147.298507
Epoch 2 Training Steps: 141 Loss: 147.654377
Epoch 2 Training Steps: 142 Loss: 148.123632
Epoch 2 Training Steps: 143 Loss: 148.208316
Epoch 2 Training Steps: 144 Loss: 148.432006
Epoch 2 Training Steps: 145 Loss: 148.926877
Epoch 2 Training Steps: 146 Loss: 149.2973
Epoch 2 Training Steps: 147 Loss: 149.962235
Epoch 2 Training Steps: 148 Loss: 150.536769
Epoch 2 Training Steps: 149 Loss: 150.774858
Epoch 2 Training Steps: 150 Loss: 151.00682
Epoch 2 Training Steps: 151 Loss: 151.367225
Epoch 2 Training Steps: 152 Loss: 151.702883
Epoch 2 Training Steps: 153 Loss: 152.304684
Epoch 2 Training Steps: 154 Loss: 152.489052
Epoch 2 Training Steps: 155 Loss: 152.957764
Epoch 2 Training Steps: 156 Loss: 153.407811
Epoch 2 Training Steps: 157 Loss: 153.818284
Epoch 2 Training Steps: 158 Loss: 154.39387
Epoch 2 Training Steps: 159 Loss: 154.839447
Epoch 2 Training Steps: 160 Loss: 155.019265
Epoch 2 Training Steps: 161 Loss: 155.311042
Epoch 2 Training Steps: 162 Loss: 155.708984
Epoch 2 Training Steps: 163 Loss: 156.052133
Epoch 2 Training Steps: 164 Loss: 156.314715
Epoch 2 Training Steps: 165 Loss: 156.419458
Epoch 2 Training Steps: 166 Loss: 156.700117
Epoch 2 Training Steps: 167 Loss: 156.962358
Epoch 2 Training Steps: 168 Loss: 157.137838
Epoch 2 Training Steps: 169 Loss: 157.562598
Epoch 2 Training Steps: 170 Loss: 157.906284
Epoch 2 Training Steps: 171 Loss: 158.107699
Epoch 2 Training Steps: 172 Loss: 158.489592
Epoch 2 Training Steps: 173 Loss: 158.703795
Epoch 2 Training Steps: 174 Loss: 158.95709
Epoch 2 Training Steps: 175 Loss: 159.012622
Epoch 2 Training Steps: 176 Loss: 159.322996
Epoch 2 Training Steps: 177 Loss: 159.424613
Epoch 2 Training Steps: 178 Loss: 159.502893
Epoch 2 Training Steps: 179 Loss: 159.515867
Epoch 2 Training Steps: 180 Loss: 159.730942
Epoch 2 Training Steps: 181 Loss: 160.017307
Epoch 2 Training Steps: 182 Loss: 160.260526
Epoch 2 Training Steps: 183 Loss: 160.241578
Epoch 2 Training Steps: 184 Loss: 160.376199
Epoch 2 Training Steps: 185 Loss: 160.738875
Epoch 2 Training Steps: 186 Loss: 161.252483
Epoch 2 Training Steps: 187 Loss: 161.737454
Epoch 2 Training Steps: 188 Loss: 162.518582
Epoch 2 Training Steps: 189 Loss: 163.543491
Epoch 2 Training Steps: 189 Loss: 163.543491
Epoch 2 Validation Steps: 0 Loss: 0
Epoch 2 Validation Steps: 1 Loss: 102.870201
Epoch 2 Validation Steps: 2 Loss: 115.629292
Epoch 2 Validation Steps: 3 Loss: 129.61867
Epoch 2 Validation Steps: 4 Loss: 143.766481
Epoch 2 Validation Steps: 5 Loss: 153.634288
Epoch 2 Validation Steps: 6 Loss: 161.297087
Epoch 2 Validation Steps: 7 Loss: 172.887999
Epoch 2 Validation Steps: 8 Loss: 182.471589
Epoch 2 Validation Steps: 9 Loss: 192.893213
Epoch 2 Validation Steps: 10 Loss: 201.555518
Epoch 2 Validation Steps: 11 Loss: 210.967455
Epoch 2 Validation Steps: 11 Loss: 210.967455
I Saved best to: /media/suhad/Backup/Female/Female_Model/check/best_dev-567
Epoch 3 Training Steps: 0 Loss: 0
Epoch 3 Training Steps: 1 Loss: 72.287598
Epoch 3 Training Steps: 2 Loss: 75.019886
Epoch 3 Training Steps: 3 Loss: 73.277911
Epoch 3

I listed only the first part,
because it exceeds the maximum length allowed here in a post.

Just take the last value of train and dev for each epoch as you described above.
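
If copying the values by hand gets tedious, a small script can pull them out of the saved log for you. This is only a sketch, under the assumption that your log lines look exactly like the ones pasted above ("Epoch N Training Steps: S Loss: X" and "Epoch N Validation Steps: S Loss: X") and that the file is named train_run.log (change that to your file name). It keeps the last loss reported for each epoch and writes a losses.csv you can open in Excel:

import csv
import re

# Matches lines such as "Epoch 0 Training Steps: 189 Loss: 281.13091"
# and "Epoch 0 Validation Steps: 11 Loss: 287.678149".
LINE_RE = re.compile(r"Epoch (\d+) (Training|Validation) Steps: \d+ Loss: ([\d.]+)")

train_loss = {}  # epoch -> last training loss seen for that epoch
dev_loss = {}    # epoch -> last validation loss seen for that epoch

with open("train_run.log") as log:  # change to the name of your saved log file
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        epoch = int(match.group(1))
        loss = float(match.group(3))
        if match.group(2) == "Training":
            train_loss[epoch] = loss  # later steps overwrite earlier ones
        else:
            dev_loss[epoch] = loss

# One row per epoch; open losses.csv in Excel, select the three columns
# and insert a line chart to get train and dev loss in the same graph.
with open("losses.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["epoch", "train_loss", "dev_loss"])
    for epoch in sorted(train_loss):
        writer.writerow([epoch, train_loss[epoch], dev_loss.get(epoch, "")])

That gives exactly the last reported value per epoch, as described above.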

@othiele
And will this be OK and correct?
Or should I take the average of all the steps for each epoch in train and dev and then draw?
Which one is correct?

@lissyx
Hi,
I am sorry to annoy you, but do you have an answer to my question?

I want to take the loss value of the last step in train for each epoch and the dev loss value of the last step in each epoch, then draw a graph with these values for each epoch in Excel.
Or should I take the average of all the steps for each epoch in train and dev and then draw?
Which is correct?