Neural Networks

1. Basics

A neural network is a broadly parallel, interconnected network of simple units with adaptive capability.

Perceptron

A perceptron consists of only two layers of neurons, and only the output-layer neurons are M-P units, i.e. functional neurons.

The back-propagation (BP) algorithm can be applied to multi-layer feedforward neural networks, and it can also be used to train recurrent neural networks.

When people speak of a "BP network", they usually mean a multi-layer feedforward neural network trained with the BP algorithm.

Basic deep-learning terms

Convolutional neural network (CNN)

A CNN stacks multiple convolutional layers and sub-sampling (pooling) layers to process the input signal, and finally maps to the output targets in the fully connected layers.

Convolutional layer: contains several feature maps; each feature map is a plane made up of many neurons.

Sub-sampling (pooling) layer: sub-samples based on the principle of local correlation, reducing the amount of data while keeping the useful information.

Seen from another angle, this lets the machine take over the "feature engineering" that used to be done by human experts.

Activation functions

1. Logistic: the typical sigmoid activation function, very useful when computing classification probabilities.

$f(z)=\dfrac{1}{1+e^{-z}}, \quad 0<f(z)<1$

2. Tanh:

$f(z)=\tanh(z)=\dfrac{e^{z}-e^{-z}}{e^{z}+e^{-z}}, \quad -1<f(z)<1$

3. ReLU: the rectified linear unit. Its main purpose is to fight the vanishing-gradient problem: by the time the gradient has been back-propagated to the first layer, it tends to shrink towards 0 or a very small value.

$f(z)=\max(0,z)$
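
For quick reference, a minimal NumPy sketch of the three activation functions (the function names and the test values are my own choices):

```python
import numpy as np

def sigmoid(z):
    # logistic function: squashes z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # hyperbolic tangent: squashes z into (-1, 1)
    return np.tanh(z)

def relu(z):
    # rectified linear unit: keeps positive values, zeroes out the rest
    return np.maximum(0.0, z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z), tanh(z), relu(z))
```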

Convolutional neural networks (CNN)

Convolution: the fusion of two functions along the time dimension.

$(f*g)(t)=\int_{-\infty}^{\infty} f(\tau)\,g(t-\tau)\,d\tau$

Convolution also extends to the discrete domain, where it is written as

$(f*g)[n]=\sum_{m=-\infty}^{\infty} f(m)\,g(n-m)$

The most important ingredient of the convolution operation is the kernel: the kernel is multiplied element-wise with each local patch of the input and the products are summed, and the result becomes one element of the next layer.
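
As a sanity check of the formulas above, here is a minimal NumPy sketch: `np.convolve` for the 1-D discrete case, and a small hand-rolled loop for the kernel-times-patch-and-sum operation that a convolutional layer performs (the toy input and kernel values are made up for illustration):

```python
import numpy as np

# 1-D discrete convolution: (f*g)[n] = sum_m f[m] * g[n-m]
f = np.array([1.0, 2.0, 3.0])
g = np.array([0.5, 0.5])
print(np.convolve(f, g))                 # [0.5 1.5 2.5 1.5]

# 2-D case as used in a CNN layer: slide the kernel over the input,
# multiply element-wise with each patch and sum (cross-correlation)
def conv2d_valid(x, k):
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

x = np.arange(16.0).reshape(4, 4)        # a 4x4 "feature map"
k = np.array([[1.0, 0.0], [0.0, -1.0]])  # a 2x2 kernel
print(conv2d_valid(x, k))                # a 3x3 feature map for the next layer
```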

2. Main ideas

Training a network means adjusting the connection weights between neurons, as well as the threshold of each functional neuron, according to the training data.

In other words, everything a neural network learns is contained in its connection weights and thresholds.

The parameters are determined by iterative updates that adjust the weights of the perceptron (or network).

$\omega_i \leftarrow \omega_i + \Delta\omega_i$

$\Delta\omega_i = \eta\,(y-\hat{y})\,x_i$
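
A minimal sketch of this update rule on one training example (the step activation, learning rate and toy data are my own illustrative choices):

```python
import numpy as np

def perceptron_step(w, b, x, y, eta=0.1):
    # forward pass with a step activation
    y_hat = 1.0 if np.dot(w, x) + b > 0 else 0.0
    # w_i <- w_i + eta * (y - y_hat) * x_i ; the bias/threshold is updated likewise
    w = w + eta * (y - y_hat) * x
    b = b + eta * (y - y_hat)
    return w, b

w, b = np.zeros(2), 0.0
w, b = perceptron_step(w, b, x=np.array([1.0, 1.0]), y=1.0)
print(w, b)   # the weights move towards classifying this example correctly
```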

First the input example is fed to the input-layer neurons and the signal is propagated forward layer by layer until the output layer produces its result;

then the error of the output layer is computed and propagated backwards to the hidden-layer neurons;

finally the connection weights and thresholds are adjusted according to the errors of the hidden-layer neurons, and the whole procedure is iterated.

3. Derivation

BP algorithm:

Training set
$D=\{(x_1,y_1),(x_2,y_2),\dots,(x_m,y_m)\}$


Input: d attributes (d input neurons)

Output: an l-dimensional real-valued vector; output neuron j has threshold $\theta_j$

Hidden layer: q hidden neurons; hidden neuron h has threshold $\gamma_h$

With $\alpha_h=\sum_{i=1}^{d} v_{ih}\,x_i$ the input received by hidden neuron h and $\beta_j=\sum_{h=1}^{q} w_{hj}\,b_h$ the input received by output neuron j, the layer outputs are

$b_h=f(\alpha_h-\gamma_h)$

$\hat{y}_j=f(\beta_j-\theta_j)$

The update rule for any parameter v has the general form

$v \leftarrow v + \Delta v$

BP adjusts the parameters with a gradient-descent strategy.

Background: gradient descent

Gradient descent is a commonly used first-order optimization method, and one of the simplest and most classical methods for unconstrained optimization.

If f(x) is continuously differentiable and we can construct a sequence that satisfies

$f(x^{t+1}) < f(x^{t}), \quad t=0,1,2,\dots$

then repeating the process converges to a local minimum. By the first-order Taylor expansion,

$f(x+\Delta x) \approx f(x) + \Delta x^{\mathrm{T}}\,\nabla f(x)$

so to make $f(x+\Delta x) < f(x)$ we can choose

$\Delta x = -\gamma\,\nabla f(x)$, where $\gamma$ is a small positive step size (the learning rate).
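
A minimal sketch of this update on a toy quadratic function (the function, step size and iteration count are arbitrary choices for illustration):

```python
import numpy as np

def grad_descent(grad, x0, gamma=0.1, steps=100):
    # repeatedly take a small step against the gradient
    x = x0
    for _ in range(steps):
        x = x - gamma * grad(x)
    return x

# f(x) = (x1 - 3)^2 + (x2 + 1)^2  has gradient  2 * (x - [3, -1])
grad = lambda x: 2.0 * (x - np.array([3.0, -1.0]))
print(grad_descent(grad, np.zeros(2)))   # converges towards [3, -1]
```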
Objective function on training example k: $E_k=\frac{1}{2}\sum_{j=1}^{l}\bigl(\hat{y}_j^{k}-y_j^{k}\bigr)^2$. To minimize it, we derive the update formula for $\Delta v_{ih}$ by differentiating the objective with the chain rule. With $g_j=\hat{y}_j^{k}(1-\hat{y}_j^{k})(y_j^{k}-\hat{y}_j^{k})$ the gradient term of output neuron j, the gradient term of hidden neuron h is

$e_h=-\dfrac{\partial E_k}{\partial b_h}\cdot\dfrac{\partial b_h}{\partial \alpha_h}=-\sum_{j=1}^{l}\dfrac{\partial E_k}{\partial \beta_j}\cdot\dfrac{\partial \beta_j}{\partial b_h}\,f'(\alpha_h-\gamma_h)=\sum_{j=1}^{l} w_{hj}\,g_j\,f'(\alpha_h-\gamma_h)=b_h(1-b_h)\sum_{j=1}^{l} w_{hj}\,g_j$

and the resulting update is $\Delta v_{ih}=\eta\,e_h\,x_i$.

The hidden layer and the output layer use the same (sigmoid) activation function.
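
A one-line check of why the derivative takes the form "output times (1 - output)" for the sigmoid used here:

$f(x)=\dfrac{1}{1+e^{-x}} \;\Longrightarrow\; f'(x)=\dfrac{e^{-x}}{(1+e^{-x})^{2}}=f(x)\bigl(1-f(x)\bigr)$

so $f'(\alpha_h-\gamma_h)=b_h(1-b_h)$ and $f'(\beta_j-\theta_j)=\hat{y}_j(1-\hat{y}_j)$, which is exactly the factor appearing in $g_j$ and $e_h$.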
Global minimum & local minimum

The whole algorithm is really a parameter-search procedure: finding a (near-)optimal set of parameters.

4. Implementation

4.1 BP algorithm: train a single-hidden-layer neural network on watermelon dataset 3.0

Pseudocode:

    Input:   training set D, learning rate η
    Process:
    1. randomly initialize all connection weights and thresholds in (0, 1)
    2. repeat
    3.   for all (x_k, y_k) in D do
    4.       compute the output of the current sample from the current parameters
    5.       compute the gradient term g_j of each output-layer neuron
    6.       compute the gradient term e_h of each hidden-layer neuron
    7.       update the connection weights and thresholds
    8.   end for
    9. until the stopping condition is reached
    Output:  the multi-layer feedforward network defined by the learned connection weights and thresholds

Note the distinction between standard BP and accumulated BP (accumulated error backpropagation):

Accumulated BP: updates the parameters only after reading through the whole training set once.
Standard BP: updates the parameters after each single training example.
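
A self-contained toy illustration of the difference, on a one-parameter least-squares model (the data, learning rate and model are made up; the real network update is implemented by BackPropagate/TrainStandard further below):

```python
import numpy as np

# toy data and a one-parameter "network": y_hat = w * x, squared-error loss
X = np.array([1.0, 2.0, 3.0])
Y = np.array([2.0, 4.0, 6.0])

def grad(w, x, y):                       # d/dw of 0.5 * (w*x - y)^2
    return (w * x - y) * x

# standard BP: update immediately after every single sample
w_std = 0.0
for x, y in zip(X, Y):
    w_std -= 0.1 * grad(w_std, x, y)

# accumulated BP: read through the whole training set first, then update once
w_acc = 0.0
g = np.mean([grad(w_acc, x, y) for x, y in zip(X, Y)])
w_acc -= 0.1 * g

print(w_std, w_acc)                      # both move towards w = 2, along different paths
```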
```python
# input() 函数: read watermelon dataset 3.0
# (note: this shadows the Python builtin input())
def input():
    """
    @param  : none, or a file path
    @return : the dataset as a pandas DataFrame
    """
    try:
        import pandas as pd
    except ImportError:
        print("module import error")
    with open('/home/dengshuo/GithubCode/ML/CH05/watermelon3.csv') as data_file:
        df = pd.read_csv(data_file)
    return df
```
```python
# learningRatio() 函数: initialize the learning rate
def learningRatio():
    """
    @return : learningRatio, a random float in (0, 1) from random.uniform()
    """
    try:
        import random
    except ImportError:
        print('module import error')
    learningRatio = random.uniform(0, 1)
    return learningRatio
```
```python
ratio = learningRatio()
print(ratio)
input()
```
0.8475765311660175
| | 编号 | 色泽 | 根蒂 | 敲声 | 纹理 | 脐部 | 触感 | 密度 | 含糖率 | 好瓜 |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 青绿 | 蜷缩 | 浊响 | 清晰 | 凹陷 | 硬滑 | 0.697 | 0.460 | |
| 1 | 2 | 乌黑 | 蜷缩 | 沉闷 | 清晰 | 凹陷 | 硬滑 | 0.774 | 0.376 | |
| 2 | 3 | 乌黑 | 蜷缩 | 浊响 | 清晰 | 凹陷 | 硬滑 | 0.634 | 0.264 | |
| 3 | 4 | 青绿 | 蜷缩 | 沉闷 | 清晰 | 凹陷 | 硬滑 | 0.608 | 0.318 | |
| 4 | 5 | 浅白 | 蜷缩 | 浊响 | 清晰 | 凹陷 | 硬滑 | 0.556 | 0.215 | |
| 5 | 6 | 青绿 | 稍蜷 | 浊响 | 清晰 | 稍凹 | 软粘 | 0.403 | 0.237 | |
| 6 | 7 | 乌黑 | 稍蜷 | 浊响 | 稍糊 | 稍凹 | 软粘 | 0.481 | 0.149 | |
| 7 | 8 | 乌黑 | 稍蜷 | 浊响 | 清晰 | 稍凹 | 硬滑 | 0.437 | 0.211 | |
| 8 | 9 | 乌黑 | 稍蜷 | 沉闷 | 稍糊 | 稍凹 | 硬滑 | 0.666 | 0.091 | |
| 9 | 10 | 青绿 | 硬挺 | 清脆 | 清晰 | 平坦 | 软粘 | 0.243 | 0.267 | |
| 10 | 11 | 浅白 | 硬挺 | 清脆 | 模糊 | 平坦 | 硬滑 | 0.245 | 0.057 | |
| 11 | 12 | 浅白 | 蜷缩 | 浊响 | 模糊 | 平坦 | 软粘 | 0.343 | 0.099 | |
| 12 | 13 | 青绿 | 稍蜷 | 浊响 | 稍糊 | 凹陷 | 硬滑 | 0.639 | 0.161 | |
| 13 | 14 | 浅白 | 稍蜷 | 沉闷 | 稍糊 | 凹陷 | 硬滑 | 0.657 | 0.198 | |
| 14 | 15 | 乌黑 | 稍蜷 | 浊响 | 清晰 | 稍凹 | 软粘 | 0.360 | 0.370 | |
| 15 | 16 | 浅白 | 蜷缩 | 浊响 | 模糊 | 平坦 | 硬滑 | 0.593 | 0.042 | |
| 16 | 17 | 青绿 | 蜷缩 | 沉闷 | 稍糊 | 稍凹 | 硬滑 | 0.719 | 0.103 | |
| 17 | 18 | 青绿 | 蜷缩 | 浊响 | 清晰 | 凹陷 | 硬滑 | 0.697 | 0.460 | NaN |
```python
# outputlayer() 函数: compute the output value Yk of the output layer
def outputlayer(df):
    """
    @param df: the pandas DataFrame
    @return Yk: the output
    """
    pass  # never finished; replaced by the class-based implementation below
```
```python
# too many loose parameters to keep track of -- painful
# define a class instead
# define the neural network structure: the skeleton of the whole algorithm
'''
the definition of the BP network class
'''
class BP_network:

    def __init__(self):
        '''initial variables'''
        # node number of each layer
        self.i_n = 0
        self.h_n = 0
        self.o_n = 0

        # output value of each layer
        self.i_v = []
        self.h_v = []
        self.o_v = []

        # parameters (w, t)
        self.ih_w = []   # weight of each link
        self.ho_w = []
        self.h_t  = []   # threshold of each neuron
        self.o_t  = []

        # alternative activation functions and their derivatives
        self.fun = {
            'Sigmoid': Sigmoid,                  # logistic function
            'SigmoidDerivate': SigmoidDerivate,
            'Tanh': Tanh,                        # hyperbolic tangent
            'TanhDerivate': TanhDerivate,
        }
```
```python
# CreateNN() 函数: fill in the network structure
def CreateNN(self, ni, nh, no, actfun):
    '''
    build a BP network structure and initialize the parameters
    @param ni, nh, no: the neuron number of each layer
    @param actfun: string, the name of the activation function
    '''
    # import module packages
    import numpy as np
    import random

    # assign the node number of each layer
    self.i_n = ni
    self.h_n = nh
    self.o_n = no

    # initial output value of each layer
    self.i_v = np.zeros(self.i_n)
    self.h_v = np.zeros(self.h_n)
    self.o_v = np.zeros(self.o_n)

    # initial weight of each link (random initialization in (0, 1))
    self.ih_w = np.zeros([self.i_n, self.h_n])
    self.ho_w = np.zeros([self.h_n, self.o_n])
    # assign the weights in a loop, calling the rand() helper defined below
    for i in range(self.i_n):
        for h in range(self.h_n):
            self.ih_w[i][h] = rand(0, 1)
    for h in range(self.h_n):
        for j in range(self.o_n):
            self.ho_w[h][j] = rand(0, 1)

    # initial threshold of each neuron
    self.h_t = np.zeros(self.h_n)
    self.o_t = np.zeros(self.o_n)
    for h in range(self.h_n):
        self.h_t[h] = rand(0, 1)
    for j in range(self.o_n):
        self.o_t[j] = rand(0, 1)

    # pick the activation function and its derivative by name
    # (can this be used without importing anything? yes: self.fun just stores
    #  references to the plain functions defined below)
    self.af  = self.fun[actfun]
    self.afd = self.fun[actfun + 'Derivate']
```
```python
# definition of the random-value helper
'''
the definition of random function
'''
def rand(a, b):
    '''
    random value generation for parameter initialization
    @param a, b: the lower and upper limits of the random value
    '''
    from random import random
    return (b - a) * random() + a
```
```python
# define the needed activation functions
'''
the definition of activation functions
'''
def Sigmoid(x):
    '''definition of the sigmoid function'''
    from math import exp
    return 1.0 / (1.0 + exp(-x))

def SigmoidDerivate(y):
    # derivative written in terms of the output y = Sigmoid(x)
    return y * (1 - y)

def Tanh(x):
    '''definition of the tanh function'''
    from math import tanh
    return tanh(x)

def TanhDerivate(y):
    # derivative written in terms of the output y = Tanh(x)
    return 1 - y * y
```
```python
# predict process through the network: compute one output
def Pred(self, x):
    '''
    @param x: the input array for the input layer
    '''
    # activate the input layer
    for i in range(self.i_n):
        self.i_v[i] = x[i]

    # activate the hidden layer
    for h in range(self.h_n):
        total = 0.0
        for i in range(self.i_n):
            total += self.i_v[i] * self.ih_w[i][h]
        self.h_v[h] = self.af(total - self.h_t[h])

    # activate the output layer
    for j in range(self.o_n):
        total = 0.0
        for h in range(self.h_n):
            total += self.h_v[h] * self.ho_w[h][j]
        self.o_v[j] = self.af(total - self.o_t[j])
```
**One more question: in what form should the watermelon data that has already been read in be fed to the network?
How should the discrete attributes of the watermelon dataset be handled, e.g. 色泽 {青绿, 乌黑, 浅白} = {0, 1, 2}??
And if not like this, how else can computation with discrete attributes be implemented?** (A possible encoding sketch follows.)
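
One possible answer, as a hedged sketch: turn each discrete attribute into numbers with pandas, either as integer codes or as one-hot columns, and keep 密度 / 含糖率 as they are. The column names assume the DataFrame returned by input() above; note that Categorical codes follow pandas' own category ordering, not necessarily {青绿, 乌黑, 浅白} = {0, 1, 2}.

```python
import pandas as pd

DISCRETE_COLS = ['色泽', '根蒂', '敲声', '纹理', '脐部', '触感']

def encode_watermelon(df):
    # ordinal encoding: each discrete value becomes an integer code
    ordinal = df.copy()
    for col in DISCRETE_COLS:
        ordinal[col] = pd.Categorical(ordinal[col]).codes

    # one-hot encoding: avoids imposing an artificial order on the values
    onehot = pd.get_dummies(df, columns=DISCRETE_COLS)
    return ordinal, onehot

# ordinal, onehot = encode_watermelon(input())
```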
```python
# the implementation of the BP algorithm on one single sample
# BackPropagate() 函数: back-propagation on one sample
def BackPropagate(self, x, y, lr):
    '''
    @param x, y: array, input and output of one data sample
    @param lr: float, the learning rate of the gradient-descent iteration
    '''
    # import needed module packages
    import numpy as np

    # get the current network output
    self.Pred(x)

    # gradient term of each output-layer neuron, cf. 西瓜书 section 5.3, eq. (5.10)
    o_grid = np.zeros(self.o_n)
    for j in range(self.o_n):
        # self.afd(self.o_v[j]) is exactly y_hat * (1 - y_hat)
        o_grid[j] = (y[j] - self.o_v[j]) * self.afd(self.o_v[j])

    # gradient term e_h of each hidden-layer neuron
    h_grid = np.zeros(self.h_n)
    for h in range(self.h_n):
        for j in range(self.o_n):
            h_grid[h] += self.ho_w[h][j] * o_grid[j]
        # self.afd(self.h_v[h]) is exactly b_h * (1 - b_h)
        h_grid[h] = h_grid[h] * self.afd(self.h_v[h])

    # update the parameters with the update formulas
    for h in range(self.h_n):
        for j in range(self.o_n):
            self.ho_w[h][j] += lr * o_grid[j] * self.h_v[h]
    for i in range(self.i_n):
        for h in range(self.h_n):
            self.ih_w[i][h] += lr * h_grid[h] * self.i_v[i]
    for j in range(self.o_n):
        self.o_t[j] -= lr * o_grid[j]
    for h in range(self.h_n):
        self.h_t[h] -= lr * h_grid[h]
```
```python
# TrainStandard() 函数: standard BP over the training set, also returns the accumulated error
def TrainStandard(self, data_in, data_out, lr=0.05):
    '''
    @param data_in : the network input data
    @param data_out: the target output of the output layer
    @param lr: learning rate, default 0.05
    @return e:   accumulated (mean) error on the training set
    @return e_k: error of each training step
    '''
    e_k = []
    for k in range(len(data_in)):
        x = data_in[k]
        y = data_out[k]
        self.BackPropagate(x, y, lr)

        # squared error on the training sample of this step
        y_delta2 = 0.0
        for j in range(self.o_n):
            y_delta2 += (self.o_v[j] - y[j]) * (self.o_v[j] - y[j])
        e_k.append(y_delta2 / 2)

    # total error of training: first compute the accumulated error, then minimize it
    e = sum(e_k) / len(e_k)
    return e, e_k
```
```python
# return the predicted label: good melon = 1, bad melon = 0
def PredLabel(self, X):
    '''
    predict process through the network
    @param X: the input sample set for the input layer
    @return: y, array, output 0/1 class labels based on "winner-takes-all"
             (i.e. competitive learning: the winner takes all)
    '''
    import numpy as np
    y = []
    for m in range(len(X)):
        self.Pred(X[m])
        if self.o_v[0] > 0.5:
            y.append(1)
        else:
            y.append(0)
        # multi-class variant (winner-takes-all over all output neurons):
        # max_y = self.o_v[0]
        # label = 0
        # for j in range(1, self.o_n):
        #     if max_y < self.o_v[j]: label = j
        # y.append(label)
    return np.array(y)
```
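
A hedged end-to-end sketch of how these pieces could be wired together, assuming CreateNN, Pred, BackPropagate, TrainStandard and PredLabel are pasted into the BP_network class as methods. The synthetic 2-D data, the hidden-layer size and the number of passes are my own choices; the encoded watermelon attributes would be used the same way.

```python
import numpy as np

nn = BP_network()
nn.CreateNN(ni=2, nh=5, no=1, actfun='Sigmoid')

# toy data: two inputs, one 0/1 label per sample
X = np.array([[0.1, 0.2], [0.9, 0.8], [0.2, 0.1], [0.8, 0.9]])
Y = np.array([[0.0], [1.0], [0.0], [1.0]])

for epoch in range(500):                 # standard BP: one pass per call
    e, e_k = nn.TrainStandard(X, Y, lr=0.1)

print('accumulated error:', e)
print('predicted labels :', nn.PredLabel(X))
```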
4.2 Implementing BP with TensorFlow

First, learn how to implement a BP network with these tools.

Modeling car fuel efficiency: a nonlinear regression. We build a feedforward neural network with multi-variable input and a single-variable output.

1. Dataset description and loading

This is a famous, standard dataset. It is a very simple example; the point is mainly to understand the main steps and methods.

Because the dataset comes standard and well packaged, no detailed data analysis is needed here.

Normally a dataset would be visualized and analysed in detail first.

2. Data preprocessing

Preprocessing is also usually done by directly calling functions from the sklearn package.

The preprocessing module in sklearn

sklearn.preprocessing.StandardScaler
# Standardize features by removing the mean and scaling to unit variance
scaler=preprocessing.StandardScaler()
X_train=scaler.fit_transform(X_train)
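
One detail worth making explicit: the scaler is fitted on the training data only and then reused unchanged on the test data (toy arrays here, just to show the two calls):

```python
import numpy as np
from sklearn import preprocessing

X_train = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])
X_test  = np.array([[2.5, 15.0]])

scaler = preprocessing.StandardScaler()
X_train = scaler.fit_transform(X_train)   # learn mean/variance from the training set only
X_test  = scaler.transform(X_test)        # reuse the same statistics on the test set
```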
At this stage I find that the hardest and least straightforward part of applying an algorithm is processing the data so that it meets the algorithm's input requirements. Usually it is the data that is processed to move towards the algorithm; is there a case where the algorithm is adapted towards the data instead, or is that really just the initial problem of choosing the right algorithm?
3. Model architecture

A feedforward neural network with multiple inputs, two hidden layers and a single output.

Seven input nodes, 10 neurons in the first hidden layer, 5 in the second hidden layer, and one output node.

This model is simple enough that it can also be built directly through TensorFlow's skflow (tf.contrib.learn) interface, which is worth learning; a sketch follows.
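
For reference, a sketch (from memory, unverified) of the skflow-style call in the old tf.contrib.learn module; the exact signatures changed between TensorFlow 1.x releases, so treat the argument names as assumptions. X_train, y_train, X_test and scaler are the variables from the complete code below.

```python
from tensorflow.contrib import learn

# infer one real-valued feature column per input attribute (7 here)
feature_columns = learn.infer_real_valued_columns_from_input(X_train)

# two hidden layers with 10 and 5 units, one linear output
regressor = learn.DNNRegressor(feature_columns=feature_columns, hidden_units=[10, 5])
regressor.fit(X_train, y_train, steps=1000)

predictions = regressor.predict(scaler.transform(X_test))
```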

4. Accuracy testing

Use the mean squared error to monitor accuracy.

Again, sklearn.metrics provides the performance measures for the model.

Doesn't this example need explicit parameter updates? The real work is the optimization of the loss function, which is not written out explicitly in this example.

score=metrics.mean_squared_error(regressor.predict(scaler.transform(X_test)),y_test)
print("Total mean squared error: {}".format(score))
The code above is now assembled into one complete pipeline.
Complete source code
from sklearn import datasets,cross_validation,metrics
from sklearn import preprocessing
from tensorflow.contrib import learn
import pandas as pd 
import matplotlib.pyplot as plt 
%matplotlib inline
%config InlineBackend.figure_format='svg'
from keras.models import Sequential
from keras.layers import Dense
read the original dataset with the pandas package
df=pd.read_csv('mpg.csv',header=0)
df
| | mpg | cylinders | displacement | horsepower | weight | acceleration | model_year | origin | name |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 18.0 | 8 | 307.0 | 130 | 3504 | 12.0 | 70 | 1 | chevrolet chevelle malibu |
| 1 | 15.0 | 8 | 350.0 | 165 | 3693 | 11.5 | 70 | 1 | buick skylark 320 |
| 2 | 18.0 | 8 | 318.0 | 150 | 3436 | 11.0 | 70 | 1 | plymouth satellite |
| 3 | 16.0 | 8 | 304.0 | 150 | 3433 | 12.0 | 70 | 1 | amc rebel sst |
| 4 | 17.0 | 8 | 302.0 | 140 | 3449 | 10.5 | 70 | 1 | ford torino |
| 5 | 15.0 | 8 | 429.0 | 198 | 4341 | 10.0 | 70 | 1 | ford galaxie 500 |
| 6 | 14.0 | 8 | 454.0 | 220 | 4354 | 9.0 | 70 | 1 | chevrolet impala |
| 7 | 14.0 | 8 | 440.0 | 215 | 4312 | 8.5 | 70 | 1 | plymouth fury iii |
| 8 | 14.0 | 8 | 455.0 | 225 | 4425 | 10.0 | 70 | 1 | pontiac catalina |
| 9 | 15.0 | 8 | 390.0 | 190 | 3850 | 8.5 | 70 | 1 | amc ambassador dpl |
| 10 | 15.0 | 8 | 383.0 | 170 | 3563 | 10.0 | 70 | 1 | dodge challenger se |
| 11 | 14.0 | 8 | 340.0 | 160 | 3609 | 8.0 | 70 | 1 | plymouth ‘cuda 340 |
| 12 | 15.0 | 8 | 400.0 | 150 | 3761 | 9.5 | 70 | 1 | chevrolet monte carlo |
| 13 | 14.0 | 8 | 455.0 | 225 | 3086 | 10.0 | 70 | 1 | buick estate wagon (sw) |
| 14 | 24.0 | 4 | 113.0 | 95 | 2372 | 15.0 | 70 | 3 | toyota corona mark ii |
| 15 | 22.0 | 6 | 198.0 | 95 | 2833 | 15.5 | 70 | 1 | plymouth duster |
| 16 | 18.0 | 6 | 199.0 | 97 | 2774 | 15.5 | 70 | 1 | amc hornet |
| 17 | 21.0 | 6 | 200.0 | 85 | 2587 | 16.0 | 70 | 1 | ford maverick |
| 18 | 27.0 | 4 | 97.0 | 88 | 2130 | 14.5 | 70 | 3 | datsun pl510 |
| 19 | 26.0 | 4 | 97.0 | 46 | 1835 | 20.5 | 70 | 2 | volkswagen 1131 deluxe sedan |
| 20 | 25.0 | 4 | 110.0 | 87 | 2672 | 17.5 | 70 | 2 | peugeot 504 |
| 21 | 24.0 | 4 | 107.0 | 90 | 2430 | 14.5 | 70 | 2 | audi 100 ls |
| 22 | 25.0 | 4 | 104.0 | 95 | 2375 | 17.5 | 70 | 2 | saab 99e |
| 23 | 26.0 | 4 | 121.0 | 113 | 2234 | 12.5 | 70 | 2 | bmw 2002 |
| 24 | 21.0 | 6 | 199.0 | 90 | 2648 | 15.0 | 70 | 1 | amc gremlin |
| 25 | 10.0 | 8 | 360.0 | 215 | 4615 | 14.0 | 70 | 1 | ford f250 |
| 26 | 10.0 | 8 | 307.0 | 200 | 4376 | 15.0 | 70 | 1 | chevy c20 |
| 27 | 11.0 | 8 | 318.0 | 210 | 4382 | 13.5 | 70 | 1 | dodge d200 |
| 28 | 9.0 | 8 | 304.0 | 193 | 4732 | 18.5 | 70 | 1 | hi 1200d |
| 29 | 27.0 | 4 | 97.0 | 88 | 2130 | 14.5 | 71 | 3 | datsun pl510 |
| … | … | … | … | … | … | … | … | … | … |
| 368 | 27.0 | 4 | 112.0 | 88 | 2640 | 18.6 | 82 | 1 | chevrolet cavalier wagon |
| 369 | 34.0 | 4 | 112.0 | 88 | 2395 | 18.0 | 82 | 1 | chevrolet cavalier 2-door |
| 370 | 31.0 | 4 | 112.0 | 85 | 2575 | 16.2 | 82 | 1 | pontiac j2000 se hatchback |
| 371 | 29.0 | 4 | 135.0 | 84 | 2525 | 16.0 | 82 | 1 | dodge aries se |
| 372 | 27.0 | 4 | 151.0 | 90 | 2735 | 18.0 | 82 | 1 | pontiac phoenix |
| 373 | 24.0 | 4 | 140.0 | 92 | 2865 | 16.4 | 82 | 1 | ford fairmont futura |
| 374 | 23.0 | 4 | 151.0 | 0 | 3035 | 20.5 | 82 | 1 | amc concord dl |
| 375 | 36.0 | 4 | 105.0 | 74 | 1980 | 15.3 | 82 | 2 | volkswagen rabbit l |
| 376 | 37.0 | 4 | 91.0 | 68 | 2025 | 18.2 | 82 | 3 | mazda glc custom l |
| 377 | 31.0 | 4 | 91.0 | 68 | 1970 | 17.6 | 82 | 3 | mazda glc custom |
| 378 | 38.0 | 4 | 105.0 | 63 | 2125 | 14.7 | 82 | 1 | plymouth horizon miser |
| 379 | 36.0 | 4 | 98.0 | 70 | 2125 | 17.3 | 82 | 1 | mercury lynx l |
| 380 | 36.0 | 4 | 120.0 | 88 | 2160 | 14.5 | 82 | 3 | nissan stanza xe |
| 381 | 36.0 | 4 | 107.0 | 75 | 2205 | 14.5 | 82 | 3 | honda accord |
| 382 | 34.0 | 4 | 108.0 | 70 | 2245 | 16.9 | 82 | 3 | toyota corolla |
| 383 | 38.0 | 4 | 91.0 | 67 | 1965 | 15.0 | 82 | 3 | honda civic |
| 384 | 32.0 | 4 | 91.0 | 67 | 1965 | 15.7 | 82 | 3 | honda civic (auto) |
| 385 | 38.0 | 4 | 91.0 | 67 | 1995 | 16.2 | 82 | 3 | datsun 310 gx |
| 386 | 25.0 | 6 | 181.0 | 110 | 2945 | 16.4 | 82 | 1 | buick century limited |
| 387 | 38.0 | 6 | 262.0 | 85 | 3015 | 17.0 | 82 | 1 | oldsmobile cutlass ciera (diesel) |
| 388 | 26.0 | 4 | 156.0 | 92 | 2585 | 14.5 | 82 | 1 | chrysler lebaron medallion |
| 389 | 22.0 | 6 | 232.0 | 112 | 2835 | 14.7 | 82 | 1 | ford granada l |
| 390 | 32.0 | 4 | 144.0 | 96 | 2665 | 13.9 | 82 | 3 | toyota celica gt |
| 391 | 36.0 | 4 | 135.0 | 84 | 2370 | 13.0 | 82 | 1 | dodge charger 2.2 |
| 392 | 27.0 | 4 | 151.0 | 90 | 2950 | 17.3 | 82 | 1 | chevrolet camaro |
| 393 | 27.0 | 4 | 140.0 | 86 | 2790 | 15.6 | 82 | 1 | ford mustang gl |
| 394 | 44.0 | 4 | 97.0 | 52 | 2130 | 24.6 | 82 | 2 | vw pickup |
| 395 | 32.0 | 4 | 135.0 | 84 | 2295 | 11.6 | 82 | 1 | dodge rampage |
| 396 | 28.0 | 4 | 120.0 | 79 | 2625 | 18.6 | 82 | 1 | ford ranger |
| 397 | 31.0 | 4 | 119.0 | 82 | 2720 | 19.4 | 82 | 1 | chevy s-10 |

398 rows × 9 columns

# convert the displacement column as float
df['displacement']=df['displacement'].astype(float)
# we got the data columns from the dataset
# first and last (mpg and car names )are ignored for X
X=df[df.columns[1:8]]
y=df['mpg']
plt.figure()
f,ax1=plt.subplots()
for i in range(1, 8):
    number = 420 + i
    ax1.locator_params(nbins=3)
    ax1 = plt.subplot(number)            # 4 rows x 2 columns
    plt.title(list(df)[i])
    ax1.scatter(df[df.columns[i]], y)    # plot a scatter draw of the data points
plt.tight_layout(pad=0.4,w_pad=0.5,h_pad=1.0)
plt.show()
(figure: a 4 x 2 grid of scatter plots of each attribute against mpg)
# split the datasets
X_train,X_test,y_train,y_test=cross_validation.train_test_split(X,y,test_size=0.25)
# Scale the data for convergency optimization
scaler=preprocessing.StandardScaler()
# set the transform parameters
X_train=scaler.fit_transform(X_train)
# bulid a 2 layer fully connected DNN with 10 and 5 units respectively
model=Sequential()
model.add(Dense(10,input_dim=7,init='normal',activation='relu'))
model.add(Dense(5,init='normal',activation='relu'))
model.add(Dense(1,init='normal'))
# compile the model ,with the mean squared error as lost function
model.compile(loss='mean_squared_error',optimizer='adam')
# fit the model in 1000 epochs
model.fit(X_train,y_train,nb_epoch=1000,validation_split=0.33,shuffle=True,verbose=2)
/home/dengshuo/anaconda3/lib/python3.6/site-packages/ipykernel_launcher.py:9: UserWarning: Update your `Dense` call to the Keras 2 API: `Dense(10, input_dim=7, activation="relu", kernel_initializer="normal")`
/home/dengshuo/anaconda3/lib/python3.6/site-packages/ipykernel_launcher.py:10: UserWarning: Update your `Dense` call to the Keras 2 API: `Dense(5, activation="relu", kernel_initializer="normal")`
/home/dengshuo/anaconda3/lib/python3.6/site-packages/ipykernel_launcher.py:11: UserWarning: Update your `Dense` call to the Keras 2 API: `Dense(1, kernel_initializer="normal")`
/home/dengshuo/anaconda3/lib/python3.6/site-packages/keras/models.py:942: UserWarning: The `nb_epoch` argument in `fit` has been renamed `epochs`.

Train on 199 samples, validate on 99 samples
Epoch 1/1000 - 2s - loss: 617.0525 - val_loss: 609.8485
Epoch 2/1000 - 0s - loss: 616.6131 - val_loss: 609.3912
Epoch 3/1000 - 0s - loss: 616.1424 - val_loss: 608.8852
Epoch 4/1000 - 0s - loss: 615.6107 - val_loss: 608.3354
Epoch 5/1000 - 0s - loss: 615.0266 - val_loss: 607.7320
(epochs 6-843 omitted: the training loss falls steadily from ~615 to ~6.9 and the validation loss from ~608 to ~8.4)
Epoch 844/1000 - 0s - loss: 6.8973 - val_loss: 8.4884
Epoch 845/1000 - 0s - loss: 6.8797 - val_loss: 8.4416
Epoch 846/1000 - 0s - loss: 6.8801 - val_loss: 8.4058
(the captured output ends at epoch 846)
Epoch 847/1000- 0s - loss: 6.8972 - val_loss: 8.2906
Epoch 848/1000- 0s - loss: 6.9202 - val_loss: 8.2737
Epoch 849/1000- 0s - loss: 6.9373 - val_loss: 8.3648
Epoch 850/1000- 0s - loss: 6.8793 - val_loss: 8.4798
Epoch 851/1000- 0s - loss: 6.8842 - val_loss: 8.5027
Epoch 852/1000- 0s - loss: 6.8715 - val_loss: 8.4512
Epoch 853/1000- 0s - loss: 6.8770 - val_loss: 8.4394
Epoch 854/1000- 0s - loss: 6.8744 - val_loss: 8.4472
Epoch 855/1000- 0s - loss: 6.8755 - val_loss: 8.4393
Epoch 856/1000- 0s - loss: 6.8792 - val_loss: 8.3980
Epoch 857/1000- 0s - loss: 6.8804 - val_loss: 8.4114
Epoch 858/1000- 0s - loss: 6.8524 - val_loss: 8.5629
Epoch 859/1000- 0s - loss: 6.8745 - val_loss: 8.6568
Epoch 860/1000- 0s - loss: 6.8859 - val_loss: 8.5767
Epoch 861/1000- 0s - loss: 6.8793 - val_loss: 8.3959
Epoch 862/1000- 0s - loss: 6.8933 - val_loss: 8.3341
Epoch 863/1000- 0s - loss: 6.9167 - val_loss: 8.3749
Epoch 864/1000- 0s - loss: 6.8659 - val_loss: 8.5399
Epoch 865/1000- 0s - loss: 6.8788 - val_loss: 8.5804
Epoch 866/1000- 0s - loss: 6.8750 - val_loss: 8.5788
Epoch 867/1000- 0s - loss: 6.8661 - val_loss: 8.5265
Epoch 868/1000- 0s - loss: 6.8834 - val_loss: 8.4946
Epoch 869/1000- 0s - loss: 6.8712 - val_loss: 8.5024
Epoch 870/1000- 0s - loss: 6.8632 - val_loss: 8.4812
Epoch 871/1000- 0s - loss: 6.8693 - val_loss: 8.4996
Epoch 872/1000- 0s - loss: 6.8648 - val_loss: 8.4457
Epoch 873/1000- 0s - loss: 6.8741 - val_loss: 8.3863
Epoch 874/1000- 0s - loss: 6.9042 - val_loss: 8.3653
Epoch 875/1000- 0s - loss: 6.8735 - val_loss: 8.4483
Epoch 876/1000- 0s - loss: 6.8764 - val_loss: 8.5491
Epoch 877/1000- 0s - loss: 6.8698 - val_loss: 8.5771
Epoch 878/1000- 0s - loss: 6.8601 - val_loss: 8.5636
Epoch 879/1000- 0s - loss: 6.8552 - val_loss: 8.5683
Epoch 880/1000- 0s - loss: 6.8534 - val_loss: 8.5752
Epoch 881/1000- 0s - loss: 6.8544 - val_loss: 8.5957
Epoch 882/1000- 0s - loss: 6.8548 - val_loss: 8.5939
Epoch 883/1000- 0s - loss: 6.8577 - val_loss: 8.5866
Epoch 884/1000- 0s - loss: 6.8773 - val_loss: 8.5952
Epoch 885/1000- 0s - loss: 6.8756 - val_loss: 8.5630
Epoch 886/1000- 0s - loss: 6.8668 - val_loss: 8.4512
Epoch 887/1000- 0s - loss: 6.8745 - val_loss: 8.4540
Epoch 888/1000- 0s - loss: 6.8641 - val_loss: 8.4412
Epoch 889/1000- 0s - loss: 6.8782 - val_loss: 8.5320
Epoch 890/1000- 0s - loss: 6.8415 - val_loss: 8.5606
Epoch 891/1000- 0s - loss: 6.8534 - val_loss: 8.5682
Epoch 892/1000- 0s - loss: 6.8858 - val_loss: 8.4739
Epoch 893/1000- 0s - loss: 6.8534 - val_loss: 8.4575
Epoch 894/1000- 0s - loss: 6.8581 - val_loss: 8.4104
Epoch 895/1000- 0s - loss: 6.8834 - val_loss: 8.4251
Epoch 896/1000- 0s - loss: 6.8710 - val_loss: 8.4780
Epoch 897/1000- 0s - loss: 6.8870 - val_loss: 8.4867
Epoch 898/1000- 0s - loss: 6.8274 - val_loss: 8.6073
Epoch 899/1000- 0s - loss: 6.8772 - val_loss: 8.7308
Epoch 900/1000- 0s - loss: 6.8722 - val_loss: 8.6132
Epoch 901/1000- 0s - loss: 6.8604 - val_loss: 8.6610
Epoch 902/1000- 0s - loss: 6.8541 - val_loss: 8.6173
Epoch 903/1000- 0s - loss: 6.8730 - val_loss: 8.5093
Epoch 904/1000- 0s - loss: 6.8426 - val_loss: 8.5159
Epoch 905/1000- 0s - loss: 6.8429 - val_loss: 8.5200
Epoch 906/1000- 0s - loss: 6.8439 - val_loss: 8.5554
Epoch 907/1000- 0s - loss: 6.8537 - val_loss: 8.7608
Epoch 908/1000- 0s - loss: 6.8801 - val_loss: 8.8564
Epoch 909/1000- 0s - loss: 6.9187 - val_loss: 8.8065
Epoch 910/1000- 0s - loss: 6.8853 - val_loss: 8.7571
Epoch 911/1000- 0s - loss: 6.8544 - val_loss: 8.6461
Epoch 912/1000- 0s - loss: 6.8342 - val_loss: 8.5683
Epoch 913/1000- 0s - loss: 6.8823 - val_loss: 8.4780
Epoch 914/1000- 0s - loss: 6.8524 - val_loss: 8.5182
Epoch 915/1000- 0s - loss: 6.8370 - val_loss: 8.5544
Epoch 916/1000- 0s - loss: 6.8490 - val_loss: 8.5250
Epoch 917/1000- 0s - loss: 6.8746 - val_loss: 8.6496
Epoch 918/1000- 0s - loss: 6.8511 - val_loss: 8.6746
Epoch 919/1000- 0s - loss: 6.8391 - val_loss: 8.6202
Epoch 920/1000- 0s - loss: 6.8378 - val_loss: 8.5823
Epoch 921/1000- 0s - loss: 6.8284 - val_loss: 8.6033
Epoch 922/1000- 0s - loss: 6.8513 - val_loss: 8.4982
Epoch 923/1000- 0s - loss: 6.8424 - val_loss: 8.4665
Epoch 924/1000- 0s - loss: 6.8490 - val_loss: 8.5250
Epoch 925/1000- 0s - loss: 6.8479 - val_loss: 8.5245
Epoch 926/1000- 0s - loss: 6.8417 - val_loss: 8.4306
Epoch 927/1000- 0s - loss: 6.8274 - val_loss: 8.4696
Epoch 928/1000- 0s - loss: 6.8407 - val_loss: 8.4810
Epoch 929/1000- 0s - loss: 6.8413 - val_loss: 8.4988
Epoch 930/1000- 0s - loss: 6.8362 - val_loss: 8.5352
Epoch 931/1000- 0s - loss: 6.8365 - val_loss: 8.6174
Epoch 932/1000- 0s - loss: 6.8309 - val_loss: 8.6056
Epoch 933/1000- 0s - loss: 6.8295 - val_loss: 8.5836
Epoch 934/1000- 0s - loss: 6.8431 - val_loss: 8.6144
Epoch 935/1000- 0s - loss: 6.8263 - val_loss: 8.5581
Epoch 936/1000- 0s - loss: 6.8532 - val_loss: 8.5515
Epoch 937/1000- 0s - loss: 6.8281 - val_loss: 8.5130
Epoch 938/1000- 0s - loss: 6.8655 - val_loss: 8.4709
Epoch 939/1000- 0s - loss: 6.8737 - val_loss: 8.5025
Epoch 940/1000- 0s - loss: 6.8258 - val_loss: 8.4765
Epoch 941/1000- 0s - loss: 6.8172 - val_loss: 8.4921
Epoch 942/1000- 0s - loss: 6.8489 - val_loss: 8.6057
Epoch 943/1000- 0s - loss: 6.8361 - val_loss: 8.5947
Epoch 944/1000- 0s - loss: 6.8388 - val_loss: 8.5395
Epoch 945/1000- 0s - loss: 6.8118 - val_loss: 8.5427
Epoch 946/1000- 0s - loss: 6.8248 - val_loss: 8.5310
Epoch 947/1000- 0s - loss: 6.8355 - val_loss: 8.5500
Epoch 948/1000- 0s - loss: 6.8282 - val_loss: 8.5621
Epoch 949/1000- 0s - loss: 6.8307 - val_loss: 8.6018
Epoch 950/1000- 0s - loss: 6.8149 - val_loss: 8.6919
Epoch 951/1000- 0s - loss: 6.8535 - val_loss: 8.8221
Epoch 952/1000- 0s - loss: 6.7969 - val_loss: 8.6478
Epoch 953/1000- 0s - loss: 6.8059 - val_loss: 8.5851
Epoch 954/1000- 0s - loss: 6.8304 - val_loss: 8.5123
Epoch 955/1000- 0s - loss: 6.8407 - val_loss: 8.5116
Epoch 956/1000- 0s - loss: 6.8188 - val_loss: 8.5680
Epoch 957/1000- 0s - loss: 6.8065 - val_loss: 8.6502
Epoch 958/1000- 0s - loss: 6.8422 - val_loss: 8.6930
Epoch 959/1000- 0s - loss: 6.8171 - val_loss: 8.5316
Epoch 960/1000- 0s - loss: 6.8234 - val_loss: 8.3590
Epoch 961/1000- 0s - loss: 6.8595 - val_loss: 8.3428
Epoch 962/1000- 0s - loss: 6.8903 - val_loss: 8.3553
Epoch 963/1000- 0s - loss: 6.8414 - val_loss: 8.4878
Epoch 96
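上面这种"Epoch x/1000 - 0s - loss: ... - val_loss: ..."的逐轮输出,是 Keras 中 model.fit(..., verbose=2) 的日志格式。下面给出一个最小示例草稿,演示这类日志是如何产生的:这里假设是一个用均方误差训练的单隐层回归网络,其中 X_train、y_train、X_val、y_val、特征维数 8、隐层 16 个神经元等都是为了说明而假设的占位,并非原文作者的实际模型和数据;import 采用独立版 Keras,若使用 TensorFlow 自带的 keras,可改为 from tensorflow.keras 导入。

# 一个最小示例草稿(假设):用 Keras 训练单隐层回归网络,verbose=2 会产生上面这种逐轮日志
# X_train / y_train / X_val / y_val、特征维数、隐层大小等均为假设的占位,并非原文数据
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import EarlyStopping

# 构造占位数据:100 个训练样本、30 个验证样本,每个样本 8 个特征,目标为 1 维实值
X_train, y_train = np.random.rand(100, 8), np.random.rand(100)
X_val, y_val = np.random.rand(30, 8), np.random.rand(30)

# 单隐层前馈网络:隐层用 relu 激活,输出层为线性,损失取均方误差
model = Sequential()
model.add(Dense(16, activation='relu', input_dim=8))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')

# 当 val_loss 连续 50 轮没有改善时提前停止训练,并回退到验证误差最小时的权重
early_stop = EarlyStopping(monitor='val_loss', patience=50, restore_best_weights=True)

# verbose=2 每轮打印一行 "loss: ... - val_loss: ...",即上面日志的格式
model.fit(X_train, y_train,
          validation_data=(X_val, y_val),
          epochs=1000,
          verbose=2,
          callbacks=[early_stop])

对于上面日志中 val_loss 长期徘徊在 8.3~8.9、不再下降的情况,加入 EarlyStopping 这类回调可以在验证误差停止改善后及时结束训练,省去后面几百轮几乎无效的迭代。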