16 - Loss Computation


Series: TensorFlow入门笔记 (TensorFlow Beginner Notes)

1. MSE (Mean Squared Error)

MSE is the expectation of the squared difference between the true values and the predicted (estimated) values; the larger the result, the worse the prediction.
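Written out, with n the number of elements, the definition and its equivalent L2-norm form are:

```latex
\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2
             = \frac{\lVert y - \hat{y} \rVert_2^2}{n}
```

The code below computes the same quantity in three ways using both forms.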

import tensorflow as tf

y = tf.constant([1, 2, 3, 0, 2])
y = tf.one_hot(y, depth=4)
y = tf.cast(y, dtype=tf.float32)

out = tf.random.normal([5, 4])
# Standard definition of MSE: mean of the squared differences
loss1 = tf.reduce_mean(tf.square(y - out))
# Equivalent form via the L2 norm: squared norm divided by the number of elements
loss2 = tf.square(tf.norm(y - out)) / (5 * 4)
# Directly call the MSE function from tf.losses (per-sample MSE, then averaged)
loss3 = tf.reduce_mean(tf.losses.MSE(y, out))

print(loss1)
print(loss2)
print(loss3)
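As a sanity check outside TensorFlow, the equivalence of the mean-of-squares form and the squared-L2-norm form can be verified in plain Python (the function names here are illustrative, not part of any library):

```python
import math

def mse(y_true, y_pred):
    """Mean squared error: average of squared differences."""
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n

def mse_via_norm(y_true, y_pred):
    """Equivalent form: squared L2 norm of the residual, divided by n."""
    n = len(y_true)
    l2 = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)))
    return l2 ** 2 / n

y_true = [1.0, 0.0, 0.0, 1.0]
y_pred = [0.9, 0.2, 0.1, 0.8]
print(mse(y_true, y_pred))           # ≈ 0.025
print(mse_via_norm(y_true, y_pred))  # same value, up to floating-point error
```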

2. Cross Entropy Loss

Shannon used the concept of information entropy to describe the uncertainty of an information source.
In general, which symbol a source will emit is uncertain, and this uncertainty can be measured by the probability of each symbol's occurrence:
a high-probability symbol appears often and carries little uncertainty; a low-probability one carries more. The greater a variable's uncertainty, the greater its entropy, and the more information is needed to resolve it.

Cross entropy is an important concept in Shannon's information theory, used mainly to measure the difference between two probability distributions.
The smaller the cross entropy, the closer the predicted distribution is to the true one, and the better the model's predictions.
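In symbols (base-2 logarithms give units of bits), the entropy of a distribution p and the cross entropy between a true distribution p and a predicted distribution q are:

```latex
H(p) = -\sum_{i} p_i \log_2 p_i, \qquad
H(p, q) = -\sum_{i} p_i \log_2 q_i
```

Note that TensorFlow's `categorical_crossentropy` uses the natural logarithm rather than base 2; the two differ only by a constant factor.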

The snippets below compute the entropy of three distributions (the original code overwrote `a` with its log-transformed values before applying the entropy formula, which applies the log twice and produces NaN; fixed here, and the result is entropy rather than cross entropy):

a = tf.fill([4], 0.25)
# Entropy of a uniform distribution over 4 outcomes: -sum(p * log2(p)) = 2 bits
entropy = -tf.reduce_sum(a * tf.math.log(a) / tf.math.log(2.))
print(entropy)

# A more peaked distribution has lower entropy
a = tf.constant([0.1, 0.1, 0.1, 0.7])
entropy = -tf.reduce_sum(a * tf.math.log(a) / tf.math.log(2.))
print(entropy)

# A nearly deterministic distribution has entropy close to 0
a = tf.constant([0.01, 0.01, 0.01, 0.97])
entropy = -tf.reduce_sum(a * tf.math.log(a) / tf.math.log(2.))
print(entropy)

Computing the cross entropy for a multi-class classification problem

loss1 = tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.25, 0.25, 0.25, 0.25])
loss2 = tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.1, 0.1, 0.7, 0.1])
loss3 = tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.1, 0.7, 0.1, 0.1])
loss4 = tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.01, 0.97, 0.01, 0.01])
print(loss1)
print(loss2)
print(loss3)
print(loss4)
# The class form is functionally equivalent to tf.losses.categorical_crossentropy
criteon = tf.losses.CategoricalCrossentropy()
loss5 = criteon([0, 1, 0, 0], [0.01, 0.97, 0.01, 0.01])
print(loss5)
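For a one-hot label, the sum in the cross entropy collapses to a single term, so the loss is just the negative log of the predicted probability at the true class. A minimal plain-Python sketch (the function name is illustrative), using the natural log as TensorFlow does:

```python
import math

def categorical_crossentropy(y_true, y_pred):
    """Cross entropy -sum(p * ln(q)); with a one-hot label this
    reduces to -ln(q) at the true class."""
    return -sum(t * math.log(p) for t, p in zip(y_true, y_pred))

label = [0, 1, 0, 0]
print(categorical_crossentropy(label, [0.25, 0.25, 0.25, 0.25]))  # ≈ 1.386, i.e. -ln(0.25)
print(categorical_crossentropy(label, [0.01, 0.97, 0.01, 0.01]))  # ≈ 0.030, i.e. -ln(0.97)
```

The loss shrinks as the predicted probability mass concentrates on the correct class, matching the trend from loss1 through loss4 above.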
Author: 开发者雷