What's wrong with my implementation of the cross entropy function?
machine-learning
neural-network
python

I am learning about neural networks and I want to write a function cross_entropy in Python. It is defined as

L = -\frac{1}{N} \sum_{i=1}^{N} \sum_{j=1}^{k} t_{i,j} \log(p_{i,j})

where N is the number of samples, k is the number of classes, log is the natural logarithm, t_{i,j} is 1 if sample i is in class j and 0 otherwise, and p_{i,j} is the predicted probability that sample i is in class j. To avoid numerical issues with the logarithm, clip the predictions to the range [10^{-12}, 1 - 10^{-12}].

Based on the description above, I wrote the code below by clipping the predictions to the range [epsilon, 1 - epsilon] and then computing cross_entropy according to the formula.

import numpy as np

def cross_entropy(predictions, targets, epsilon=1e-12):
    """
    Computes cross entropy between targets (encoded as one-hot vectors)
    and predictions. 
    Input: predictions (N, k) ndarray
           targets (N, k) ndarray        
    Returns: scalar
    """
    predictions = np.clip(predictions, epsilon, 1. - epsilon)
    ce = - np.mean(np.log(predictions) * targets) 
    return ce

The following code will be used to check whether the function cross_entropy is correct.

predictions = np.array([[0.25,0.25,0.25,0.25],
                        [0.01,0.01,0.01,0.96]])
targets = np.array([[0,0,0,1],
                   [0,0,0,1]])
ans = 0.71355817782  #Correct answer
x = cross_entropy(predictions, targets)
print(np.isclose(x,ans))

The output of the above code is False, which means that my code defining cross_entropy is not correct. I then printed cross_entropy(predictions, targets): it gives 0.178389544455, while the correct result should be ans = 0.71355817782. Could someone help me check what is wrong with my code?
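For reference, ans can be verified by hand: both samples have their target in the last class (index 3), so the formula above reduces to

ans = -\frac{\ln(0.25) + \ln(0.96)}{2} \approx \frac{1.38629 + 0.04082}{2} \approx 0.71356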

2 Answers

You are not far off at all, but remember that you are taking the average of N sums, where N = 2 in this case. np.mean divides by the total number of entries, N * k = 8, whereas the formula sums over the k classes and divides only by N = 2; that is why your result 0.178389544455 is exactly the correct answer divided by k = 4. So your code could read:

import numpy as np

def cross_entropy(predictions, targets, epsilon=1e-12):
    """
    Computes cross entropy between targets (encoded as one-hot vectors)
    and predictions. 
    Input: predictions (N, k) ndarray
           targets (N, k) ndarray        
    Returns: scalar
    """
    predictions = np.clip(predictions, epsilon, 1. - epsilon)
    N = predictions.shape[0]
    ce = -np.sum(targets*np.log(predictions+1e-9))/N
    return ce

predictions = np.array([[0.25,0.25,0.25,0.25],
                        [0.01,0.01,0.01,0.96]])
targets = np.array([[0,0,0,1],
                   [0,0,0,1]])
ans = 0.71355817782  #Correct answer
x = cross_entropy(predictions, targets)
print(np.isclose(x,ans))

Here I think it is a little clearer if you stick with np.sum(). Also, I added 1e-9 inside np.log() to avoid the possibility of log(0) in the computation. Hope this helps!

Note: as per @Peter's comment, the 1e-9 offset is indeed redundant as long as your epsilon value is greater than 0.
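For what it's worth, the np.mean version from the question can also be kept, as long as the sum over the k classes is taken first so that the mean runs only over the N samples; a minimal sketch using the data from the question:

import numpy as np

predictions = np.array([[0.25, 0.25, 0.25, 0.25],
                        [0.01, 0.01, 0.01, 0.96]])
targets = np.array([[0, 0, 0, 1],
                    [0, 0, 0, 1]])

# Clip as in the question, sum over the k classes (axis=1) first,
# then take the mean over the N samples only.
predictions = np.clip(predictions, 1e-12, 1. - 1e-12)
ce = -np.mean(np.sum(targets * np.log(predictions), axis=1))
print(np.isclose(ce, 0.71355817782))  # True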

This more general version computes the cross entropy between two arbitrary discrete distributions (rather than one-hot targets) and checks the result against scipy:

import numpy as np
from scipy.stats import entropy, truncnorm

def cross_entropy(x, y):
    """ Computes cross entropy between two distributions.
    Input: x: iterable of N non-negative values
           y: iterable of N non-negative values
    Returns: scalar
    """
    x = np.array(x, dtype=float)
    y = np.array(y, dtype=float)

    if np.any(x < 0) or np.any(y < 0):
        raise ValueError('Negative values exist.')

    # Force to proper probability mass functions.
    x /= np.sum(x)
    y /= np.sum(y)

    # Ignore zero 'y' elements, since log(0) is undefined.
    mask = y > 0
    x = x[mask]
    y = y[mask]
    ce = -np.sum(x * np.log(y))
    return ce

def cross_entropy_via_scipy(x, y):
    ''' SEE: https://en.wikipedia.org/wiki/Cross_entropy '''
    # Cross entropy H(x, y) equals the entropy H(x) plus the
    # KL divergence D(x || y); scipy's entropy() computes both.
    return entropy(x) + entropy(x, y)

x = truncnorm.rvs(0.1, 2, size=100)
y = truncnorm.rvs(0.1, 2, size=100)
print(np.isclose(cross_entropy(x, y), cross_entropy_via_scipy(x, y)))
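As an aside, for one-hot targets like those in the question, the per-sample cross entropy reduces to -log of the predicted probability of the true class; a quick sketch confirming the reference value:

import numpy as np

predictions = np.array([[0.25, 0.25, 0.25, 0.25],
                        [0.01, 0.01, 0.01, 0.96]])
true_class = np.array([3, 3])  # column index of the 1 in each one-hot row

# For one-hot targets the inner sum keeps a single term per sample,
# so the loss is just the mean of -log(p[true class]).
ce = -np.mean(np.log(predictions[np.arange(len(true_class)), true_class]))
print(np.isclose(ce, 0.71355817782))  # True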