
NumPy Softmax - Python

Introduction

In deep learning, the softmax function is commonly used as the final step in multiclass classification: it converts a vector of raw scores (logits) into a probability distribution over the classes. In this article, we will discuss how to implement the softmax function using NumPy.
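Concretely, for an input vector x with components x_1, ..., x_n, each output component is defined as

    softmax(x)_i = exp(x_i) / (exp(x_1) + exp(x_2) + ... + exp(x_n))

so every output lies between 0 and 1, and the outputs sum to 1.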

NumPy Softmax Implementation

Here is the implementation of softmax function using NumPy:

import numpy as np

def softmax(x):
    # subtract the row-wise maximum before exponentiating;
    # softmax is unchanged by shifting its inputs, and the
    # subtraction prevents overflow in np.exp for large values
    shifted_x = x - np.max(x, axis=1, keepdims=True)
    exp_x = np.exp(shifted_x)
    # sum the exponentials over the second dimension (axis=1)
    sum_exp_x = np.sum(exp_x, axis=1, keepdims=True)
    # normalize so that each row sums to 1
    return exp_x / sum_exp_x

The function first subtracts each row's maximum from that row (a standard trick: softmax is invariant to shifting its inputs, but the subtraction keeps np.exp from overflowing), then exponentiates every element, sums the exponentials along axis=1, and divides to normalize each row into a probability distribution. Note that this implementation expects a 2-D array with one sample per row.
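As a quick sanity check of the stability trick (a minimal sketch, assuming the softmax function defined above is in scope), inputs large enough to overflow a naive np.exp still produce finite probabilities:

import numpy as np

# logits this large would make a naive np.exp overflow to inf,
# but the max-subtraction reduces them to [0, 1, 2] internally
big = np.array([[1000.0, 1001.0, 1002.0]])
print(softmax(big))
# [[0.09003057 0.24472847 0.66524096]]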

Usage

To use the softmax function, pass it a 2-D NumPy array with one sample per row, as shown below:

# input array (one sample per row)
x = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# softmax of the input array
softmax_x = softmax(x)

# print the softmax output
print(softmax_x)

The above code prints the following array:

[[0.09003057 0.24472847 0.66524096]
 [0.09003057 0.24472847 0.66524096]
 [0.09003057 0.24472847 0.66524096]]
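
Note that every row of the result is identical: the rows of x differ only by a constant shift, and softmax is invariant to shifting its inputs. You can also verify that each row is a valid probability distribution (a quick check, reusing the softmax_x array from above):

print(softmax_x.sum(axis=1))
# [1. 1. 1.]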

Conclusion

In this article, we discussed how to implement the softmax function using NumPy. Softmax is widely used in multiclass classification because it turns raw scores into a probability distribution over the classes, and with NumPy it can be implemented in just a few lines, with a one-line shift for numerical stability.