Given an array, normalize each column (scale it to [0, 1]).

Method 1
import numpy as np

x = np.array([[1000, 10, 0.5],
              [765, 5, 0.35],
              [800, 7, 0.09]])
x_normed = x / x.max(axis=0)
print(x_normed)
# [[1.    1.    1.  ]
#  [0.765 0.5   0.7 ]
#  [0.8   0.7   0.18]]
x.max(axis=0) takes the maximum along axis 0 (i.e., across the rows), returning a row vector of shape (ncols,) that contains the maximum of each column. Dividing x by this vector rescales each column so that its maximum becomes 1.

To determine the value of axis, remember that the axis you pass is the dimension that gets collapsed. To get the maximum of each column, the row dimension must be collapsed, so axis=0 is used; see the sketch below.
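As a quick illustration of that rule (a minimal sketch reusing the x array from Method 1): reducing along axis=0 collapses the rows and leaves one value per column, while axis=1 collapses the columns and leaves one value per row.

import numpy as np

x = np.array([[1000, 10, 0.5],
              [765, 5, 0.35],
              [800, 7, 0.09]])

# axis=0 collapses the row dimension: one maximum per column, shape (3,)
print(x.max(axis=0))  # column maxima: 1000, 10, 0.5

# axis=1 collapses the column dimension: one maximum per row, shape (3,)
print(x.max(axis=1))  # row maxima: 1000, 765, 800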

Method 2
import numpy as np
from sklearn.preprocessing import normalize

data = np.array([
    [1000, 10, 0.5],
    [765, 5, 0.35],
    [800, 7, 0.09],
])
data = normalize(data, axis=0, norm='max')
print(data)
# [[1.    1.    1.  ]
#  [0.765 0.5   0.7 ]
#  [0.8   0.7   0.18]]
This uses sklearn.preprocessing.normalize with norm='max' and axis=0, which divides each column by its maximum absolute value, giving the same result as Method 1 for non-negative data.
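As a quick sanity check (a minimal sketch reusing the same array), the two methods can be compared directly; for this non-negative data they should produce identical results.

import numpy as np
from sklearn.preprocessing import normalize

data = np.array([[1000, 10, 0.5],
                 [765, 5, 0.35],
                 [800, 7, 0.09]])

# Method 1: broadcast division by the column maxima
method1 = data / data.max(axis=0)

# Method 2: sklearn's normalize with the 'max' norm along axis=0
method2 = normalize(data, axis=0, norm='max')

print(np.allclose(method1, method2))  # True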

 

Reference
https://stackoverflow.com/questions/29661574/normalize-numpy-array-columns-in-python