Tests fail with:

--8<---------------cut here---------------start------------->8---
=================================== FAILURES ===================================
__________________________________ test_selu ___________________________________

    def test_selu():
        x = K.placeholder(ndim=2)
        f = K.function([x], [activations.selu(x)])
        alpha = 1.6732632423543772848170429916717
        scale = 1.0507009873554804934193349852946

        positive_values = get_standard_values()
        result = f([positive_values])[0]
        assert_allclose(result, positive_values * scale, rtol=1e-05)

        negative_values = np.array([[-1, -2]], dtype=K.floatx())
        result = f([negative_values])[0]
        true_result = (np.exp(negative_values) - 1) * scale * alpha

>       assert_allclose(result, true_result)
E       AssertionError:
E       Not equal to tolerance rtol=1e-07, atol=0
E
E       Mismatch: 50%
E       Max absolute difference: 1.1920929e-07
E       Max relative difference: 1.0726715e-07
E        x: array([[-1.111331, -1.520167]], dtype=float32)
E        y: array([[-1.111331, -1.520167]], dtype=float32)

tests/keras/activations_test.py:226: AssertionError
--8<---------------cut here---------------end--------------->8---

See attached log.

--
Pierre Neidhardt
https://ambrevar.xyz/
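
The reported mismatch looks like a single float32 ulp: the max absolute
difference, 1.1920929e-07, is exactly 2^-23, the float32 spacing at
magnitudes between 1 and 2, and the default rtol=1e-07 of assert_allclose
leaves no room for it, whereas the rtol=1e-05 already used for the positive
branch of the same test would. Below is a minimal NumPy-only sketch of that
tolerance gap, using the values from the traceback with a hand-made one-ulp
shift; it only illustrates the precision issue and is not a proposed patch
to the test.

--8<---------------cut here---------------start------------->8---
import numpy as np

# Backend result, copied from the failure output above.
x = np.array([[-1.111331, -1.520167]], dtype=np.float32)

# Reference value: identical except for a one-ulp shift in one element,
# mimicking the reported 50% mismatch.
y = x.copy()
y[0, 0] = np.nextafter(y[0, 0], np.float32(0))

print(np.abs(x - y).max())  # 1.1920929e-07, as in the log

# A one-ulp difference at this magnitude exceeds the default tolerance
# (rtol=1e-07, atol=0) but is well within the rtol=1e-05 used for the
# positive branch of the test.
np.testing.assert_allclose(x, y, rtol=1e-05)  # passes
# np.testing.assert_allclose(x, y)            # raises AssertionError
--8<---------------cut here---------------end--------------->8---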