lesson 9/11

elvis
2023-11-17 12:42:12 +01:00
parent 9e7f427faa
commit 7afb1372f8
23 changed files with 7604 additions and 0 deletions


@@ -0,0 +1,42 @@
function v = roughNN( w , x )
%
% v = roughNN( w , x )
%
% returns the value of the function v = f( x ) as currently estimated by
% a small NN with 1 input, 1 output, 3 hidden layers of 5 nodes each, and
% tanh activation function.
%
% Input:
%
% - w is the [ 76 x 1 ] real vector containing the weights of the NN,
% i.e., w is made as follows:
%   [ 1 .. 5 ] are the [ 5 x 1 ] weights of the first layer
%   [ 6 .. 10 ] are the [ 5 x 1 ] biases of the first layer
%   [ 11 .. 35 ] are the [ 5 x 5 ] weights of the second layer
%   [ 36 .. 40 ] are the [ 5 x 1 ] biases of the second layer
%   [ 41 .. 65 ] are the [ 5 x 5 ] weights of the third layer
%   [ 66 .. 70 ] are the [ 5 x 1 ] biases of the third layer
%   [ 71 .. 75 ] are the [ 5 x 1 ] weights of the fourth (output) layer
% [ 76 ] is the [ 1 x 1 ] bias of the fourth (output) layer
%
% - x is the real scalar containing the input of f()
%
% Output:
%
% - v (real, scalar): v = f( x ) as estimated by the NN with weights w
%
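% Example of use (a hypothetical sketch, not part of the original code;
% the weight values are generated randomly just for illustration):
%
%    w = randn( 76 , 1 );     % random weights in the layout described above
%    v = roughNN( w , 0.5 );  % f( 0.5 ) as estimated by this random NN
%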
%{
% =======================================
% Author: Antonio Frangioni
% Date: 28-08-22
% Version 1.00
% Copyright Antonio Frangioni
% =======================================
%}
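% first hidden layer: scalar input x times the 5 input weights w( 1 : 5 ),
% plus the biases w( 6 : 10 ), through tanh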
g = tanh( ( ones( 5 , 1 ) * x ) .* w( 1 : 5 ) + w( 6 : 10 ) );
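% second hidden layer: 5 x 5 weight matrix w( 11 : 35 ), biases w( 36 : 40 )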
g = tanh( reshape( w( 11 : 35 ) , [ 5 5 ] ) * g + w( 36 : 40 ) );
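% third hidden layer: 5 x 5 weight matrix w( 41 : 65 ), biases w( 66 : 70 )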
g = tanh( reshape( w( 41 : 65 ) , [ 5 5 ] ) * g + w( 66 : 70 ) );
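% output layer: linear combination with weights w( 71 : 75 ) plus bias w( 76 )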
v = g' * w( 71 : 75 ) + w( 76 );
end