
Equation of a Multi-Layer Perceptron Network

Data Science Asked by Vasco Ferreira on May 4, 2021

I’m writing an article about business management of wine companies where I use a Multi-Layer Perceptron Network.

My teacher then asked me to write an equation that lets me calculate the output of the network. My answer was that, due to the nature of multi-layer perceptron networks, there is no single equation per se. What I have is a table of weights and biases. I can then use this formula:

$$f(x) = left(sum_{i=1}^{m} w_i , x_iright) + b$$

Where:

  • $m$ is the number of neurons in the previous layer,
  • $w_i$ is the weight on input $i$ (taken from the table of learned weights),
  • $x_i$ is the $i$-th input value,
  • $b$ is the neuron's bias (also from the table).

I then apply this formula to each neuron in the hidden layers and in the output layer (a small sketch of that procedure follows below).
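For reference, here is a minimal sketch of that layer-by-layer computation in Python/NumPy. The weight matrices, bias vectors, and example values are purely illustrative, and it assumes ReLU activations on the hidden layers (see Edit 1 below) with a linear output:

```python
import numpy as np

def forward(x, weights, biases):
    """Forward pass of an MLP: for each layer, compute the weighted sum
    plus bias for every neuron, then apply ReLU (hidden layers only)."""
    a = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = W @ a + b  # sum_i w_i * x_i + b for every neuron in the layer at once
        a = np.maximum(z, 0) if i < len(weights) - 1 else z  # ReLU on hidden layers
    return a

# Illustrative example: 2 inputs -> 3 hidden neurons -> 1 output
weights = [np.array([[ 0.2, -0.5],
                     [ 0.7,  0.1],
                     [-0.3,  0.4]]),
           np.array([[ 0.6, -0.2,  0.9]])]
biases = [np.array([0.1, -0.1, 0.05]), np.array([0.2])]
print(forward(np.array([1.0, 2.0]), weights, biases))
```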

She showed me an example from another piece of work she did (image at the bottom), telling me it should be something like that. Looking at the chart, I suppose it is a logistic regression.

So, my questions are the following:

  1. Is there any equation to predict the output of a multi-layer perceptron network, other than iterating over each neuron with $w cdot x + b$?
  2. Should I just tell my teacher that logistic regression is a different case and that the same does not apply to this type of neural network?
  3. Is the first formula correct for showing that the value of a neuron is the weighted sum of the previous layer's outputs plus the bias?

[Image: example formula of a network]


Edit 1: I didn't write them in the formula, but I do also have activation functions (ReLU).

One Answer

You are forgetting one element of the MLP: the activation function. If your activation function is linear, then you can simply flatten all the neurons into one single linear equation. The advantage of an MLP, however, is its non-linearities, so I suspect your network does have some activation (sigmoid? tanh? ReLU? etc.).
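To see why purely linear layers collapse, consider two consecutive linear layers with generic weights $W_1, W_2$ and biases $b_1, b_2$ (these symbols are illustrative, not the question's actual parameters):

$$W_2left(W_1 x + b_1right) + b_2 = (W_2 W_1),x + left(W_2 b_1 + b_2right) = W' x + b'$$

which is again a single linear function of $x$.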

As for your graph: you could simply output predictions from your MLP and plot the exact scatter plot you have above. The only difference is that you wouldn't have a simple way of expressing the network in algebraic notation (as was done on the existing x-axis).
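A sketch of that idea, using synthetic stand-in data and scikit-learn's MLPRegressor rather than the asker's actual model and wine dataset:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in data; in the article this would be the wine-company dataset
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] - X[:, 2] ** 2 + rng.normal(0, 0.1, size=200)

# Small ReLU network, then a scatter plot of its predictions against observed values
model = MLPRegressor(hidden_layer_sizes=(16, 16), activation="relu",
                     max_iter=2000, random_state=0).fit(X, y)
y_pred = model.predict(X)

plt.scatter(y_pred, y, s=10)
plt.xlabel("MLP prediction")
plt.ylabel("Observed value")
plt.show()
```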

To describe networks effectively in text you should look into matrix notation describing the weights and inputs of each layer. Maybe take a look at something like this to get started: https://www.jeremyjordan.me/intro-to-neural-networks/
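For instance, a network with one ReLU hidden layer and a linear output can be written as a single composed equation (a sketch with generic layer-wise weight matrices $W^{(1)}, W^{(2)}$ and bias vectors $b^{(1)}, b^{(2)}$; deeper networks just nest the same pattern):

$$hat{y} = f(x) = W^{(2)},operatorname{ReLU}!left(W^{(1)} x + b^{(1)}right) + b^{(2)}$$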

Correct answer by Oliver Foster on May 4, 2021
