TransWikia.com

Neural networks: how to think of them?

Data Science Asked by Luca Di Mauro on August 18, 2020

I am trying to understand neural networks in an easy, visual way, specifically neural networks used for text classification and analysis.
I know that there are several ways to build a NN, and an easy way to think of one is as follows:

[Image: diagram of a neural network]

or, in a more schematic way, as follows:

[Image: schematic diagram of a neural network]

What I still do not understand is the layer(s). Let’s suppose I have a text and I want to find the similarity between words: I would use algorithms such as cosine similarity or Jaccard, or word2vec if I am interested in synonyms.
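As a minimal sketch of one of these metrics (assuming simple whitespace tokenization), Jaccard similarity compares the sets of words in two sentences:

```python
def jaccard(a: str, b: str) -> float:
    """Jaccard similarity: size of the intersection of the two
    sentences' word sets, divided by the size of their union."""
    set_a = set(a.lower().split())
    set_b = set(b.lower().split())
    return len(set_a & set_b) / len(set_a | set_b)

# {i, like, play, basketball} vs {i, like, basketball, a, lot}:
# 3 words in common out of 6 distinct words overall.
print(jaccard("I like play basketball", "I like basketball a lot"))
```

This is only an illustrative sketch; real pipelines would use a proper tokenizer rather than `str.split`.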
Now each of them takes an input, for example one or more sentences: "I like play basketball" and/or "My mom is an English teacher at the University of Cambridge". If I want to test the similarity of words within the same sentence, I should tokenize the sentence, then ‘do something’ that I do not know yet (I hope you can tell me more about this step), and apply an algorithm which says, for example:

  • if two words have the same length, then consider them similar with value 0.5;
    or
  • if two words share letters, then, taking the length of the words into account, you could take the number of letters in common divided by their length and assign that value (just as an example);

and so on.
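The toy rule above could be sketched like this (a hypothetical scoring function, purely for illustration; the choice of dividing by the longer word's length is my own assumption):

```python
def toy_similarity(w1: str, w2: str) -> float:
    """Toy rule from the question: count the distinct letters the two
    words share, divided by the length of the longer word."""
    common = set(w1.lower()) & set(w2.lower())
    return len(common) / max(len(w1), len(w2))

# "play" and "played" share the letters {p, l, a, y}; the longer
# word has 6 letters, so the score is 4/6.
print(toy_similarity("play", "played"))
```

Of course, real similarity measures (edit distance, word2vec cosine similarity, etc.) are more principled than this, but the shape is the same: two words in, one number out.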
My output should be the value of this similarity comparison, should it not?
My question, therefore, is the following:

what is a neural network, and how can I think of it when I apply it to such problems with ‘hidden layers’ (a term used loosely here as an example), where an algorithm that I cannot see (because others have already built it) is applied?
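For context, my current (possibly wrong) understanding of what a single ‘hidden’ unit computes is just a weighted sum of its inputs pushed through a squashing function (the weight and bias values below are made up for illustration):

```python
import math

def neuron(inputs, weights, bias):
    """One hidden-layer unit: weighted sum of inputs plus a bias,
    passed through a sigmoid nonlinearity."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

# Two numeric input features, one hidden neuron with made-up weights:
print(neuron([0.5, 0.2], [0.8, -0.4], 0.1))
```

Is this roughly what happens inside each layer, repeated many times with learned weights?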

Could you please provide an easy textual/numerical example to make it easier for me to understand, in case the example mentioned above is wrong?

Thanks a mill.
