
Combining categorical and continuous features for neural networks

Cross Validated. Asked by 3michelin on August 5, 2020

Is it OK to combine categorical and continuous features into the same vector for training deep neural networks? Say I have a categorical feature and a continuous feature that I want to feed into a deep neural net at the same time. Is this the way to do it?

categorical feature (one-hot encoded) = [0,0,0,1,0]
continuous feature (number) = 8
final feature vector passed into neural network = categorical feature vector CONCATENATE continuous feature = [0,0,0,1,0,8]

Basically, the question is: is it OK to have a one-hot encoding and a continuous feature together in one feature vector?
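
For concreteness, here is a minimal sketch of that concatenation in NumPy (just an illustration; the actual framework doesn't matter):

import numpy as np

# One-hot encoded categorical feature (5 categories, 4th category active)
categorical = np.array([0, 0, 0, 1, 0], dtype=np.float32)

# Continuous feature
continuous = np.array([8.0], dtype=np.float32)

# Final feature vector passed into the neural network: simple concatenation
x = np.concatenate([categorical, continuous])
print(x)  # [0. 0. 0. 1. 0. 8.]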

2 Answers

Yes, that is one typical way of doing it. However, you need to standardize your features so that gradient descent doesn't suffer and so that regularization treats your weights equally. One way is to standardize the numerical features and then concatenate the one-hot vectors; the other is to concatenate first and standardize everything together. As far as I can see, there is no consensus between the two.
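
To illustrate the two options, here is a minimal NumPy sketch (the toy data is made up purely for illustration):

import numpy as np

# Toy data: 4 samples, one continuous column and a 3-category one-hot block
continuous = np.array([[8.0], [2.0], [5.0], [11.0]], dtype=np.float32)
one_hot = np.array([[1, 0, 0],
                    [0, 1, 0],
                    [0, 0, 1],
                    [0, 1, 0]], dtype=np.float32)

# Option 1: standardize only the continuous column, then concatenate the raw one-hot block
cont_std = (continuous - continuous.mean(axis=0)) / continuous.std(axis=0)
X_option1 = np.concatenate([cont_std, one_hot], axis=1)

# Option 2: concatenate first, then standardize every column together
X_raw = np.concatenate([continuous, one_hot], axis=1)
X_option2 = (X_raw - X_raw.mean(axis=0)) / X_raw.std(axis=0)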

Correct answer by gunes on August 5, 2020

Yes, this is absolutely standard.

Answered by Sycorax on August 5, 2020
