Why do many people view men painting their nails or wearing makeup as wrong? My parents told me it wasn't good for a man to paint his nails, and someone at my work made fun of a guy for having painted nails, but I don't understand why it's considered wrong only for men. To my knowledge, it doesn't go against any religion or have any logical reason behind it.