Children should know that the human body is a natural thing, so they won't grow up oversensitive and act all shocked and judgemental when they see someone (semi-)naked in the future. So no.
I don't really think so. I mean, any time a parent raises a child in an environment that isn't seen as "normal", the child may be treated like, or feel like, an outsider later in life, but that harm comes from society as a whole rather than from nudism specifically. A child will not think it is wrong, because they will not know to think it is wrong, and it is not intrinsically wrong.
Nudity is only sexualized when people make it so; there's nothing harmful about it in itself. If anything, it would promote body positivity, self-respect, and a sex-positive outlook, because, in case you haven't noticed, making bodies taboo hasn't done teens many favours when it comes to making the right choices.
People don't need to see other people naked. Otherwise, why would we have created clothes in the first place? We cover our bodies not because we are ashamed but out of respect for other people. If everyone walked around naked, rape crimes would spike in our culture. We're attracted to genitals, and this would be a horrible influence on our children. Like, gay people are gay because they don't like the va*ina, and lesbians are lesbians because they don't like d**k. It's as easy as that.