I have been wanting to travel to Japan; I find the culture interesting and have been learning the language.
But ngl, seeing how some tourists have been treating Japan recently is turning me away, because it feels like there's a lot of tension now between tourists and locals. I'm also black, and I know Japan doesn't have a good history with colourism. A lot of the tourists causing trouble have also been black, which makes me worry it would be even worse for me if I went.
Is it still worth visiting, or will I just be profiled and looked at weirdly? Please tell me now, before I waste more time learning the language.