This sucks! I only say that because my boyfriend is Japanese and he is getting really into samurai culture. He's developing a respect for his heritage, and this undermines that cultural respect and his quest for his identity. What the hell happened, Japan? I know this is mainstream (and I was being tongue-in-cheek before), but really, what has become of Japanese culture? I think there was a very good reason they fought to keep the West out. Freakin' weirdos!