Isn't it funny, and by funny I mean absolutely disgusting, how gay relationships were accepted in various cultures UNTIL Europeans came along and those "practices" were brutally stamped out?
Many countries that were colonized struggle to get by to this day and aren't as "accepting" of LGBT rights BECAUSE of Western society and Christianity. Now Westerners are seen as the "liberal" ones, when they were the ones who forced these violent ideals into other people's minds in the first place.