• dsign 11 hours ago |
    My (cynical) take: using LLMs to police speech, down to the minutest nuance.

    There are bad things that we humans say and do. But let's stop using AIs to do our police work. Nothing good lies down that path.

    • persnickety 10 hours ago |
      What difference does it make which tool is used to police speech? I fail to see any difference in goodness between this and any conceivable alternative.
      • blackbear_ 10 hours ago |
        Efficiency in terms of cost and time makes the difference between something becoming widely used and remaining a curiosity.
        • brobdingnagians 8 hours ago |
          But the presumption that we should be policing speech is not necessarily correct. Making it cheaper to do a thing which ought not be done is no real benefit to humanity.
    • weinzierl 10 hours ago |
      Models are biased, but it is a bias we control through the selection of training data. The act of inference reflects this bias but is otherwise free from arbitrary rule.

      I think this is a big step forward compared to policing by moody humans[1]. LLMs are a knife, let's use them for good instead of demonizing them.

      [1] https://en.m.wikipedia.org/wiki/Hungry_judge_effect

      • ch4s3 9 hours ago |
        > but it is a bias we control through the selection of training data

        I don't find this very satisfying. It implies that the "bias of the model" is something we can quantify, understand, and control for during training. I'm sure you can come up with pat answers about coarse-grained biases that carry a lot of social valence, but how do you know whether the model is weighted towards or against subtle word choices in millions of different scenarios?

        People seem to have enough trouble understanding the nuances of texts that come from another cultural perspective or from merely 20 years ago. It's a common enough experience to see young people cringing at the popular media of 30 years ago, because cultural expectations change, words shift in meaning, and the "biases" of society move. LLMs are trained on a mountain of old books, media, internet forums, Twitter and Facebook posts, etc. How could you ever get a handle on just the "biases" of the training data from the contemporary internet sources, let alone pick apart the baked-in cultural assumptions of a bunch of books spanning decades?

        • weinzierl 8 hours ago |
          When I said we control the bias, I meant we can control for it when we try to make a model as unbiased as possible. I think we can do that well, even if the result will never be completely unbiased.

          When it comes to giving a model a deliberate bias I agree with you.

      • aiforecastthway 5 hours ago |
        > Models are biased, but it is a bias we control through the selection of training data.

        But what if controlling bias is impossible because the very format of the data demands its emergence?

        I always thought both structuralism and poststructuralism were the worst types of in(s/n)ane navel gazing. I think I even wrote on a course feedback section in a final many, many years ago that these topics were massive wastes of my time. (We didn't have "evals" as such, but some faculty would elicit feedback on a separate sheet of paper you could leave with the final exam.)

        Observation: the more time I spend just letting the algorithms converge under a huge variety of treatments, the more convinced I become that the most basic structures of our language cast a spell on human understanding of the world, and that the interactions between the primitives of our language and human meaning-making are FAR more deterministic than we would like, especially in terms of how they direct certain emergent phenomena in human groups.

        Implication: maybe it's reasonable to hypothesize that bias isn't "in the data". Maybe bias emerges, nearly deterministically, from the underlying grammatical structure of the data, at least in certain semantic contexts. Maybe certain utterances, and the persuasive power of those utterances, correspond to a sort of default hard-wired path for a majority subset of the population. A co-occurrence of certain neural pathways and certain grammar/semantics that eventually must arise in a given language and socially situated use of that language.

        FWIW: I have almost no social science training, the training I did get I received with extreme skepticism, and this sounds like bunk to me even as I type it. So, for me, this hypothesis would have sounded stupid -- or at least inane/pointless -- before 2019. But I'm becoming increasingly convinced that the languages we've converged on encode certain types of group preference/bias at an extremely fundamental level.

        ...either that, or I'm far, FAR worse at filtering/generating data than I give myself credit for. But in at least a few cases I have literally reviewed every token within a certain delta of influencing the model's behavior on an input sequence and seen the same results.

    • libertine 7 hours ago |
      You're missing when LLMs are used to misinform, manipulate, and create chaos.

      The way you're framing it is "let's use limited humans to moderate an infinite number of LLM bots" - how do you go about it?

  • Aeolun 11 hours ago |
    It’s kind of bizarre that I went into this article searching for evidence that Russia is worse (whatever that means), then was disappointed when I didn’t find it.
    • hackandthink 10 hours ago |
      The others other us and we other the others.

      What a stupid world.

    • elefanten 10 hours ago |
      Concluding "who's worse" wasn't the point of this paper; the point was to implement some automated analysis based on old social science models.

      But I'm surprised you conclude what you do, because the high-level findings seem to indicate that Russian war bloggers are more prone to the more extreme forms of "otherization" per the 4 degrees laid out in the model -- e.g. figure 7 shows Russian war bloggers significantly (but not massively) overindexing on the 3rd (Villainization) and 4th (Dehumanization) degrees, the more severe ones, compared to Ukrainian war bloggers, who only overindex on the 2nd degree (Survival or Security).

      Additionally, Table 4 shows Telegram channels that are more prone to otherization are more central (more influential) in the Russian blogger network than they are in the Ukrainian blogger network, as summarized by the line: "The results, shown in Table 4, reveal statistically significant correlations for both degree centrality and eigenvector centrality and the use of othering language by both groups of war bloggers, with a stronger correlation among Russian war bloggers."
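
      To make the Table 4 point concrete, here is a rough sketch (not the paper's actual code) of how a centrality/othering correlation of this kind can be computed, assuming networkx and scipy are available. The channel names, edges, and othering scores are invented placeholders, the graph is simplified to an undirected forwarding network, and Spearman rank correlation is used purely for illustration; the paper may well define the network and the statistics differently.

        import networkx as nx
        from scipy.stats import spearmanr

        # Hypothetical forwarding/mention network between Telegram channels:
        # an edge means two channels forwarded or mentioned each other.
        G = nx.Graph([
            ("channel_a", "channel_b"),
            ("channel_a", "channel_c"),
            ("channel_b", "channel_c"),
            ("channel_d", "channel_c"),
            ("channel_d", "channel_a"),
        ])

        # Centrality measures of the kind reported in Table 4.
        degree_c = nx.degree_centrality(G)
        eigen_c = nx.eigenvector_centrality(G, max_iter=1000)

        # Hypothetical per-channel othering scores, e.g. the fraction of a
        # channel's messages labeled with any of the four othering degrees.
        othering = {"channel_a": 0.42, "channel_b": 0.18,
                    "channel_c": 0.55, "channel_d": 0.31}

        channels = sorted(G.nodes())
        for name, cent in [("degree", degree_c), ("eigenvector", eigen_c)]:
            rho, p = spearmanr([cent[c] for c in channels],
                               [othering[c] for c in channels])
            print(f"{name} centrality vs othering: rho={rho:.2f}, p={p:.3f}")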

      • ossobuco 9 hours ago |
        Personal experience, but Ukrainian supporters are much worse in terms of "Othering", be it justified or not. There is a widespread tendency to call Russians a variety of names, such as "Orcs"[0], "Cockroaches", and "Zombies".

        On the other hand, I often see Russians describe Ukrainians as "Nazis". The main difference is that "Nazis" is used as a political label and usually refers to Ukrainian politicians and military members. Russian propaganda carefully describes the Ukrainian population as relatives held captive by a far-right dictatorship. Ukrainian propaganda instead frequently dehumanizes the Russian population.

        - [0]: https://en.wikipedia.org/wiki/Orc_(slang)

        • FooBarBizBazz 9 hours ago |
          I've noticed this too, and it seemed like dumb propaganda on the part of the Ukrainians. When you hear someone describing their enemies in cartoonishly dehumanizing terms, it doesn't really increase your respect for them (the speaker).

          On the other hand, if your town was flattened, your family was killed, and you've been living in your basement for the last year, maybe that's just how you're going to be. Maybe no other response is reasonable.

          (I still think "orc" is weak though. Something real would be stronger. Even just "beasts". Though, animals are frequently nice. "Monster" is a little metaphorical, but, monsters are also real (in that metaphorical sense). That might be stronger.)

          • metabagel 8 hours ago |
            Orcs attack in hordes and fight battles. It’s an apt description for the Russian “meat wave” attacks.
          • Muromec 7 hours ago |
            You don't want to know what is actually used when "orc" is not enough.
          • jncfhnb 7 hours ago |
            The term originated due to their disorganized, unprofessional, looting-heavy behavior. It has gained prominence in the 2022 invasion mostly because of how pathetic they are. One of the famous quotes of the war is “We’re lucky they’re so fucking stupid”, which is entirely correct. The understanding of the war from the Ukrainian side is that the Russian army is a huge, dumb horde that vastly underperforms for its size. That doesn’t mean it’s not dangerous. But we’re years into this three-day special military operation and we still see laughably wasteful efforts by the Russian military every week.

            Hordes of people running and driving through open fields against artillery strikes and FPV drones and suffering 90% casualties is the canonical mental image these days.

            • orbital-decay 2 hours ago |
              >It has gained prominence in the 2022 invasion

              See, that's the problem: it is incorrect. The term was in use long before 2022. It was originally coined back in 2015 and was meant to apply to both sides, due to how miserable and incompetent the entire thing looked (and was).

              That is the problem with the paper in question as well: the authors don't seem to be familiar with the topic they're trying to research, treating it as a single event. The timeframe in the dataset is 2015 to mid-2023, which makes very little sense. The use of Telegram for war reporting, and the language used, have been completely different at various points across this timeframe.

              To add insult to injury, they label various channels as pro-R or pro-U based on recent messages, but certain channels literally switched sides. They (and many others as well) wiped their message history multiple times and came back with slightly or completely different narratives, and their actual history can only be found in one of the Telegram-related cache services, if at all, as some of these services are long dead or the info didn't survive. Some people who were trying to profit from the war ran multiple pro-R and pro-U media outlets, including Telegram channels, although 2022 quickly made them choose sides.

              So much happened in the 8 years they tried to shove into an LLM for a primitive sentiment analysis. Gathering the full picture of this timeframe should have been their main task, and it's not trivial, just like anything spanning 8 years on the internet and in real life, especially if you don't speak any of the languages. These results are not going to be accurate.

        • cynicalsecurity 9 hours ago |
          Maybe because Ukrainians are right about Russians? Killing innocent people and starting an aggressive war for land and power somehow justifies them being called orcs, don't you think so?
          • ossobuco 8 hours ago |
            No. Dehumanization is a refuge for those who don't understand human nature, history and politics.

            It precludes any legitimate pursuit of an understanding of the situation.

            It's great if all you want is to motivate violence against a certain group; I won't deny that.

            • metabagel 8 hours ago |
              To be clear, Ukraine was invaded, children were kidnapped, summary executions were carried out, and women were raped by Russian soldiers. Plus, Russia launched missile strikes against hospitals, schools, apartment buildings, playgrounds, and supermarkets, along with indiscriminate glide bomb attacks against cities.

              I occasionally call them orcs too. It’s an apt description.

              • ossobuco 8 hours ago |
                USAers have done worse throughout the past 50 years (and before that). Do you call them orcs too? What about Israelis?
                • metabagel 8 hours ago |
                  No, the U.S. has not done worse than Russia has in this war. Not even in Vietnam, I think. But, I guess we were orcs when we invaded and occupied Iraq, since we had no business being there.

                  With regard to Israelis, any sort of othering will be perceived as antisemitism. But, what Israel is doing in Gaza is on par with what Russia is doing in Ukraine.

                  • ossobuco 7 hours ago |
                    > No, the U.S. has not done worse than Russia has in this war. Not even in Vietnam, I think.

                    Rethink[0]. It just takes some basic math to understand that the USA has done much worse, even if we just consider civilian deaths estimates. Hundreds of thousands of civilians were killed in Vietnam, Cambodia, and Laos. And for what, to stop communism? Is that a valid reason to devastate three countries?

                    > But, I guess we were orcs when we invaded and occupied Iraq

                    Orcs once, orcs forever. Isn't that how it works? If you can call an entire population "orcs", then there must be some intrinsic evil rooted in their ethnicity, culture, or whatever it is.

                    - [0] https://en.wikipedia.org/wiki/Vietnam_War_casualties#Deaths_...

                    • oneshtein 7 hours ago |
                      Communists killed more than 100 million people worldwide, not thousands.
                      • ossobuco 6 hours ago |
                        25k people starve to death each day[0]. Those deaths all happen in capitalist economies. That's 9 million each year, for who knows how many years since we developed the necessary technology to have a surplus of food. We're talking about hundreds of millions of preventable deaths, if not more. That's just the result of a very inefficient economic system—without even counting wars, lack of healthcare, and so on.

                        I'll happily choose communism over such an incredibly unjust system, even if it caused "100 million" deaths worldwide, as you say. That number is complete BS, by the way.

                        - [0]: https://www.un.org/en/chronicle/article/losing-25000-hunger-...

                        • mopsi 4 hours ago |
                          Which of the countries facing hunger have anything resembling a modern market economy?
                          • ossobuco 3 hours ago |
                            The real question is why don't they have a modern market economy, and if they all did, what would happen to the existing modern market economies?
                  • af78 7 hours ago |
                    Exactly.

                    https://en.wikipedia.org/wiki/Siege_of_Mariupol

                    In this example, one among countless others, russia killed tens of thousands of civilians, in a couple of months. russia caused more suffering in a few months of war than the USA in a decade in Vietnam. Comparing the two is utterly dishonest.

                • Muromec 7 hours ago |
                  USAers haven't tried to kill me and my family in the last 50 years. I would truly understand you if they or Israel had and you called them names.
                • af78 7 hours ago |
                  Literal whataboutism.
                  • ossobuco 6 hours ago |
                    Nah, just the context required to point out the hypocrisy of any USAer trying to exercise morality over other countries.
                    • af78 an hour ago |
                      The comment you were replying to seemed to relativize, contextualize the use of a dehumanizing term (‘orc’) that is frequently used by Ukrainians and supporters of Ukraine when talking about russian invaders. Whether the Vietnamese or the Palestinians similarly use dehumanizing terms about Americans or Israelis is irrelevant, just like the nationality, US or otherwise, of the person making this comment.
                  • protomolecule 3 hours ago |
                    That's a word people always use when their double standards and hypocrisy get exposed.
            • Muromec 7 hours ago |
              It's important to draw a line between yourself and the people coming at your trench in assault waves, to preserve your own sanity. In the Ukrainian case it's more of a natural response than top-down motivation.
        • oneshtein 8 hours ago |
          The difference is that Ukrainians started using bad names for Russian soldiers after the invasion, while Russians have used bad words for all the nations they know since long before the war.

          PS.

          War is much, much simpler than peace. If you have good words for the invaders, then you're a traitor.

        • libertine 7 hours ago |
          Well, your personal experience might be a bit off, or biased (?), because a lot of Russians address Ukrainians as "Khokhols"[0].

          Not just since the war, but for as long as Russians have oppressed Ukrainians. It's quite a normalized and promoted slur, online and offline.

          It's a culturally derogatory term, like the common slurs that were used to designate certain ethnicities or races, such as Chechens. These are cultural slurs, unlike "orc", which is an online slur, a Western term from a Western reference.

          I think you should look more into how Russia has dehumanized some of the ethnic minorities within the Federation and its neighbors over the years, and how it continues to do so today.

          [0]https://en.wikipedia.org/wiki/Anti-Ukrainian_sentiment

          • ossobuco 6 hours ago |
            "Khokhols" is not dehumanizing. It's the name of an hairstyle now used as an ethnic slur. It's the equivalent of calling "Ivan" any Russian.
            • libertine 6 hours ago |
              Ah, I see where you're coming from... say no more.

              So you're saying that a cultural symbol used in a derogatory manner to address the "Little Russians"... inferiors to Russia... is humanizing and a show of equal brotherly love?

              You chose to empty the word of its meaning, reducing it to a simple hairstyle, much like the Nazis just made use of cultural symbols to address the Jewish or Polish people.

              It doesn't look like you're being honest.

              It's funny because that's one of the Russian twists in their propaganda, "let's focus on the subjective meaning of words... and not the actions!".

              Here's my take on it: if someone goes into someone else's land to erase their culture and kill as many people as possible, terrorizing them and trying to make their lives unbearable, while addressing them by an ethnic slur, I'd say that's enough of a sign of dehumanization.

              • ossobuco 6 hours ago |
                I guess you stopped reading at "hairstyle" and completely missed the ethnic slur part.
                • libertine 4 hours ago |
                  No, I've read it all. But it looks like you've ignored the part where I said that an ethnic slur tied to actions is what renders it dehumanizing.

                  > You chose to empty the word of its meaning, reducing it to a simple hairstyle, much like the Nazis just made use of cultural symbols to address the Jewish or Polish people, for example.

                  Using your assessment, the Nazi Germany slur "Schlitzauge" was a "simple" ethnic slur to address Slavic people, or "Polacke" was "just" a slur to address people from Poland. If you add the context of propaganda and war, and the actions toward those people, I think it's pretty clear it was dehumanizing.

                  You don't need to be literal to dehumanize a group of people; it's the actions taken under a given label that put meaning into a slur.

        • jncfhnb 7 hours ago |
          Russian usage of "Nazis" is specific to the Russian history of defending against the Nazis in WWII. It’s not like in the West. In Russia it’s a patriotic thing to imply that Russia is acting in self-defense, and it justifies their actions to the public. Russian propaganda aimed at Russians is about convincing them it’s not a genocidal land grab.
        • EasyMark 5 hours ago |
          I tend to give those who are being invaded and genocided a little leeway in being angry and expressing anger/name-calling at their oppressors & murderers. This could end tomorrow if Russia pulls out or starts pulling back. I think it all boils down to that in the end, and it really is quite simple. The article however is still quite interesting as a study of propaganda.
    • lupusreal 9 hours ago |
      Othering of Russians is going very strong on reddit. I keep tabs on this because my brother is thoroughly sucked into it. The degree of othering and dehumanization is so extreme and comprehensive that the people doing it seem to become acclimatized to it and don't realize the extent and severity of what they're doing, like fish who don't notice the water they're in.

      Before somebody like this jumps down my throat: yes, I'm sure Russians are doing it too, and yes, I know Russia started the war and Ukraine has a right to defend itself. Spare me the tedious accusations of being a shill for the ""orcs"" and ""ziggers"" (really?); the with-us-or-against-us mentality is part of the extremist radicalization process. Both sides are using it and both can be criticized for it.

      • aguaviva 8 hours ago |
        The way I've seen the "orc" term in use ("ziggers" is a new one to me), it's always been applied to individuals carrying out acts of aggression on Ukrainian territory. Or perhaps their active helpers on the homefront (those otherwise engaged in and supporting that aggression directly). Not against Russians as a people.

        Then again I wouldn't know from discussions on Reddit, as I can barely stand to look for more than a few seconds at any of the political content there.

        (None of this is in reference to the paper, which of course talks about much broader issues, and doesn't refer to the "orc" term at all).

      • rightbyte 8 hours ago |
        I think the tone on the topic on Reddit is due to bots and shills. Not just this topic but almost all topics.

        There is also a subgroup of the population that seems very susceptible to falling for dehumanization. It's like they go mad as soon as there is a somewhat socially accepted target for it.

      • metabagel 8 hours ago |
        You underestimate the sense of betrayal Ukrainians feel toward Russians, and their hatred of Russia for what it has put them through. It seems like you can’t quite put yourself in the shoes of the Ukrainians, who are fighting an existential war for survival.
        • aguaviva 5 hours ago |
          Indeed - complaints about the "orc" term and "otherization" in general are rather academic, in the context of what's happening there.

          As if the people writing that paper, were they to wake up one morning and find their house being shelled for no reason, wouldn't start to "otherize" the responsible party.

      • Muromec 7 hours ago |
        "zigger" is more of a 4chan thing and isn't really used by Ukrainians.
  • mightyham 10 hours ago |
    Very cool research topic, doing language analytics on telegram channels is a pretty interesting case study for modern war propaganda.

    That being said, I think the theoretical framework presented here is flawed. Many political scholars would say that the definition of politics is the process of defining ingroups and outgroups. Fundamental to understanding political group dynamics is analyzing power, of which a major component is the capacity for violence. This research provides a "Model of Othering" which, when properly understood, is really just a synonym for political rhetoric. There are a lot of subtle moral presuppositions made throughout this research that don't seem appropriate for a serious scholarly analysis.

    This might be getting a bit too far off topic, but it also strikes me as fairly tone-deaf to publish research comparing Russian war rhetoric to Nazi Germany's when politicians from both of America's own political parties are currently calling Iran a "terrorist country".

  • aiforecastthway 10 hours ago |
    This paper is important to read.

    The tone of the introduction is a bit preachy and the findings as stated are underwhelming and beyond obvious [1].

    But when reading this type of paper -- and academic literature in general -- don't focus on the application and research questions, which were chosen by an academic and/or young graduate student with minimal resources. Look at the actual innovation. In this case, the methods. Then ask what else you can do with that increment of innovation.

    It is now possible to methodologically analyze rhetorical patterns in open source communications on a shoestring budget. Because producers are also consumers, you can begin to understand at a very granular level which sequences of words elicit the desired effect in subsets of a population, and how. Work like this used to be laborious and expensive, and required a huge amount of socio-cultural training/expertise/judgement. By comparison, all of the technical work described in the paper has a relatively low barrier to entry.
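
    As a rough illustration of how low that barrier is, here is a sketch of the core loop: prompt an LLM to grade each post against the paper's four degrees of othering and aggregate per channel. Everything here is an assumption rather than the authors' pipeline: call_llm() is a hypothetical stand-in for whatever hosted or local model you use, the degree-1 label is a placeholder (only degrees 2-4 are named upthread), and the sample posts are invented.

      from collections import Counter

      # Degree labels for the sketch; only degrees 2-4 are named upthread,
      # the degree-1 label below is a guess.
      DEGREES = {0: "none", 1: "distrust/dislike",
                 2: "survival or security threat",
                 3: "villainization", 4: "dehumanization"}

      PROMPT = """You are annotating war-blogger posts for othering language.
      Degrees: 1 = distrust/dislike of the outgroup, 2 = outgroup framed as a
      survival or security threat, 3 = villainization, 4 = dehumanization,
      0 = none of these. Reply with a single digit.

      Post: {post}
      Degree:"""

      def call_llm(prompt: str) -> str:
          # Hypothetical: swap in a real client call (OpenAI, llama.cpp, vLLM...).
          # Here the "model" always answers "0" so the sketch runs standalone.
          return "0"

      def classify(post: str) -> int:
          reply = call_llm(PROMPT.format(post=post)).strip()
          return int(reply) if reply in {"0", "1", "2", "3", "4"} else 0

      def channel_profile(posts: list[str]) -> Counter:
          """Distribution of othering degrees across one channel's posts."""
          return Counter(classify(p) for p in posts)

      posts = ["placeholder post one", "placeholder post two"]  # invented
      for degree, count in sorted(channel_profile(posts).items()):
          print(f"degree {degree} ({DEGREES[degree]}): {count}")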

    Also: the obvious next step is to treat this as an optimization problem instead of a categorization problem.

    --

    [1] "othering intensifies during crises" is framed as a finding in this paper, but that's like using nukes to fish for trout. It's a fact we already know and have known for thousands of years. Therefore, the fact that the method reached this conclusion is best understood as validating a proposed method for automating intelligence collection and analysis in the context of open source war propaganda.

    • elefanten 10 hours ago |
      Yeah, I agree, this is a cool demonstration of growing capabilities in quantitative social sciences, probably soon leading to analyses that have been imagined for a long time but not (easily) possible until recently.

      It'll take time to understand and integrate, but I imagine it should make theory that used to depend on small numbers of examples (glued together with rhetoric and charisma) richer and more sophisticated. Exciting.

      • aiforecastthway 10 hours ago |
        > Exciting.

        Fast forward 20 years and we have propaganda machines that have been optimized on a nearly individual level, available to the highest bidder (or, more likely, available only to the people who happen to control the attention platforms at that particular point in history).

        But, I suppose it's probably accurate to say that, within the walls of the academy at least, the potential for research on nuclear technology was exciting in the 1930s-1940s.

        • chongli 9 hours ago |
          > Fast forward 20 years and we have propaganda machines that have been optimized on a nearly individual level

          And now I have to ask: what are the beneficial applications of this research? My gut reaction is to unplug, to go offline, to seek out in-person communication and to shun online media.

          But I have watched all this play out gradually over time. What about the younger generations growing up now? Will they be more vulnerable to the ever-increasing sophistication of these techniques? Or will they somehow develop a resistance to them?

          Again, my gut instinct here is towards pessimism. I work with young people as a volunteer tutor. Many of them seem to keep shutting down due to the overwhelming burden of information thrown at them. At the same time, their attention is sucked away by media which draws them into a frenetic loop of scrolling behaviour, like a mouse on a treadwheel. It's extremely disturbing to watch.

          • jancsika 8 hours ago |
            > My gut reaction is to unplug, to go offline, to seek out in-person communication and to shun online media.

            Keep in mind this is rank speculation 20 years into the future.

            If this were 2004 OP would have you convinced 90% of your day would be spent culling spam from your inbox.

            In short, they have no idea what they're talking about.

            • aiforecastthway 7 hours ago |
              > Keep in mind this is rank speculation 20 years into the future.

              It's a bit amusing that the comment right next to yours (albeit a cousin, not a sibling) is insisting that what I'm describing is already happening.

              The (naive) weaponization of this tech is already underway. What's more, the tools and methods to turn the categorization work in this paper into a goal-oriented optimization procedure already exist -- see for example the Diplomacy work from Meta; again, focus on methods, not the application.

              The tech is here today, the barrier to use is low, and the incentive structure to weaponize exists.

              What will take 10-20 years is the trickling of that impact into social processes and then the retrospective reflection of what happened.

              Note, for example, that Facebook was founded in 2004, but it took 10-15 years for the impact on social processes to reach a point of significant material impact on global society and politics.

              > If this were 2004 OP would have you convinced 90% of your day would be spent culling spam from your inbox.

              It's not about the amount of time spent; it's about what happens in the time that is spent.

              Spam, and online attention markets more generally, is an incredibly apt analogy.

              Even for the strictest definition of "spam", the market for prevention is pretty huge. Anti-email-spam tech alone accounts for ~$5B/yr in spend with a 21% CAGR. Going all-in on anti-spam tech in 2004 and capturing even a sliver of the market would make you very wealthy today.

              If you include spam-like content on social media platforms, that number certainly more than doubles. On headcount alone, Meta told Congress that they have 40K employees directly working on trust and safety, and TikTok says they have about the same number. That's just headcount, and at just two companies.

              Beyond that, the following question will prove only more prescient as time goes on: where does "spam" stop and "algorithmically curated content designed to part consumers from their dollar" start? Even today, the average person in places like the US will spend ~2.5 hours consuming algorithmically curated feeds on social media (in service to online advertising revenue). And many aspects of non-social media now have some aspect of quantitative optimization as well.

              Contrary to your take here, with the benefit of hindsight, I think the "Eternal September" doomers of the early aughts significantly under-estimated the impact of information technology on how people spend their time and how society spends its resources. (And, anyways, your strawman is a bit too hyperbolic -- no one was seriously arguing that 90% of anyone's day would be spent culling spam from their inbox, least of all me...)

        • oneshtein 8 hours ago |
          It's already reality. Russians started using something like that a few days ago. Messages from Russian bots are hard to distinguish from real people, while real people are shadowbanned by FB/YT. It's still relatively easy to spot them (a few dozen new characters appear and start to sing almost in unison: stop the war, retreat early, leave the occupied land to Russia, massive losses, etc.), but it's harder to catch them individually.
          • libertine 6 hours ago |
            What do you think would be a good way to make them visible?

            Like, for example, if a username had an automatically displayed country flag, at least it would stop those pretending to be from other nationalities.

  • antisthenes 10 hours ago |
    Can't wait for a similar paper on Othering in American politics.
    • metabagel 8 hours ago |
      Trump would be off the charts.
  • metabagel 7 hours ago |
    I’d like to point something out. When we look back at the racism of Americans against the Japanese during WW2, that racism was in the context of barbaric and horrific Japanese atrocities, particularly in China, but also throughout the areas they conquered.

    Japanese soldiers murdered civilians and raped women. They took “comfort women”. They murdered POWs. They didn’t observe the rules of warfare. They attacked the U.S. before declaring war (owing to a mistake in timing), in what was perceived as a “sneak attack”.

    The scale of the war crimes perpetrated by Japanese soldiers no doubt fed the desire to other them - to portray them in the most disgustingly racist ways.

    I think it’s important to remember that context.

    Unfortunately, the racism continued for a long time after the war.

    • metabagel 7 hours ago |
      The other aspect of this is that the Japanese soldiers were fearsome and won many victories at the start of the conflict. I think that part of othering in this context is to reduce the fear response to a very capable adversary.
    • 082349872349872 7 hours ago |
      I wasn't alive then, but I'd be surprised if that racism hadn't been in the context of pre-existing racism, from well before 1937.

      My distant recollection of 1920s-1930s "pulp era" science fiction and comic books is that, even by the (for today) relatively lenient standards of the 1960s-1970s, non-white characters tended toward obviously racist tropes.

      My father once told me about two classmates of his, who had promptly been christened "Chinee 1" and "Chinee 2" by the PE coach and remained thus thereafter ... despite the fact that their families had originally come from Korea.

      The DoD (as it is now) didn't end segregation in the military until after WWII, with the 1948 https://en.wikipedia.org/wiki/Executive_Order_9981 ; in so doing they were of course well in advance* of the remainder of their society.

      For context, a comic cover (of a sympathetically portrayed character!) from 1947, one year before: https://upload.wikimedia.org/wikipedia/en/9/9d/The_Spirit_10...

      * consider a judge in 1965: https://en.wikipedia.org/wiki/Racial_segregation_in_the_Unit...

      EDIT: damn, that wasn't just said in an interview, the judge wrote that down in his legal opinion: https://discoveryvirginia.org/opinion-judge-leon-m-bazile-ja...