Researchers developed an artificial intelligence model that accurately determines the sex of individuals from brain scans, with over 90% success.
This article describes the new study, which used AI to identify sex differences in the brain.
Key findings:
An AI model successfully distinguished between male and female brains based on scans, suggesting inherent sex-based brain variations.
The model focused on specific brain networks like the default mode, striatum, and limbic networks, potentially linked to cognitive functions and behaviors.
These findings could lead to personalized medicine approaches by considering sex differences in developing treatments for brain disorders.
Additional points:
The study may help settle a long-standing debate about the existence of reliable sex differences in the brain.
Previous research failed to find consistent brain indicators of sex.
Researchers emphasize that the study doesn't explain the cause of these differences.
The research team plans to make the AI model publicly available for further research on brain-behavior connections.
Overall, the study highlights the potential of AI in uncovering previously undetectable brain differences with potential implications for personalized medicine.
I would be curious what this would predict for trans (including those both on and off hormone therapy), intersex, or homosexual individuals. My guess is that, at a minimum, its accuracy in predicting either their gender or their sex would be very poor in those cases, although it would be absolutely fascinating if it accurately predicted their gender rather than their sex. The opposite result (predicting sex but not gender) would also be interesting, but less so.
I'd be very interested in those results too, though I'd want everyone to bear in mind the possibility that the brain could have many different "masculine" and "feminine" attributes that could be present in all sorts of mixtures when you range afield from whatever statistical clusterings there might be. I wouldn't want to see a situation where a transgender person is denied care because an AI "read" them as cisgender.
In another comment in this thread I mentioned how men and women have different average heights; that would be a good analogy. There are short men and tall women, so you shouldn't rely on just that.
I have a suspicion that this is exactly what’s going on here and may be why past studies found no differences. AI is much better at quickly synthesizing complex patterns into coherent categories than humans are.
Also, 90% is not that good all things considered. The brain is almost certainly a complex mix of features that defy black and white categorization.
Hopefully we will be wise enough to not require trans people to prove their trans-ness scientifically. People have a right to do what they wish with their bodies and express their gender in a way that feels right to them, and should not be required to match some artificial physical diagnosis of what it means to be trans. Even if it turns out that most trans people do share certain brain structures or patterns. There will always be exceptions and that doesn’t mean we get to label someone’s identity as inauthentic.
Someone else mentioned the iris test being more accurate but that it also includes the eye area around the iris, including eyelashes and eye shape. That would clearly bias the model.
I wonder if there's anything else that might be giving clues to the machine, or if it is limited to what they say it's determining sex based on. As a trans-nonbinary person myself, I'm very skeptical and anxious about technologies like this leading to biases and prejudices being emboldened.
There are short men and tall women, so you shouldn’t rely on just that.
I don't think that's a fair comparison. Height is a single value; if you trained an AI on height alone, it would be barely better than guessing. A brain offers many, many more parameters for an artificial neural network to take into account.
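To illustrate that point with entirely made-up numbers (nothing here comes from the study): a single overlapping feature caps accuracy around 70%, but combining many equally weak features can classify almost perfectly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# One "height-like" feature: class means 1 SD apart (invented numbers).
a = rng.normal(0.0, 1.0, n)            # group 0
b = rng.normal(1.0, 1.0, n)            # group 1
# The best single-feature rule is a threshold at the midpoint, 0.5.
single_acc = ((a < 0.5).mean() + (b >= 0.5).mean()) / 2

# Fifty equally weak, independent features: the summed signal grows
# linearly with d while the noise grows only as sqrt(d).
d = 50
A = rng.normal(0.0, 1.0, (n, d))
B = rng.normal(1.0, 1.0, (n, d))
multi_acc = ((A.sum(axis=1) < d * 0.5).mean()
             + (B.sum(axis=1) >= d * 0.5).mean()) / 2

print(f"one feature: {single_acc:.2f}, fifty features: {multi_acc:.2f}")
```

With these toy distributions the single feature lands near 0.69 while the fifty-feature sum is essentially perfect, which is one plausible reason a high-dimensional scan could separate groups that no single measurement does.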
The opposite result (predicting sex but not gender) would also be interesting but less so
I disagree. It could be wildly interesting if somebody born a male got a scan and it revealed a female brain. Dunno if "anti-trans" people would agree then that a sex-change is valid or if they'd disagree and start finding other excuses.
Assuming I'm understanding your point, that would be a mis-categorization. I'm assuming you meant a straight non-trans male was scanned and the result predicted a female brain (a result matching neither the sex nor the gender)? I was saying it would be less interesting if it scanned, say, a female-to-male trans person and returned a result of female (correctly guessing the sex but not the gender) than if it had returned a result of male (that is, correctly guessing the gender but not the sex). It would also be interesting if it could detect trans people in general as their own distinct group.
Interesting that it's only 90% accurate on brain scans, when a model looking at iris photos is 98.88% accurate even though an ophthalmologist can tell no difference between the scans from males and females.
There are several reasons why the ability to determine gender from iris images is an interesting and potentially useful problem (3). One possible use arises in searching an authorization database for a match: if the gender of a sample can be determined, it can be used to order the search and reduce the average search time. Another possible use arises in social settings, where it may be useful to screen entry to an area based on gender but without recording identity (3). Gender classification is also important for collecting demographic information, for marketing research, and for real-time electronic marketing. Yet another possible use is in high-security scenarios, where there may be value in knowing the gender of people who attempt entry but are not recognized as authorized persons.
So useful for marketing, and harassing my trans homies. Fuck that shit!
From a biological standpoint it is still quite fascinating
I mean, the paper you quoted keeps saying "gender"... is it actually affirming for trans folks? Big if true, but I obviously didn't read the source material.
While the iris study is interesting, looking at their dataset the pictures seem to include a bit of the area around the eye, including eyelashes, so after a cursory glance it seems odd that they even titled it as iris. However, I didn't read the full thing, so it could be that they cropped it somewhere. Although they are using large convolutions, so a lot of detail is lost.
This shit is dumb as fuck. You start off with a 50/50 chance of guessing right to begin with. Then just ask: do you like sports? There. You get to 98% accuracy without any fancy chips and circuit boards.
Yeah, except that it doesn't need to ask questions; it can tell straight from the picture. An eye doctor can tell no difference between the irises of men and women.
I don't doubt that there are inherent differences between the brains of most men and women, but "we can measure these differences" and "these differences are inherent" are two different claims. I don't really get what the article is trying to get at by first claiming the latter and then walking back to the former.
btw can someone post the full PDF? I can't access it via sci-hub yet
Edit: Also a tangential nitpick, but looking at their code I can tell that they're psychiatrists/neuroscientists first and programmers second lol
"CNN Block 1" comment used twice?
They skip layer 5? (Why even keep it in there??)
A linear layer with 2 outputs??? And then they do "_, predicted = torch.max(outputs.data, 1)" in the training script???? JUST USE 1 OUTPUT WITH A SIGMOID I'M BEGGING YOU
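For anyone who hasn't stared at this pattern before, here's a minimal sketch of the two formulations (dimensions invented; this is not the paper's actual code):

```python
import torch
import torch.nn as nn

features = torch.randn(4, 128)  # batch of 4, feature dim 128 (made up)

# The pattern being criticized: two logits for a binary label,
# argmax'd at prediction time (trained with CrossEntropyLoss).
two_logit_head = nn.Linear(128, 2)
outputs = two_logit_head(features)
_, predicted = torch.max(outputs, 1)          # tensor of 0s and 1s

# The equivalent single-logit formulation: one output, threshold the
# logit at 0 (i.e. sigmoid(logit) > 0.5), trained with BCEWithLogitsLoss.
one_logit_head = nn.Linear(128, 1)
logit = one_logit_head(features).squeeze(1)
predicted_binary = (logit > 0).long()         # same shape, same semantics
```

Both are mathematically near-equivalent for two classes; the single-logit version just drops the redundant parameters and makes the binary nature of the prediction explicit.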
For the record, I've earned some serious cash essentially chasing around data scientists and whipping their code into production readiness and deployability. So, carry on I guess. I've literally seen code like this that a company relies on, that runs on one dude's laptop (but he's a PhD and the brainz of the product! Lol)
From a technical perspective, yeah, but colloquially AI and ML have been used interchangeably for years. An issue only made worse by AI now frequently being used to describe GenAI, which, while technically ML, behaves in a way that makes it somewhat misleading to use AI/ML/GenAI interchangeably.
They can’t rule out the potential explanation that being raised male changes your brain differently than being raised female, not without subjects who were raised differently from their birth sex. You would have to control for that variable in order to rule it out.
You can't point to a difference and say it's (directly) caused by chromosomes or the SRY gene or hormones, whatever. Brain differences increase with age, suggesting that they may be more to do with socialisation than genetics. Does this evidence prove that women should be treated differently or is it evidence that women are, in fact, treated differently?
Found myself a copy of the paper for a read-through and it's immediately obvious to me why they couldn't get above 90% accuracy.
The word "Gender" occurs exactly zero times in the text and the datasets they worked with were divided into a strict sex binary. As a result, the accuracy of their models' predictions could not significantly improve upon prior work in the field.
The only new info here is that their XAN is able to point out the specific brain features that influenced its predictions. Potentially useful with regards to the development of treatments for gendered brain issues in neurotypical people, but anyone who falls outside of the 90th percentile of sexually dimorphic normativity won't see any benefit here.
It's not unusual for studies like these to exclude people taking medication or with any kind of medical condition, such as gender dysphoria, autism, etc. It's to control as many variables as possible.
(I've personally been excluded from fMRI studies for being autistic and left-handed.)
I think so. With a more diverse dataset and fewer binary assumptions baked into the analysis, I think we'd start seeing the bimodal contours of a spectrum between the masculine and feminine peaks. The graphics included in the study seem to hint at this, showing nodes of similarity with a tapering tail toward the middle of the distribution for all three sets of data they analyzed.
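As a toy picture of what that could look like (distributions entirely invented, plotted along one made-up "brain score" axis): two peaks whose tails overlap, so the valley between them never empties out.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D "brain score": two overlapping peaks 3 SD apart,
# not two disjoint categories. All numbers are invented for illustration.
masc = rng.normal(-1.5, 1.0, 5000)
fem = rng.normal(1.5, 1.0, 5000)
scores = np.concatenate([masc, fem])

hist, edges = np.histogram(scores, bins=40, density=True)
valley = hist[len(hist) // 2]   # density near the midpoint between peaks
peak = hist.max()
print(f"peak density: {peak:.3f}, valley density: {valley:.3f}")
```

A strict binary would leave the valley density near zero; a bimodal spectrum leaves it clearly nonzero, with a continuous tapering tail between the peaks, which is roughly the shape being described.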
Yes, but I've heard theories and read studies in the past suggesting that differences in sexuality change over time, too. For example, studies have documented women going back and forth between being gay and straight, while men might come out as gay later in life but rarely change back. Supposedly some mental rewiring goes on alongside this, though it has only been observed qualitatively, not measured quantifiably.
I think this AI processing could be a useful tool in further analysis against this and other hypotheses, but I worry that given the emotionally charged discussions around transgender nature the results will be far too easily misconstrued.
Height is pretty consistent. You grow until adolescence, then maybe you shrink a bit later in life. Men are generally taller than women, but only on average. That doesn't really have anything to do with neurology.
The study may help settle a long-standing debate about the existence of reliable sex differences in the brain.
How many studies along these lines must appear before this "debate" is settled? It is practically a truism! The idea that we are a tabula rasa without sexual dimorphism is as absurd as strict biological determinism.