Note For Anyone Writing About Me

Guide to Writing About Me

I am an Autistic person, not a person with autism. I am also not Aspergers. The diagnosis isn't even in the DSM anymore, and yes, I agree with the consolidation of all autistic spectrum stuff under one umbrella. I have other issues with the DSM.

I don't like Autism Speaks. I'm Disabled, not differently abled, and I am an Autistic activist. Self-advocate is true, but incomplete.

Citing My Posts

MLA: Hillary, Alyssa. "Post Title." Yes, That Too. Day Month Year of post. Web. Day Month Year of retrieval.

APA: Hillary, A. (Year, Month Day of post). Post Title [Web log post]. Retrieved from

Wednesday, June 21, 2017

Alyssa Reads: Critical Studies of the Sexed Brain -- Communication thoughts

I continue my thoughts from reading Critical Studies of the Sexed Brain. Because I had more and then forgot to put them up here. Go me.  Here's the citation again if you want it:

Kraus, C. (2012). Critical studies of the sexed brain: A critique of what and for whom? Neuroethics, 5(3), 247-259. doi:10.1007/s12152-011-9107-7

And now the quote that got me thinking:

“Critical neuroscientists frame the question of a science gap between neuro- and social scientists, experts and the public, just as couple's guides conceive of the gender gap in terms of unawareness, misunderstanding, or ignorance, promoting the idea that all matters can be settled through enhanced communication and better knowledge of each other's distinctive language, culture, needs or concerns.”

This needs more attention paid to it. Here is a big issue: there is a power imbalance. Patriarchy is a word for the imbalance in the couple's guide, and it would relate to the sciences one too since hard sciences tend to be thought of as men's fields while social sciences are thought of more as women's fields. (Accuracy of this thinking is another issue, but STEM in general runs man-heavy.)

That contributes to the rhetorical positioning of the fields, where neuroscientific “facts” can't be questioned by social sciences, even if questioning the facts isn't exactly what's going on. Sometimes it's questioning the causes and interpretation of the reported result rather than questioning whether or not the result was correct or reproducible. Though the fMRI study of a dead fish is relevant, and so is the fMRI scanning of the same person daily for about a year – fMRI is not infallible, any more than any other scientific procedure is, and pretending it is will get us into trouble.

The author then asks about “lay expertise” from patients, relatives, and activists. Since I'm studying neuroscience but came from the Neurodiversity Movement before I got into neuroscience, I wonder where that puts me. As a neuroscience student, I'm one of the science people. As an Autistic person, I'm somewhat a patient. (Not much of one, haven't been in therapy related to autistic traits for a while, but when I write as an Autistic person, I go in that category.) And there is definitely a power difference between the roles. There has to be, for Theory of Mind to have been interpreted to mean autistic people can't understand our own experiences. Not everyone making use of the word thinks that, but it's an interpretation I've seen way too much of.

The author then points to this framework as “preventative politics,” where it keeps the peace by avoiding/assuaging conflict in the name of interdisciplinarity. She argues this could prevent good science that would come from controversy. I'd agree, but also say that it can involve silencing of ideas that aren't status quo as part of the peacekeeping.

Another issue with the focus on communication is that it only works if everyone is acting in good faith. It's the same problem with Nonviolent Communication and similar: if everyone is acting in good faith, it works fine. If anyone involved is actually seeking to maintain control or to do harm, consciously or not, it's not going to work. If one person's goals actively exclude the other person's goals, better communication can lead to figuring this out, but not to solving the problem. Seeking to expand the domain of one's own field without worrying too much about the domain of anyone else's field could lead to a similar failure in interdisciplinary communication ideas.

Tuesday, May 30, 2017

Let's talk about fidget spinners and patterns.

Fidget spinners are a fad. Thinkpieces about fidget spinners, therefore, are also a fad. That's how it works, right? On one side, there are people arguing that these are toys (true), that they are a fad (true), that they can distract some people (true), that there is no research showing improved focus from their use (true), and that they are not an accessibility issue (false). On another side, there are people arguing that they are a focus tool for some autistic people and/or people with AD(H)D (true), that the lack of evidence is due to a lack of research and not a statement of inefficacy to use against individuals who find them useful (true), that this can be an accessibility issue (true), and that their fad nature among neurotypical students is bad (false) because it is getting the toys banned (mixed truth value). I've also seen more nuanced views, generally from disabled people, but those seem to be the two main camps.

I want to point out a pattern in how accessibility discussions go, especially in educational contexts.
  1. A disabled person needs something for access reasons.
  2. Abled people call the thing distracting, because our existence in public is apparently distracting.
  3. The thing is either banned entirely or permitted only for people with the paperwork to prove they need it for disability reasons.
  4. Disabled people who need the thing either don't have access to the thing or must out themselves as disabled in order to gain access. If outing oneself is required, the thing is heavily stigmatized.
  5. Disabled people who have an actual access conflict with the thing are erased entirely, which makes conversations about possible solutions to the access conflict impossible. One set of needs or the other will "win." Any disabled people who need to avoid the thing are lumped in with the people who want to ban the thing for ableist reasons and therefore vilified. Which set of needs "wins" here varies, but it usually has some relationship to hierarchy of disability stuff and having one set "win" while the other "loses" is a bad solution regardless.
That's not just a fidget spinner thing, but it does apply here. With fidget spinners, autistic people and folks with ADHD (I'd love to know of a reasonably recognized way of talking about this neurotype without the second D/in a neurodiversity paradigm way, btw) end up in both the "need the thing" and the "need to avoid the thing" groups. I assume some other neurotypes are similarly split as well - I just don't have the familiarity to assert so. With visual alerts on fire alarms, D/deaf people need the thing. Since the visual is a strobe, a lot of neurodivergent people, especially people with photosensitive epilepsy, need to avoid the thing. With service animals, the folks who use them need the thing. People with allergies need to avoid the thing, and not everyone with an allergy can safely share a space with a service animal, even if they are treating their allergies. Conflicting access needs exist, and this pattern prevents us from finding ways to deal with the conflicts. Instead, one access need gets lumped in with abled people who don't like the thing because it's associated with disability and therefore presumed not to be a real need.

Now for fidgets: some people need something to do with their hands while listening if they're going to retain anything. I am in this group, by the way. In high school, I knit, I sewed, and I made chainmail - armor, not spam. I've also tried drawing, which takes care of the "need to do something in order to sit" issue but takes enough attention that I'm no longer following the conversation, so that doesn't work for me in class. Writing hurts quickly enough that while taking notes has sometimes been possible at university, there was no way it was going to be the answer for the duration of a school day in middle or high school. (I, specifically, should not have a laptop in class. If I'm going to need notes it's the least bad option, but least bad does not mean good.) So I did assorted arts and crafts that were fairly repetitive and totally unrelated to class. The biology teacher who told us on day one that he had ADHD was both the most understanding teacher about my need to fidget somehow and the teacher most at risk of being distracted by my making armor in class.

That last paragraph is the "no, really, I need to fidget." It's also the "there are several fidget options that work for me." Most, but not all, of the standard fidget toys will meet my needs, as I discovered because they are also a fad and I got some awesome fidget toys. This is important, when access conflicts come into play - if there are several options that meet the access need of the first disabled person, it's easier to find one option that everyone is OK with. When there are several options that work, requesting "not option A in situation W" is not an access issue, because options B through H are still fine. If we're going to come up with reasons that each of B through H are also not fine, individually, then we're going to have a problem.

The fidget toy fad is making options D through H cheaper and cooler. When fidgets are marketed as assistive technology, they are super expensive. Considering that disabled people tend not to have a lot of money, that's an access issue, so the fad is making a set of possible solutions more accessible. That's cool. It's also leading to a sufficient presence for teachers to make explicit policies about the toys (as opposed to banning them person by person), and for a flat ban to seem like a good idea to teachers who are seeing kids appear distracted by them. (My bet is that the neurotypical students who appear distracted actually are. I expect the autistic and ADHD students who appear distracted are a mix of actually distracted because they are just as distractable as any other student and only appearing to be distracted because of ableist ideas about what paying attention looks like. Remember, I'd fail special needs kindergarten as a twenty-four year old PhD student.) The explicit banning for everyone is ... not so good. Mostly because the other options are usually also disallowed or heavily stigmatized, and then we may well be left with no good options.

And let's not pretend handing everyone a fidget spinner, or any other fidget, is going to magically "solve ADHD" or whatever. I think some of the camp that's firmly against the toys is reaching that position for similar reasons to haters of weighted vests - we hand it over and the person is still autistic, or still ADHD. A tool that a person uses to cope in a less than accessible environment doesn't make them stop being disabled by the environment. Plus a fidget spinner isn't going to help everyone. Some people really will be distracted if they have something to play with, and some of those people really will be neurodivergent. Conflicting access needs, again, are a thing. If one person needs a fidget, and another needs not to be next to someone with an obvious fidget, those two people probably shouldn't sit next to each other. Giving people fidgets that they can use while the toy remains in their pocket is also a possibility in some cases. We can have conversations about access conflicts, if we admit that both sets of needs exist. (We also need to admit that some subset of the people making arguments about distraction are doing the bad faith argument where everything disabled people need is a distraction because, essentially, our presence in public is a distraction.)

[Let's also insert a plug for my Patreon. I write. I have a Patreon.]

Saturday, May 20, 2017

"Your taste buds will change"

CN for food and vomit.

That's one of those sentences I read every so often, which is technically true, but which doesn't actually lead to the conclusions I see it used to support. Taste buds really do change with age! This is a thing that happens, and it's part of why there are certain foods kids tend not to like but which adults are more able to tolerate. (I think most alcoholic drinks go in this category, where kids tend not to like the taste anyways?)

As true as it is that tastes change, there are some things my brain has decided I need to explain now about why this doesn't mean getting into a power play with someone over what they eat and how "picky" they are is a good idea.

  1. You probably don't know what the result of "pushing the issue" is going to be. I don't just mean long-term results. I mean short term, in the minutes to hours right after forcing the (in)edible object down. Obviously, you don't expect it to be a big deal, or else you wouldn't be trying to force a "picky" eater to eat something they can't eat. How wrong are you ready to be? TMI alert: last time I made myself drink something that was an issue, it came back up. (If it hadn't been something I was medically supposed to have, I wouldn't have tried. It still didn't work, because it didn't stay down.)
  2. The fact that someone's tastes may change and they may be able to eat a food later doesn't mean they can tolerate it now. The change hasn't happened yet. So even if you're correct about the nature of the upcoming change, you're still trying to make someone eat something they don't currently tolerate. See point 1.
    1. Also, even if you were going to be correct, you can cause that not to happen by creating an association between being forced to eat the food and whatever sensory issue it's hitting. That can create a new issue with the food in question, besides taste...
  3.  The issue may not be the taste. I can't drink anything carbonated. You might think that's a rather broad category for a taste issue. You'd be correct. It's not a taste issue. It's best described as a texture issue, and you've said nothing about texture sensitivities changing. In fact, most of the foods I can't deal with are texture issues, not taste ones.
  4. The changes in taste may not be the ones you expected or hoped for. Some foods that were issues before can become non-issues, but it can go the other way too. As a very small human, I could eat mushrooms. As an adult human, I cannot eat mushrooms. (It's also the texture, not the taste.) Chocolate pudding was a "safe" food for me as a kid. It's about 50-50 on my being able to eat it now. (Texture again. Also, partially related to times when I didn't get the choice about yogurt, which has never been an OK texture and which is close enough to pudding that making yogurt even worse made pudding a problem. See point 2.1.) I ... actually can't think of any foods I can have now that I couldn't deal with as a kid.
Tastes do change as we get older. That doesn't mean they'll change the way you want them to, or that a possible change that hasn't happened yet justifies acting as if it's already happened. 

Thursday, May 18, 2017

Alyssa Reads Critical Studies of the Sexed Brain

This is another one I read for neuroethics. I was considering using this article for my presentation on a neuroethics-related topic, but that didn't happen because someone else split off my too-large group and it wasn't too big anymore. We actually wound up talking about a medication used to treat addiction ... that can itself be addictive. Fun times. So, here's some of my thoughts from reading Critical studies of the sexed brain.

“They suggest that we work and talk across disciplines as if neuroscientists were from Mars and social scientists were from Venus, assigning the latter to the traditional feminine role of assuaging conflict” (247). Sigh. I am not surprised that some scientists think of social sciences that way.

Brain plasticity + identity formation in intersex people, brains vs. genitals. That's going to be interesting. By which I mean, I have concerns. I have friends who are intersex. I know people who do intersex activism. And I know intersex people who concluded that intersex and/or nonbinary is their gender identity rather than picking one of the two binary genders. Hope the author isn't assuming a gender identity must be one of man/woman. Heck, mine isn't that and as far as I know, I'm not intersex.

Oi at calling autism a disease. It is a neurodevelopmental disability [or a neurotype, that's a good word and also let's remember what I'm saying when I say disability - the social model of disability is a thing.] Also I know the author found neurodiversity stuff because the article comes up when I search the journal for neurodiversity, what the heck? I don't expect to hear it called a neurotype in anything done by neurotypical(-passing) academics but really? Disease?

Ok, gender in the brain as a result of plasticity, that's going to be interesting – “reflect gendered behavior as learned and incorporated in a social context” is a thing, but please, please don't let this turn into “male socialization” for trans women or “female socialization” for trans men, or either of the above for nonbinary folks. The socialization of “consistently mistaken for X while actually Y” is not the same as the socialization of “X.” Ok, individual differences are a thing. That's good. “Plasticity arguments are extremely interesting as they wage war against both biological and social determinism, reductionism, essentialism, and other -isms.” Phew that's not the socialization argument I was worried about, I don't think.

Does she mean “cishet” by “normal people”? (Cishet=cisgender, heterosexual.) I appreciate the quotation marks around “normal people” but there probably is another word for what she means and using it would be nice.

Now we have one of my rage buttons. All caps time!

Intersex activist history! I knew about unwanted surgery, gender role training, and folks wanting their own intersex bodies back. I also know someone who was put on unwanted hormones. What are the results of Diamond getting so lauded while speaking in terms of brain sex, though? It's still the language coming from the people who try to enforce the man/woman dichotomy. What are the results of using the "sexed brain" discourse while not necessarily fitting in the binary? 

1 Walker, N. (September 27, 2014). Neurodiversity: Some basic terms and definitions. Neurocosmopolitanism: Nick Walker's notes on neurodiversity, autism, and cognitive liberty. [blog post] Retrieved from is a good explanation of the neurodiversity related vocabulary I tend to use when thinking about neuro stuff.

Thursday, May 11, 2017

Alyssa Reads Memory Blunting: Ethical Analysis- suffering and authenticity

I read "Memory Blunting: Ethical Analysis" by the President's Council on Bioethics, excerpted from Beyond Therapy: Biotechnology and the Pursuit of Happiness (2003) and appearing in Neuroethics: An Introduction with Readings, edited by Martha J. Farah. I did so because I am taking a neuroethics class and we're supposed to show that we're thinking about neuroethics stuff at least a bit outside class. Also because I'm super-interested in how neuro-stuff (especially neurodivergence but really all things neuro-) is represented in fiction (especially young adult speculative fiction). I'm pretty much chucking my notes (drawn parallels, expressions of annoyance, and the occasional "OK that's legitimate") on my blog because as important as a lab notebook is, I like notes that are typed and searchable. I started with some connections to Allegiant, then some thoughts on collective effects of blunting trauma, and then cognitive liberty. Now here's suffering and authenticity.

What happens to the concerns about what we might do to others' minds if it were an issue of what person X does/chooses for person X, not what we are choosing for others? The concern seems to be about changing someone's true self, so suffering and authenticity come in again, just like cognitive liberty. These two seem frequently connected to me. If we recognize that people get to define their own "true selves", we don't get to moralize over which experiences are real and true anymore, which kind of kills the "not their true self" argument. Which is an argument I'm really not a fan of, especially considering which experiences it tends to be applied to.

This quote ... gives me the noble suffering/virtuous suffering sort of feeling, where whatever positive you might (not will, might) drag from the hell you go through means you shouldn't try to avoid that hell or save others from going through it.
Or will he succeed, over time, in 'redeeming' those painful memories by actively integrating them into the narrative of his life? By 'rewriting' memories pharmacologically, we might succeed in easing real suffering at the risk of falsifying our perceptions of the world and undermining our true identity. (90)
The version of a person that went through more bad things isn't automatically more real. The version of a person that's suicidal from trauma isn't automatically more real than the version of a person that takes medication to not be suicidal. Our choices define us, not just what we've been through, and using chemicals to get the parts of our histories we never chose to back the heck off? That's not less real. Suffering isn't the only way to be real. Enough of the noble suffering narrative. Enough.

Now to bring back a quote that I also talked about with cognitive autonomy:
And yet, there may be a great cost to acting compassionately for those who suffer bad memories, if we do so by compromising the truthfulness of how they remember. We risk having them live falsely in order to cope, surviving by whatever means possible. (92)
  (Survival is resistance, etc.)

And the concerns about what happens if we take out everything difficult? Those take a huge slippery slope argument, and not the kind where we've seen from experience that most people stop early or don't stop at all (destructive obedience is one of those). Trauma is not the same thing as everything difficult in a person's life. Having to spend a lot of time and effort on reading and writing in order to become a good writer is not the same as witnessing a murder or being mugged or being a victim of abuse. One of these things is a choice: we're not under any obligation to become good writers. The others aren't choices. They're things that happen to us. How we deal with the results is at least partially a choice. (Not entirely. Especially when, due to technological or social constraints, dulling the pain while working through it isn't an option.) There is plenty of opportunity for hard work and achievement without forcing others to keep horrors in their heads for the sake of ill-defined authenticity.

Tuesday, May 9, 2017

Alyssa Reads Memory Blunting: Ethical Analysis- cognitive liberty

I read "Memory Blunting: Ethical Analysis" by the President's Council on Bioethics, excerpted from Beyond Therapy: Biotechnology and the Pursuit of Happiness (2003) and appearing in Neuroethics: An Introduction with Readings, edited by Martha J. Farah. I did so because I am taking a neuroethics class and we're supposed to show that we're thinking about neuroethics stuff at least a bit outside class. Also because I'm super-interested in how neuro-stuff (especially neurodivergence but really all things neuro-) is represented in fiction (especially young adult speculative fiction). I'm pretty much chucking my notes (drawn parallels, expressions of annoyance, and the occasional "OK that's legitimate") on my blog because as important as a lab notebook is, I like notes that are typed and searchable. I started with some connections to Allegiant, then some thoughts on collective effects of blunting trauma. Now here's cognitive liberty.

What happens to the concerns about what we might do to others' minds if it were an issue of what person X does/chooses for person X, not what we are choosing for others? Cognitive liberty. We don't seem to have a coherent definition of the self, and autonomy is complicated, but there is definitely a thing where a person either is or is not making the decisions about interventions taken (or not taken) on their own minds. Also on how folks define their own "true selves." What about who you are is important to you? Not what's important to me about who you are. Of course, that would stop us from moralizing over what experiences other people have are real and true vs. somehow fake. Changing one's own cognition by one's own choice isn't as acceptable as I think it should be.
And yet, there may be a great cost to acting compassionately for those who suffer bad memories, if we do so by compromising the truthfulness of how they remember. We risk having them live falsely in order to cope, surviving by whatever means possible. (92)
Again, we do to them. Not, we offer them the option. Do we think we know better than them what's right for them? That way lies all sorts of abuse "for their own good." And ... do we really think everyone would choose to dull the pain of a memory or to forget it (remember also that those two things are not the same)? Because I don't think that. I think lots of people would, but not everyone. Despite (because of?) my arguments about cognitive autonomy leaning towards letting people choose to blunt the trauma, I want the right to remember in my relatively unchanged way. It's just that the arguments run towards why everyone needs to be doing it that way, and I don't believe everyone needs to be remembering that way. I think enough people would choose to remember that we'd get whatever collective benefits the memory would provide, even if we let people choose to dull their pain. Not that I think the supposed benefits are nearly as strong as seems to be argued. Intentional ignorance is already a thing.

Thursday, May 4, 2017

Alyssa Reads Memory Blunting: Ethical Analysis- collective effects

I read "Memory Blunting: Ethical Analysis" by the President's Council on Bioethics, excerpted from Beyond Therapy: Biotechnology and the Pursuit of Happiness (2003) and appearing in Neuroethics: An Introduction with Readings, edited by Martha J. Farah. I did so because I am taking a neuroethics class and we're supposed to show that we're thinking about neuroethics stuff at least a bit outside class. Also because I'm super-interested in how neuro-stuff (especially neurodivergence but really all things neuro-) is represented in fiction (especially young adult speculative fiction). I'm pretty much chucking my notes (drawn parallels, expressions of annoyance, and the occasional "OK that's legitimate") on my blog because as important as a lab notebook is, I like notes that are typed and searchable. I started with some connections to Allegiant. Now here are thoughts about the collective effects of forgetting, as worried about by the authors (and as I tend to think we deal with even without dulling memories pharmacologically).

I have a concern about this supposed legal argument against using beta blockers or similar medications to reduce the emotional impact or trauma from publicly important events. (The given example was a terrorist attack. I can ... kind of tell this was written not too long after 9/11.)  The idea is that it's important to have some witnesses remember the event accurately. There's a problem: I remember from my introductory neurobiology class that when a memory is super emotional, we feel quite certain of our recollection ... but that we can still be completely wrong in our memory of what happened. Ask people where they were on 9/11, or when the space shuttle exploded, and some will tell you they were listening to or watching other events that didn't happen on those days. Sometimes didn't even happen that time of year. But we are confidently wrong! So as useful as accurate recollection would be for legal purposes, maintaining the traumatic impact on the witnesses doesn't make accurate recall happen anyways. Also, eyewitness testimony is notoriously unreliable to begin with. This is a bad argument because the thing we're claiming to want to preserve already doesn't exist.

On that note, I wish the authors had said something more about the social and personal effects of blunting our collective traumas. I'm not entirely convinced that leg of the argument is going to hold either. After all, I'm a Jewish (and Queer, and Disabled) descendant of Holocaust survivors, and I know how we're never supposed to forget. I'd be a lot more inclined to buy into the value of collectively remembering and the consequences of forgetting if we'd stopped having genocide or deciding that certain religions are inherently more dangerous or lesser. But we didn't. These things all still happen. The things we're claiming to want to prevent already happen with our supposed preventative in place, and that means I don't trust the argument.

The murder witness example actually does concern me. "Yes, I was there. But it wasn't so terrible." (91). We don't want murder to be thought of as not so terrible. I know we don't want that because sometimes it is already considered not so terrible. See also: "mercy" killings of disabled people by the folks who are supposed to take care of them. It already just depends on the choice of victim, and that's terrifying. I don't want the idea of murder as not so terrible spreading any further than it has. I want it gone. I want all the murders being recognized as being as bad as they are.

I also have issues with the juxtaposition (and sometimes what seems like conflation) of giving a victim relief and medicating away (or relieving, I suppose I should use the same language for each) the guilt of perpetrators. Those are not morally equivalent. Victims and attackers or abusers are not the same. When we're talking about a mutual conflict, as in the case of war (the most talked about cause of PTSD, but far from the only one), there may not be a clear aggressor or victim. There also may be. It depends on what's going on, really (and remember how often the military is painted as the only way out for people in poverty, at the same time we remember the atrocities soldiers often commit.) Still, when we're talking about accidents and survivors of terrorist attacks, there's clear innocents. (Not "perfect victims" in the sense that they never did anything else even slightly wrong, but innocent in the sense that they didn't choose what happened to cause the trauma.)

Monday, May 1, 2017

Jobs for autistic strengths and "autistic strengths"

Full disclosure: Real Social Skills got me thinking about this with some tweets (first tweet, second tweet, third tweet), and then a blog post, both of which I think you should read. That said, I think my thoughts are parallel rather than identical and it's still worth my writing my bit.

To me, what she's saying reads as a few main points:
  • Some models of autistic strengths assume that attention to/liking of detail is one of the strengths.
  • They then assume this means we will enjoy repetitive, detail-oriented jobs most people find mundane.
  • That's still putting us into different sorts of jobs than everyone else (segregation!) but calling it strengths based and assuming we're all the same.

Since this is May 1 (Blogging Against Disablism Day), I've got some "spot the (dis)abl(e)ism" thoughts. Let's break those down. Here's what I'm reasonably certain isn't ableism:
  • Thinking it's a good idea to play to an autistic person's strengths does not read like ableism to me.
  • Recognizing that some strengths may be statistically common in autistic people does not read like ableism to me.
  •  Understanding that the jobs we find interesting or want to do may be different from what "most people" find interesting or want to do does not read like ableism to me.
Helping an autistic person find a job that's a good fit for them based on their (autistic, since they are autistic and autism is pervasive,) strengths would also not read like ableism to me. It would be helping someone find a job for their autistic strengths. Unfortunately, the way programs around finding jobs for "autistic strengths" often run ... does have ableism involved.
  • Assuming that "autistic strengths" means exactly a certain set of (perhaps statistically common) strengths is treating us as a monolith, and therefore ableism. Not all autistic people are detail-oriented, for example. (I appear to be a lot more detail-oriented than I really am thanks to pattern-recognition.)
  • Assuming that a given strength will correspond to a given interest is stereotyping based on interests. If you're only doing this in the presence of an assumed disability, it's ableism. If not ... it's still inaccurate stereotyping but it might not be ableism?
  • Celebrating how we can therefore do these jobs other people find boring and pushing us into those jobs is effectively workplace segregation, definitely stereotyping based on autism, and therefore ableism.
And this is what a lot of autism employment programs seem to be doing. It's not what we need. My jobs? Based on my actual strengths, some of which are a bit stereotypical and some of which are decidedly not. Math? Yeah, I'm good at that and I like it. People tend not to be surprised by that one. Grading? I guess that involves attention to detail, or pattern recognition that makes breaks in expected patterns stand out. Teaching? Seems a bit social, yes? Well, explaining things to people in ways they can understand is absolutely part of my skill set. As a student, I often explain math-heavy neuroscience papers to my non-math classmates in the neuroscience program. As a teacher, it means finding the way to explain a given concept that actually makes sense to my students. I don't think any autism employment program is going to suggest that a person who can't always talk become a teacher, but that's what I do. Editing? I guess it's attention to detail, but it's also language. None of my work has been in areas typically considered "boring," and a lot of the work people consider "boring"? Really wouldn't be a good fit for me. Assuming it must work for me because I'm autistic isn't going to work. I'm an Autistic person, not a machine made of autism stereotypes. 

Sunday, April 30, 2017

Thank blob April is almost over

Today is the last day of April. For family reasons, I wasn't online all that much in the last week and a half of the month (this was good and I should probably arrange to spend as much of April as possible too busy to be online in the future, except for the part where if I'm that busy I am also working towards a burnout and I need to fall the heck over now.)

Even so, autism nonsense and attention paid to it tended to be at a high. Sometimes this is useful. Like when one of our own needs a social media crisis thrown against a discrimination issue: Niko won a competition for a trip but doesn't get to go because of his disability.

Sometimes it's frustrating: do I really need to answer for the n+1st time that I am an autistic person, not a person with autism? There exists a cat named autism because autistic humor is a thing, but I do not live with this cat. Do I need to explain for the n+1st time that no, I don't think organization XYZ can be reformed in a way that would make it helpful for autistic people? (Organization XYZ is usually, but not always, Autism Speaks.)

Sometimes it's scary: I don't really need to be reminded just how far many parents are willing to go in order to "get their child back" from this scary autism thing, or what they do when nothing "works." I don't need all the reminders of why I'm scared for (not of, for,) autistic kids today. (A lot of other people do seem to need the reminder, but they don't seem to be the ones getting it, or understanding why this is scary.)

And it drops back to somewhat normal levels tomorrow.

Saturday, April 29, 2017

Are you still afraid of anything?

I got asked that yesterday.

Thought the first: You're joking, right? Anxiety is a big problem for me. What's wrong? I dunno, but something must be. (Or sometimes I do know, the thing I'm worrying about is unlikely to impossible, and my brain is just being a troll. Or sometimes I do know, the thing I'm worrying about has actually happened to me before, and I therefore can't tell my brain it's just being a troll.) ... yeah we just asked someone who has anxiety if they're scared of anything. The answer to that question is yes. This does not seem complicated?

Thought the next: The person asking me this has seen me dealing with a thing I'm afraid of ... pretty often, actually. I'm scared of heights. Like, really scared of heights. Scared enough that I can (and often do) have bad moments with the fear of heights just walking down stairs. That might be related to my having fallen down the stairs when I was younger. Here's some things they've seen me do:
  • Go on tall roller coasters, including Batman and Superman (Bizarro?) at the nearest Six Flags.
  • Zip line between mountains.
  • Rappel down a 150 foot waterfall.
  • Climb "rock" climbing walls to nearly the top (but also get stuck 3 feet up a bunch of times.)
  • Ski.
  • Descend stairs. Remember, that can and does set off my fear of heights.
Thought the next: I'm afraid of driving. That's part of why I didn't get a license until I was 23. It's also a fairly rational fear, for several reasons. First, I've got sensory processing issues that make driving overloading. If I'm starting off in good shape and with a lot of energy, I can drive safely, but I hate it. Second, I tend to lose speech when I drive. Getting pulled over while non-speaking, even as a white person, sounds like a seriously bad time. Thanks, but no thanks. And yes, this person knew that my not driving was related to a fear of driving.

Thought the last: I am (sometimes/somewhat unwisely) Gryffindor. Looking at my behavior in order to tell if I'm scared or not tends not to work very well, because my inclination when I'm scared is to do the thing anyways. Scared of heights? Yes, let's go on the roller coaster. That sounds great. Looking at how I act in order to tell what scares me works even less well because I, like many (most? probably most) autistic people, have been taught not to show or act on fear or discomfort because it's "weird" or "faking for attention." (Spoiler alert: It may well be weird, but I am definitely not faking. Stuff that doesn't bother other people is often painful for me, and vice versa.) And ... this person is one of the people who's denied that people's perceptions could possibly work the way mine do. So their not being able to tell when I'm scared? Not just because I'm (sometimes/somewhat unwisely) Gryffindor.

Friday, March 31, 2017

Alyssa reads: Ethical Analysis of Neuroimaging in Alzheimer's Disease

Anyone else bothered by the consistent framing where we demonstrate the significance of disability related research by citing a significant/increasing “public health burden” and the money spent on care? Anyone? (Fellow citizens, that is your money too.)

Now we're going to focus on ethical issues around imaging/detection. (Which, I note, remain ethical issues surrounding imaging/detection whether or not you talk about public health burden and money!!!)

The “Roles for current imaging capabilities” section seems to take it as a given that identifying risk factors (for this thing we really can't treat that well) in order to predict who's going to get Alzheimer’s before they get it is important. I would have expected that to be one of the ethical issues to discuss: do we identify folks who are going to develop Alzheimer’s even though there's not really a way to change this? (And that's at the 100% certainty level, which, to be clear, is not current reality. We can't predict who will/won't experience this. We can't predict what cognitive changes a person will (or won't) experience as they age with anywhere near that level of certainty.)

(Yes, I think with something that would fall under the neurodiversity paradigm. Also cognitive liberty or freedom – people being in charge of their own minds while also valuing diversity on a societal level! I'm still inclined to treat neurological things that will eventually kill you as things I would like us to know how to change or prevent, because death. And Alzheimer’s will eventually kill you. Cognitive freedom also goes with “people can choose what to do with their own minds” and “not dying of dementia” is a common preference, let us science so people can make that choice.)

Ah, yes, good, stigma is getting addressed.
  • Predictive imaging may expand the pool of disease to people who are much younger, and therefore expand the pool that is stigmatized.
  • Both earlier prediction and stigma have the potential to reduce quality of life, including autonomy and the privilege to drive, and other daily functions.
  • There may be medical discrimination against people at risk, for example, with respect to eligibility for organ transplantation. (4)

My preference is for not stigma at all. Expanding the stigmatized pool is not doing this. Neither is reducing it. Both of those are just moving the line of acceptable minds around. Nope. (Still don't like shoving people unwittingly or unwillingly into a stigmatized population.)

It's important to point out the quality of life issues where being in a stigmatized group, all on its own, causes problems. Because it does.
Organ transplant discrimination is a thing. I might not be able to get an organ (autism, people get rejected for that all the time, sometimes even when there's a family member willing to donate who isn't offering this for anyone else re: kidney or liver.) 
I think we need to work on the stigma in addition to working with the reality that it currently exists.

I appreciate that “Scan everyone who wants a scan” is one of the considered options. It gets the shortest discussion (probably because “do for person X what person X wants” isn't that complicated) and the issues brought up there are common to the other groups as well. (Who should have access to the results of testing is not only a question when the test was done without medical indication. It might have different answers depending on the level of medical indication for the test. I'm very much inclined towards “The person who had the test decides who even knows the test took place, and similarly who gets results.” It's hard to coerce test results out of someone if you don't know there's anything to coerce. The tricky thing is to make sure employers can't coerce the test itself.) Unequal access remains an issue, but let's not pretend it's a non-issue for any of the other options.

I'm betting the impact of results on personal liberty and similar closely resemble the impacts of other known cognitive disabilities. Just a hunch.

OH MY GOD. NO. “the greater predictive power combined with the growing number of people with AD might be the brick that breaks the back of the current health care system. (6)” NO. YOU DO NOT PUT THE BLAME FOR OUR MESSED UP SYSTEM EVENTUALLY BREAKING ON SICK OR DISABLED PEOPLE. NO. NO. NO. YOU. DO. NOT. DO. THIS. Go yell at insurance companies and congresspeople instead. NO. I hate you when you do this nonsense with autism and I hate you when you do it with AD and just generally hate it when you do this with the people who get screwed over by the current system that really, really wants everyone to be abled and to get briefly and treatably sick in ways that follow the textbook. And you know, this idea that we're a burden on some system always, always gets used to justify measures that reduce our personal liberties. When you write things like this, you are part of the stigma problem. Stop it.

(Try instead “The current health care system is designed for XYZ and not ABC. Given ABC, changes are needed.”)

The incidental findings question. Yes, protocols being decided on for these before the imaging. (Ulysses contract connection?)

“Much remains to be learned about functional anomalies.” (8). Well. Yes. We only seem to study this stuff when there's a perceived deficit. If it's worked for the person their whole life, why would we have noticed anything? [Hi, Galton the eugenicist deciding not to take issue with the lack of a mind's eye because it seemed most common in “men of science.” We're biased as heck about what unusual things we decide are problems and what unusual things we decide to study like the people who have them are objects.]

Wednesday, March 29, 2017

What do you mean by severity?

A question I found on Quora (then answered, but the answer here is longer):
Do people with autism have an understanding of their own condition? If so, why does it not lessen the severity of it?
Now, those of you who have been around my blog a while might know that I am an autistic person, not a person with autism, and that I have reasons for this. That's not quite the point of this question though, so it's not quite the point of my answer either. Poking some holes in the premise, on the other hand? Sure.

I'm autistic. I know I'm autistic. This was not always the case. I used to know I was weird but not that autism was a label that could explain some of my weirdness. (Affinity for the absurd is also relevant.)

I know that, related to my being autistic, I am not always able to speak. Sometimes I can, but sometimes I can't. Knowing that I can't always talk doesn't magically make me always able to talk. (There would be a bit of a paradox if it did.)

However, knowing I can't always talk means I can plan around not always being able to talk. I carry alternative communication methods: pen and paper, text to speech software on my laptop, a whiteboard marker... it varies with the environment. But who looks more obviously disabled? Someone who happens not to be speaking or someone using text to speech because they can't speak? I am taking an action that mitigates an effect of my disability. This action also makes my disability more apparent. Am I more severe or less for doing so? Does the question even apply to my situation?

I also know some patterns about what activities or environments make it more likely that I will be unable to talk. That's a fairly thorough understanding of one piece of how I work, yes? Well, this knowledge means I can plan my activities in order to minimize the chance of speech giving out on me. (Or I can choose not to care, since I very rarely have a reason to care about speech per se. Access to one working communication method matters. That one method being speech usually doesn't. But let's assume, for the time being, that we're dealing with a circumstance where I would prefer to be able to speak.) This planning means I may choose to skip an activity or to leave an event early in the interest of maintaining my ability to speak. If I make this decision (and say why, if asked), does my non-presence for disability reasons make me more severe? Does maintaining my ability to speak make me less severe? Does the question even make sense in my situation?

On an entirely different note, I know sitting "properly" still is difficult for me. I could spend lots of energy doing so anyways (and probably not remember much of what I heard in class.) I could bring drawing or sewing with me. (It looks weird, but it's not obviously an autism thing. These take little enough attention that I'll retain more than I would trying to sit properly still, but enough that it's not perfect. It's often been my best option.) I could bring a fidget toy, marketed to neurodivergent people. Really, it's probably marketed to parents of neurodivergent kids but that's another issue. (I'm a bit more obvious now, especially if I'm also flapping and rocking. However, we've maximized my attention and retention.) So, the more visibly obvious my disability is, the more I'm getting done. When am I "more severe"? When am I "less severe"? What does this question even mean?

I get more obviously autistic (less "visibly high functioning", thanks Dani) when I order my life in ways that make it easier for me to get stuff done.
What do you mean by severity?

Oh, and btw, I totally have a Patreon. Support my tea habit?

Friday, March 17, 2017

Dear Well-Meaning Autism Mom Looking For A Surrogate Mom For Your Son, Please Don't Assume The Person You Approached Is A Girl Or Straight

Guest post by Elizabeth Rosenzweig. 

So I run an autism meetup. Parents of post-pubescent autistics are not invited. There’s a number of reasons why but one of them in particular has been making the blog rounds: well-meaning but misguided parents who, out of concern for their son’s (and it is always a son, isn’t it?) inability to fend for himself, look to set up a trust fund for him in the shape of a kindly woman savior who will cook and clean and pay his bills for him, forever and ever, amen. The guys themselves can be the problem, too; a person who should be a grown-ass man asks you out and is then shocked, SHOCKED, to discover that you’re just as shit at getting A Job, remembering to pay bills on time, and feeding/picking up after yourself as he is, if not worse. (I, uh, may or may not have very personal experience with that one.)

But I’ve already had two very smart friends I admire address that aspect in plenty of depth, so, well-meaning but misguided parent, let me address another one that you may not have considered.

That long-haired, girl-shaped, pretty, kind person you met, the one you think would look so cute on the arm of your precious manchild (or your precious self), might not actually be a girl. Or straight.
They could be asexual or aromantic - content and whole within themselves. They might be allosexual but gay. They might use she/her pronouns but feel utterly alienated from femininity as a concept. They might be a genderless android. They might be a trans man. You just don’t know!

It’s almost like that long-haired, girl-shaped, pretty, kind person is… hear me out for a second… a person. Not your personal insurance policy, or your uncompensated PCA, or your romantic-comedy-prize, or your glorified German Shepherd, but an entire human being unto themselves, with weaknesses and feelings and ambitions beyond saddling themselves to some cisgendered guy who wants things done just like his mom did them. *They* might be the one needing a PCA! They might maybe sometimes need someone to hold them while they cry hysterically because they foolishly expended all their energy for the day on folding three-quarters of the laundry. (I, uh, may or may not have very personal experience with that one too.)

How do I even address the sexual side of things with you? You, hypothetical mom, have almost certainly had experience with shutting up and taking it while a male partner got his rocks off inside you. Is that how you want your son treating his life companion? Is that how you would want to be treated? I’m certainly sick of it, or worse, being treated as deranged for exploding in frustration after having my own needs go unacknowledged and unmet for years at a time. I got so sick of it that I quit men and went monogamous with an assigned-female-at-birth genderless android. So far, so good. But how would you know that from looking, unless you saw me and my wife together? 

The point is, you don’t consider those things. You think about your own fears, which are visceral and immediate. What will become of my child after I’m gone? When will I have a chance to feel like a person and not a 24/7 PCA - won’t anyone please help me? And those questions resonate so loudly inside your own head that you don’t stop to ask yourself the ones I’ve posed here. That’s not my problem, though, nor is it the problem of any long-haired, girl-shaped, pretty, kind autistic. It’s not fair of you to put your anxieties on us, when we have so many of our own to contend with.

One of the side benefits of running an autism meetup is that you have the opportunity to meet a lot of people of all ages and genders and walks of life. I have quite a few lovely gentlemen who are regular attendees. Let me reassure you, dear, hypothetical mom, that almost all of them have turned out just fine, with the support of agents and agencies who are meant to do the work that you are looking for from that nice autistic at the meetup. It’s actually the ones whose parents have done the most coddling and interfering who are struggling the most.

So please. Stop putting your cissexist, heteronormative expectations on people you barely know, in the name of providing for your own offspring. You’ll start working on real solutions much faster once you do.

Wednesday, March 15, 2017

Please, autism researchers, study these.

Quite a bit of autism research is what I would call, to put it delicately (as in, I am neither screaming nor swearing at it), abled nonsense. I definitely needed to know that my asexuality as an autistic AFAB is a testosterone-related disorder. I also needed to know that I only think I'm trans (nonbinary to be specific) because autism is an extreme male brain. And it is of the utmost importance that I know I am incapable of humor in any form, but especially sarcasm. Autistic satire is definitely not a thing, right?

Oh, wait. All of that is abled nonsense. So is the idea that the optimal outcome is a loss of diagnosis, by the way. I'm most able to do the stuff I care about when I am visibly autistic rather than spending energy on not being so. Dani briefly achieved so-called indistinguishability, an older "optimal outcome" and it was not worth it. (Also I'm the friend.)

I would like to see research that is not abled nonsense. I especially would like to see more of this research being done by autistic people, because no, I don't think we need neurotypical people interpreting the results in order for them to be valid. I'm with Nick Walker here: when we depend on less-marginalized researchers to "discover" our hard-earned truths, we're reinforcing the idea that the knowledge we've figured out for ourselves as a community isn't valid. Which communities get to have valid knowledge?

That said, there are things I'd like to see researched more. Not necessarily in the current structure (because let me tell you, I expect someone like, oh, nearly any non-autistic autism researcher who presented at the Coalition on Autism and Sign Language where I threw myself into a wall repeatedly, to make a complete mess of the topic.) And preferably by autistic people with experiences relevant to the topic.

  1. Inconsistent speech and AAC support for autistic adults.

    I'm an adult. I can speak (usually.) When I can't speak, I use AAC. AAC research seems to be focused on two groups: adults with neurodegenerative disabilities, and young children. Autistic adults who can sometimes speak and sometimes not are neither of those categories, but there's a lot of us. This might be more common than "always has speech" is, among autistic adults, but thanks to behaviorist approaches and the assumption that "can sometimes" is identical to "can consistently" given a sufficiently strong motivator, professional types tend not to get this. I would like to see research on what supports, including AAC, tend to make communication easier/more effective for us.
  2.  Employment supports that are neither sheltered workshops nor "we think they're all good at technology" start-ups that might pay well but are still pretty segregated.

    Sheltered workshops can (and often do) pay below minimum wage. Autistic people, like all disabled people, are more likely to live in poverty than abled people. Are these facts connected? You bet! Programs like Specialisterne, on the other hand, are founded (usually by parents) based on a stereotypical idea of "autistic strengths" that usually means technology work. Or Microsoft has a program to hire autistic workers now. These can be useful, if you're an autistic person who wants to be working in technology. I worked an IT job for a while. It was a good experience in many ways. I also never want to do that again. I like writing. I like teaching. I like art. I've earned money on all these things (mostly teaching) and would happily continue to. These are not the specific jobs you're going to come up with if you're a non-autistic person trying to provide employment support for autistic people.

    So maybe, just maybe, we need to take a look at employment supports that are not limited to a specific kind of job. (Or, you know, look at more kinds of jobs? Because the needed supports will vary based on what kind of job it is.)
  3. Burnout.

    After reaching some ideal of indistinguishability and hanging out there for a little bit, or just after the demands get to be too much even if we were never indistinguishable, we can hit a breaking point. Then everything is way harder, we have way less energy, and our abilities shift. Sensory overload might be more of an issue. What can we do to make this less likely to happen? What supports would help a person going through this? People dealing with this have written about it, both during and after. Getting some idea of what tends to help us vs. what tends to make things worse would be great for anyone who deals with this in the future. Even better if we can help people not have this happen. Burnout is not fun.

Sunday, February 19, 2017

Divergent, Gattaca, and limitations "for your own good"

Last night I participated in the #FilmDis chat about human gene editing and GATTACA. Which, even though it's been a while since I saw the film (I think the last time was in 2010), I have opinions about. It's a film about eugenics, and in a very real sense it's about eliminating disability in most people but creating a new genetically inferior (disabled) underclass that looks a lot like the old one. People who couldn't afford to have their kids genetically selected birth this underclass, and so do people who leave their children's genes up to luck. (AKA, the protagonist's parents, at least the first time.) But the only person we see in the movie (which is largely about disability discrimination) who we'd discriminate against today? He's got an acquired disability. It's not genetic. And he's the one who's genetically valid, selling his genetic identity and thereby allowing the protagonist to get in the door to his dream job.

And Divergent is a series I have opinions about. I loved what looked like neurodivergent representation in the first two books, except for the part where I knew what was coming: the Divergent are secretly neurotypical and everyone who really fits a faction has "genetic damage" making them neurodivergent. And sure, we build a city in the end where no one really believes in genetic purity vs. genetic damage, but all through the series we're shown the functional superiority of Divergent people: Tris, do your Divergent magic, think like the Erudite and tell us what they'll do! Tris, come in first in initiation and have it clearly be about your Divergence. Or ... your neurotypicality.

So it's probably not shocking that I want to connect them? They've both got genetic engineering and discrimination based on genetic makeup. And I do:

You see, the entire idea of factions in Divergent is about behaviorally conditioning people to behave in ways that take their presumed "damage" to an extreme, in a way that's hopefully useful. This ... actually reminds me of Specialisterne? More on that later, maybe. They think it's the kindest thing to do, giving people a way to be useful while using their supposed strengths (that are secretly still defects.) It's still limiting people based on an idea of what their potential is, for what is supposed to be their own good.

And several times in Gattaca, we see Anton attempt to dissuade Vincent from his goals, in the name of "protecting" his "invalid" older brother. He should take the jobs that "invalids" can get, not try to go to space as he's always wanted. He should leave the company he works for. He should accept that his genetics really do make him inferior and work from there, for his own good (for his own safety.) And maybe it would be safer. (Isn't it usually safer, at least in some ways, to stick to the paths laid out for you as acceptable?) But this sort of limiting people for their own "good" and to keep them "safe" exists in the real world, for disabled people. And guess what? It's not actually safe!

So in both Divergent and Gattaca, we have people limiting others (or trying to) in the name of their own good. Adults who only want the best for us, hurting us because of what they do not know. (My fear is not of water, and now I remember Vincent and Anton competing in the water. He didn't save anything to get back.)

Wednesday, February 15, 2017

Alyssa Reads Memory Blunting: Ethical Analysis -- Divergent Thoughts

I read "Memory Blunting: Ethical Analysis" by the President's Council on Bioethics, excerpted from Beyond Therapy: Biotechnology and the Pursuit of Happiness (2003) and appearing in Neuroethics: An Introduction with Readings, edited by Martha J. Farah. I did so because I am taking a neuroethics class and we're supposed to show that we're thinking about neuroethics stuff at least a bit outside class. Also because I'm super-interested in how neuro-stuff (especially neurodivergence but really all things neuro-) is represented in fiction (especially young adult speculative fiction.) I'm pretty much chucking my notes (drawn parallels, expressions of annoyance, and the occasional "OK that's legitimate") on my blog because as important as a lab notebook is, I like notes that are typed and searchable. This part is just the connections I've drawn to Allegiant. More later.

The council starts by asking when we would want to reduce the emotional impact of an experience, and why we should/shouldn't in a given situation. Definitely good questions to consider.

These questions remind me of the ending of Allegiant. We have the protagonists wiping the memories of everyone at a government agency in what boils down to self-defense -- the agency was going to do the same to their entire city "experiment." Not that the experiment was particularly experimental, nor was it particularly based on how genetics actually works, though it was certainly eugenic as all heck. So here we get memory wipe as government control over a eugenic project and as self-defense against said government control.

We also see individual level decisions about memory elimination: Four brings a vial to the city with the plan of using it on one of his parents, who are leading opposing factions in what has become a civil war. He believes that if one of them will stop, so will the fighting as a whole (and then maybe the government won't memory wipe the entire city.) He gives his mother a choice instead of using this vial (he doesn't like memory wipes as an act of war/control/defense/greater good) and this winds up working. She agrees to leave the city.
Or Peter: He is cruel. He knows it. He wants to change. He knows people are the product of their experiences and choices to enough of an extent that he'd have a hard time doing this (and therefore just ... wouldn't) without the aid of wiping his memories. He wants to forget himself. Interestingly, he's the only individual-level memory wipe that we see go through. He forgets himself. In the epilogue, we find out that he's still not the nicest of people, but he's not the person he was before, either. He did make a (slightly) different self, and the difference matters. (Things like not stabbing rivals in the eye while they sleep are just slightly important.)

Four/Tobias, again. After Tris dies, he takes a truck and goes into the city with a vial of the memory serum. His friend Christina stops him, because "The person you became with her is worth being. If you swallow that serum, you'll never be able to find your way back to him." And with eliminating the memory entirely of who he had been and what he had done, I even think I might buy this argument. I will, however, note that this would be a complete elimination of memory. This isn't blunting the emotional impact, making a thing you can remember be less traumatic to recall. This is making the event gone, like it never was, instead of softer, so you can look at it instead of needing to bury it. 
And why do I read and understand the neuroethical arguments in dystopian science fiction?

Maybe it’s something you have to be Autistic to see, but all of their storytelling is 
Every writer is making a narrow and overly specified claim about
the nature of social pressure, taboo, deviance policing, human fulfillment, and
the methods by which a person located in a certain sociological position might resolve
     the needs inherent in their system.
When I read, this is what I examine. A writer’s inability to fully represent society
is simply a way of stating their warrants to me, and the individual scenes carry
not only emotional value, but grounds for the conclusions drawn in the depiction
of the change in the main character’s state. All of your fiction is an argument about a
     time and place. (Monje 29)

That's why.

(That's also from The Us Book, which I read and which you should read. Specifically, it's from "Reintroducing Art to the House of Rhetoric.")

Monday, February 13, 2017

Jewish Protest Thoughts

So. Bannon is anti-Semitic. We know that. Threats against and vandalism of Jewish community centers and synagogues are up. We also know that. Jewish folks may not be the primary targets (I'm thinking Muslim people are the big-name target of the moment, what with the travel ban that is definitely a Muslim ban, let us be real, though there are seriously a lot of targets) but swastikas are on display. Which means Jews are on the list.

A thought I am having, therefore:
Drown out Nazi and Neo-Nazi slogans with clearly Jewish sounds:

  • On Tumblr I saw a suggestion of the Shehechiyanu. Which I am probably transliterating terribly, because Hebrew. It's a prayer thanking God for your making it to this moment, used on the first night of many holidays and at Bar/Bat/Bnai Mitzvot.
  • Songs for holidays of the genre "They tried to destroy us. They failed. Let's eat." I'm thinking Hanukkah and Purim, especially. Bring your graggers and drown out Trump's and Bannon's names with those, to be especially Purim-like. Maybe give out hamantaschen?
  • Really any song that's in Hebrew or Yiddish that it makes sense to perform at full bellow. 
You see, doing slightly silly things that really piss off the bad guys while making us laugh is a way of (hopefully) keeping morale up. And singing about how some schmuck who tried to have us killed was swinging from the gallows he had built for us, and now we're going to eat pastries? I think that fits the bill. (And nosh some hamantaschen!)

Thursday, February 9, 2017

Legal protections and shaky ground

I have, I think, finally figured out why I felt less safe, not more, after turning in a formal accommodations letter for the first time this past summer. (That was nowhere near the first time I'd had the same access needs the letter covers met at university. It was just the first time I had to turn in the letter.)

It's a pattern. When I just turned the letter in, without asking first if the professor cared about the letter, I didn't feel less safe after turning it in. (Note to self: Maybe stop asking, since some will care.) When I turned the letter in with a comment of "don't know if you need this or not, but here it is anyways" and I got a response in the area of "thanks but yeah, don't need it," I felt more safe than I had before turning the letter in. But it was the same amount of more safe that I've felt the times the answer has been that the professor doesn't care about the letter.

Which makes me suspect that the letter itself is less than relevant. My having the paperwork to prove I am entitled to "accommodations," as they like to call it when my access needs are met, that's not the issue. (Seriously, y'all aren't changing anything about the class structure when I use AAC, it's important and it's apparently unusual but I don't want to talk about my typing as something that you're accommodating me specially to allow.) My turning in said paperwork is also not the issue.

Depending on an often inaccessible, bureaucratic process that requires a probably-abled "expert" to document that I really qualify for the diagnosis I'm claiming accommodations under in order to access my education and my work, on the other hand? That's an issue. Having said process done so it can back me up on the off chance I need it is useful. I'm glad those legal protections exist. They're important. They're good to have as backup. But I don't like relying on the backup any more than the next person. And I'd much rather have access happen because it's what should happen than because some paperwork says it legally has to happen. Or that some part of it legally has to happen -- my paperwork says I get text-to-speech, and that's actually my least-used AAC solution.

Monday, February 6, 2017

In which I flip through my textbook and react to something

I'm taking a course on motor speech disorders this semester. (Was this a good life choice? We'll find out! Were my other classes this semester good life choices? Again, we'll find out!)

The text, for anyone wondering, is Motor Speech Disorders: Substrates, Differential Diagnosis, and Management, 3rd edition by Joseph R. Duffy.
"The decision to use AAC strategies is based on careful assessment of speech and communication abilities and needs, the prognosis, and the individual's potential to benefit from them." (387)
I guess?? I mean, I have to assume that's the way it's professionally done. In my experience, the decision to use an AAC strategy is made in the moment when speech isn't working right now and I need to do something. My first several decisions, the first several times I used it, were certainly immediate and uncareful need-something-now choices.

I'm in a Chinese language classroom in Tianjin, the teacher just asked me to speak, and I can't. I need to do something. I pull out my iPad (good thing I have it today!), open Notes, switch the keyboard to Simplified Chinese input, type something quickly, and hand it over to the student next to me, who reads it aloud.

I'm in measure theory on Yom Kippur (I fasted, but still went to class) and the professor asked me a question. (I don't remember now what the question was.) I can't speak. I don't have my computer or iPad with me. If I write in my notebook, it'll probably be mistaken for ignoring the question/continuing to take notes, because I was taking notes before and he doesn't know speech goes out on me yet. In any case, that's not likely to meet my immediate need. So I reach for a whiteboard marker and start writing on the side board.

I'm not waiting for someone to evaluate how much I can benefit from an AAC solution while I can't speak. I'm just ... not. That's not a thing. I'm getting into situations where I need something now, and I may or may not be grabbing the best solution. It hasn't been carefully evaluated by an expert. I'm grabbing the first solution I can think of given my environment. My decision to acquire dedicated applications for AAC on my iPad and laptop was a bit slower and more considered. I didn't look into those options until I realized that speech giving out on me was going to be a regular thing (honestly, it had been a regular thing for some time, I'd just not communicated with language while speech was out before.) I asked around. There wasn't any sort of formal evaluation. (Though one might have been handy.) Has any expert looked at, well, any of my set-ups? Nope. That hasn't happened. Could they come up with something better, as long as they recognized that I really do AAC? Probably.

I'm not certain if this is a commentary on how usually verbal and fluent-seeming autistic adults don't get the assessments for communication supports we could use, or if this is a commentary on gatekeeping where someone other than the disabled person is deciding whether or not to implement AAC. Maybe it's both.

Monday, January 30, 2017


Heads up that gender binary stuff is going to be discussed, largely in terms of my reactions to it playing Runescape, an online role-playing game. So is dysphoria, both with my actual body and with a digital avatar for the game.

I've played Runescape for quite a while. Long enough that I've watched the graphics change quite a bit. Some of these changes are nice (Prifddinas looks pretty cool. Also, I gave my avatar purple hair and purple wings.) Some are ... not great for me. (The female avatar's chest is quite a bit more noticeable than it used to be. Also, armor designs look different based on male vs. female avatars in a way the graphics didn't use to be good enough to support properly.)

In the ideal world, there would be an androgynous option. I don't live in that world. There's male avatars and female avatars. If you use a male avatar, you have shorter hair options (darn), a flat chest (yay), armor graphics that would actually protect your torso (yay), usually facial hair (whatever), and he pronouns (whatever.) If you use a female avatar, you have both short and long hair options (yay), a chest that is definitely not flat (dysphoric), armor graphics that show off said chest (dysphoric), only the new "pirate" beards from a recent event as facial hair options (whatever), and she pronouns (whatever.) "They" pronouns aren't an option, no matter what avatar you're using (darn.)

When I first made my account (and I do still use my original account from middle school), I didn't know I was nonbinary yet. So, of course, I used my assigned gender and made a female avatar. Over time, I started having issues with this. (Hi, dysphoria is a thing.) My original avatar looked a bit like I physically do -- long brown hair, skin that's on the dark side for a white person but still a white person, and a tendency to wear purple. It even had a long braid for a while. As graphics advanced and I became more aware that my problems with the avatar were dysphoria, I made my avatar look less like me. Purple hair not in a braid, purple skin, wings.

But I was still having trouble. Some of the armor options I liked were dysphoric to look at on my character. (Thank blob for cosmetic overrides. I used those heavily, and still do, so the "look" of my character doesn't change when I change the armor I'm wearing. This got me around that problem, at least.) It took me a while to think of "switch the avatar gender" because I'm nonbinary, and that means that a male avatar is still incorrect. However, in terms of the characteristics that show up in Runescape, it's closer. Flat chest for the win. (That's my primary dysphoria issue in meatspace, and it remains so with digital representations.) The pronouns are a question of the usual wrong answer (she) vs. the unusual wrong answer (he) and it's easier for me to be amused by the unusual wrong answer. (They/them/their is a right answer.) And with actual items + keepsake keys, I retain the ability to put my avatar in a skirt. This works so much better.