Note For Anyone Writing About Me


I am an Autistic person, not a person with autism. I am also not Aspergers. The diagnosis isn't even in the DSM anymore, and yes, I agree with the consolidation of all autistic spectrum stuff under one umbrella. I have other issues with the DSM.

I don't like Autism Speaks. I'm Disabled, not differently abled, and I am an Autistic activist. Self-advocate is true, but incomplete.

Citing My Posts

MLA: Zisk, Alyssa Hillary. "Post Title." Yes, That Too. Day Month Year of post. Web. Day Month Year of retrieval.

APA: Zisk, A. H. (Year, Month Day of post). Post Title. [Web log post]. Retrieved from http://yesthattoo.blogspot.com/post-specific-URL.

Tuesday, May 30, 2017

Let's talk about fidget spinners and patterns.

Fidget spinners are a fad. Thinkpieces about fidget spinners, therefore, are also a fad. That's how it works, right? On one side, there are people arguing that these are toys (true), that they are a fad (true), that they can distract some people (true), that there is no research showing improved focus from their use (true), and that they are not an accessibility issue (false). On another side, there are people arguing that they are a focus tool for some autistic people and/or people with AD(H)D (true), that the lack of evidence is due to a lack of research and not a statement of inefficacy to use against individuals who find them useful (true), that this can be an accessibility issue (true), and that their fad nature among neurotypical students is bad (false) because it is getting the toys banned (mixed truth value). I've also seen more nuanced views, generally from disabled people, but those seem to be the two main camps.

I want to point out a pattern in how accessibility discussions go, especially in educational contexts.
  1. A disabled person needs something for access reasons.
  2. Abled people call the thing distracting, because our existence in public is apparently distracting.
  3. The thing is either banned entirely or permitted only for people with the paperwork to prove they need it for disability reasons.
  4. Disabled people who need the thing either don't have access to the thing or must out themselves as disabled in order to gain access. If outing oneself is required, the thing is heavily stigmatized.
  5. Disabled people who have an actual access conflict with the thing are erased entirely, which makes conversations about possible solutions to the access conflict impossible. One set of needs or the other will "win." Any disabled people who need to avoid the thing are lumped in with the people who want to ban the thing for ableist reasons and therefore vilified. Which set of needs "wins" here varies, but it usually has some relationship to the hierarchy of disability, and having one set "win" while the other "loses" is a bad solution regardless.
That's not just a fidget spinner thing, but it does apply here. With fidget spinners, autistic people and folks with ADHD (I'd love to know of a reasonably recognized way of talking about this neurotype without the second D/in a neurodiversity paradigm way, btw) end up in both the "need the thing" and the "need to avoid the thing" groups. I assume some other neurotypes are similarly split as well - I just don't have the familiarity to assert so. With visual alerts on fire alarms, D/deaf people need the thing. Since the visual is a strobe, a lot of neurodivergent people, especially people with photosensitive epilepsy, need to avoid the thing. With service animals, the folks who use them need the thing. People with allergies need to avoid the thing, and not everyone with an allergy can safely share a space with a service animal, even if they are treating their allergies. Conflicting access needs exist, and this pattern prevents us from finding ways to deal with the conflicts. Instead, one access need gets lumped in with abled people who don't like the thing because it's associated with disability and therefore presumed not to be a real need.

Now for fidgets: some people need something to do with their hands while listening if they're going to retain anything. I am in this group, by the way. In high school, I knit, I sewed, and I made chainmail - armor, not spam. I've also tried drawing, which takes care of the "need to do something in order to sit" issue but takes enough attention that I'm no longer following the conversation, so that doesn't work for me in class. Writing hurts quickly enough that while taking notes has sometimes been possible at university, there was no way it was going to be the answer for the duration of a school day in middle or high school. (I, specifically, should not have a laptop in class. If I'm going to need notes it's the least bad option, but least bad does not mean good.) So I did assorted arts and crafts that were fairly repetitive and totally unrelated to class. The biology teacher who told us on day one that he had ADHD was both the most understanding teacher about my need to fidget somehow and the teacher most at risk of being distracted by my making armor in class.

That last paragraph is the "no, really, I need to fidget." It's also the "there are several fidget options that work for me." Most, but not all, of the standard fidget toys will meet my needs, as I discovered because they are also a fad and I got some awesome fidget toys. This is important when access conflicts come into play - if there are several options that meet the access need of the first disabled person, it's easier to find one option that everyone is OK with. When there are several options that work, requesting "not option A in situation W" is not an access issue, because options B through H are still fine. If we're going to come up with reasons that each of B through H are also not fine, individually, then we're going to have a problem.

The fidget toy fad is making options D through H cheaper and cooler. When fidgets are marketed as assistive technology, they are super expensive. Considering that disabled people tend not to have a lot of money, that's an access issue, so the fad is making a set of possible solutions more accessible. That's cool. It's also leading to a sufficient presence for teachers to make explicit policies about the toys (as opposed to banning them person by person), and for a flat ban to seem like a good idea to teachers who are seeing kids appear distracted by them. (My bet is that the neurotypical students who appear distracted actually are. I expect the autistic and ADHD students who appear distracted are a mix of actually distracted, because they are just as distractible as any other student, and only appearing to be distracted because of ableist ideas about what paying attention looks like. Remember, I'd fail special needs kindergarten as a twenty-four-year-old PhD student.) The explicit banning for everyone is ... not so good. Mostly because the other options are usually also disallowed or heavily stigmatized, and then we may well be left with no good options.

And let's not pretend handing everyone a fidget spinner, or any other fidget, is going to magically "solve ADHD" or whatever. I think some of the camp that's firmly against the toys is reaching that position for similar reasons to haters of weighted vests - we hand it over and the person is still autistic, or still ADHD. A tool that a person uses to cope in a less than accessible environment doesn't make them stop being disabled by the environment. Plus a fidget spinner isn't going to help everyone. Some people really will be distracted if they have something to play with, and some of those people really will be neurodivergent. Conflicting access needs, again, are a thing. If one person needs a fidget, and another needs not to be next to someone with an obvious fidget, those two people probably shouldn't sit next to each other. Giving people fidgets that they can use while the toy remains in their pocket is also a possibility in some cases. We can have conversations about access conflicts, if we admit that both sets of needs exist. (We also need to admit that some subset of the people making arguments about distraction are doing the bad faith argument where everything disabled people need is a distraction because, essentially, our presence in public is a distraction.)


[Let's also insert a plug for my Patreon. I write. I have a Patreon.]

Saturday, May 20, 2017

"Your taste buds will change"

CN for food and vomit.

That's one of those sentences I read every so often, which is technically true, but which doesn't actually lead to the conclusions I see it used to support. Taste buds really do change with age! This is a thing that happens, and it's part of why there are certain foods kids tend not to like but which adults are more able to tolerate. (I think most alcoholic drinks go in this category, where kids tend not to like the taste anyways?)

As true as it is that tastes change, there are some things my brain has decided I need to explain now about why this doesn't mean getting into a power play with someone over what they eat and how they're "picky" is a good idea.

  1. You probably don't know what the result of "pushing the issue" is going to be. I don't just mean long term results. I mean short term, in the minutes to hours right after forcing the (in)edible object down. Obviously, you don't expect it to be a big deal, or else you wouldn't be trying to force a "picky" eater to eat something they can't eat. How wrong are you ready to be? TMI alert: last time I made myself drink something that was an issue, it came back up. (If it hadn't been something I was medically supposed to have, I wouldn't have tried. It still didn't work, because it didn't stay down.)
  2. The fact that someone's tastes may change and they may be able to eat a food later doesn't mean they can tolerate it now. The change hasn't happened yet. So even if you're correct about the nature of the upcoming change, you're still trying to make someone eat something they don't currently tolerate. See point 1.
    1. Also, even if you were going to be correct, you can cause that not to happen by creating an association between being forced to eat the food and whatever sensory issue it's hitting. That can create a new issue with the food in question, besides taste...
  3.  The issue may not be the taste. I can't drink anything carbonated. You might think that's a rather broad category for a taste issue. You'd be correct. It's not a taste issue. It's best described as a texture issue, and you've said nothing about texture sensitivities changing. In fact, most of the foods I can't deal with are texture issues, not taste ones.
  4. The changes in taste may not be the ones you expected or hoped for. Some foods that were issues before can become non-issues, but it can go the other way too. As a very small human, I could eat mushrooms. As an adult human, I cannot eat mushrooms. (It's also the texture, not the taste.) Chocolate pudding was a "safe" food for me as a kid. It's about 50-50 on my being able to eat it now. (Texture again. Also, partially related to times when I didn't get the choice about yogurt, which has never been an OK texture and which is close enough to pudding that making yogurt even worse made pudding a problem. See point 2.1.) I ... actually can't think of any foods I can have now that I couldn't deal with as a kid.
Tastes do change as we get older. That doesn't mean they'll change the way you want them to, or that a possible change that hasn't happened yet justifies acting as if it's already happened. 

Thursday, May 18, 2017

Alyssa Reads Critical Studies of the Sexed Brain

This is another one I read for neuroethics. I was considering using this article for my presentation on a neuroethics-related topic, but that didn't happen because someone split off from my too-large group and it wasn't too big anymore. We actually wound up talking about a medication used to treat addiction ... that can itself be addictive. Fun times. So, here are some of my thoughts from reading Critical studies of the sexed brain.


“They suggest that we work and talk across disciplines as if neuroscientists were from Mars and social scientists were from Venus, assigning the latter to the traditional feminine role of assuaging conflict” (247). Sigh. I am not surprised that some scientists think of the social sciences that way.

Brain plasticity + identity formation in intersex people, brains vs. genitals. That's going to be interesting. By which I mean, I have concerns. I have friends who are intersex. I know people who do intersex activism. And I know intersex people who concluded that intersex and/or nonbinary is their gender identity rather than picking one of the two binary genders. I hope the author isn't assuming a gender identity must be one of man/woman. Heck, mine isn't that, and as far as I know, I'm not intersex.

Oi at calling autism a disease. It is a neurodevelopmental disability [or a neurotype, that's a good word, and also let's remember what I'm saying when I say disability - the social model of disability is a thing.] Also, I know the author found neurodiversity stuff, because the article comes up when I search the journal for neurodiversity. What the heck? I don't expect to hear it called a neurotype in anything done by neurotypical(-passing) academics, but really? Disease?

OK, gender in the brain as a result of plasticity, that's going to be interesting – “reflect gendered behavior as learned and incorporated in a social context” is a thing, but please, please don't let this turn into “male socialization” for trans women or “female socialization” for trans men, or either of the above for nonbinary folks. The socialization of “consistently mistaken for X while actually Y” is not the same as the socialization of “X.” OK, individual differences are a thing. That's good. “Plasticity arguments are extremely interesting as they wage war against both biological and social determinism, reductionism, essentialism, and other -isms.” Phew, that's not the socialization argument I was worried about, I don't think.

Does she mean “cishet” by “normal people”? (Cishet=cisgender, heterosexual.) I appreciate the quotation marks around “normal people” but there probably is another word for what she means and using it would be nice.

Now we have one of my rage buttons. All caps time!
OH MY GOD STOP CALLING NEURODIVERSITY AN ASPERGERS THING. THE ANI PEOPLE WERE CLASSIC EVEN IF THEY TALK NOW, AND ALSO DIAGNOSED BEFORE ASPERGERS WAS IN THE DSM. MEL BAGGS IS NONSPEAKING. AMY SEQUENZIA IS NONSPEAKING. I'M CLASSIC EVEN THOUGH I USUALLY TALK. STOP. STOP. SERIOUSLY, THE ROOTS ARE OLD ENOUGH THAT ASPERGERS WASN'T A DIAGNOSIS YET WHEN A LOT OF OUR FOLKS WERE DIAGNOSED, WHICH MEANS THEY WEREN'T DIAGNOSED ASPERGERS. THEY ARE NOT ASPERGERS, WHICH IS ALSO NOT A DIAGNOSIS ANYMORE. (Maybe it was when this was written?)

Intersex activist history! I knew about unwanted surgery, gender role training, and folks wanting their own intersex bodies back. I also know someone who was put on unwanted hormones. What are the results of Diamond getting so lauded while speaking in terms of brain sex, though? It's still the language coming from the people who try to enforce the man/woman dichotomy. What are the results of using the "sexed brain" discourse while not necessarily fitting in the binary? 


1. Walker, N. (2014, September 27). Neurodiversity: Some basic terms and definitions. Neurocosmopolitanism: Nick Walker's notes on neurodiversity, autism, and cognitive liberty. [Blog post]. Retrieved from http://neurocosmopolitanism.com/neurodiversity-some-basic-terms-definitions/ This post is a good explanation of the neurodiversity-related vocabulary I tend to use when thinking about neuro stuff.

Thursday, May 11, 2017

Alyssa Reads Memory Blunting: Ethical Analysis - suffering and authenticity

I read "Memory Blunting: Ethical Analysis" by the President's Council on Bioethics, excerpted from Beyond Therapy: Biotechnology and the Pursuit of Happiness (2003) and appearing in Neuroethics: An Introduction with Readings, edited by Martha J. Farah. I did so because I am taking a neuroethics class and we're supposed to show that we're thinking about neuroethics stuff at least a bit outside class. Also because I'm super-interested in how neuro-stuff (especially neurodivergence but really all things neuro-) is represented in fiction (especially young adult speculative fiction.) I'm pretty much chucking my notes (drawn parallels, expressions of annoyance, and the occasional "OK that's legitimate") on my blog because as important as a lab notebook is, I like notes that are typed and searchable. I started with some connections to Allegiant, then some thoughts on collective effects of blunting trauma, and then cognitive liberty. Now here's suffering and authenticity.

First, the concerns about what we might do to minds even if it were an issue of what person X does/chooses for person X, not what we are choosing for others. The concern seems to be about changing someone's true self, so suffering and authenticity come in again, just like cognitive liberty. These two seem frequently connected to me. If we recognize that people get to define their own "true selves," we don't get to moralize over which experiences are real and true anymore, which kind of kills the "not their true self" argument. That's an argument I'm really not a fan of, especially considering which experiences it tends to be applied to.

This quote ... gives me the noble suffering/virtuous suffering sort of feeling, where whatever positive you might (not will, might) drag from the hell you go through means you shouldn't try to avoid that hell or save others from going through it.
Or will he succeed, over time, in 'redeeming' those painful memories by actively integrating them into the narrative of his life? By 'rewriting' memories pharmacologically, we might succeed in easing real suffering at the risk of falsifying our perceptions of the world and undermining our true identity. (90)
The version of a person that went through more bad things isn't automatically more real. The version of a person that's suicidal from trauma isn't automatically more real than the version of a person that takes medication to not be suicidal. Our choices define us, not just what we've been through, and using chemicals to get the parts of our histories we never chose to back the heck off? That's not less real. Suffering isn't the only way to be real. Enough of the noble suffering narrative. Enough.

Now to bring back a quote that I also talked about with cognitive autonomy:
And yet, there may be a great cost to acting compassionately for those who suffer bad memories, if we do so by compromising the truthfulness of how they remember. We risk having them live falsely in order to cope, surviving by whatever means possible. (92)
  (Survival is resistance, etc.)

And the concerns about what happens if we take out everything difficult? Those take a huge slippery slope argument, and not the kind where we've seen from experience that most people stop early or don't stop at all (destructive obedience is one of those). Trauma is not the same thing as everything difficult in a person's life. Having to spend a lot of time and effort on reading and writing in order to become a good writer is not the same as witnessing a murder or being mugged or being a victim of abuse. One of these things is a choice: we're not under any obligation to become good writers. The others aren't choices. They're things that happen to us. How we deal with the results is at least partially a choice. (Not entirely. Especially when, due to technological or social constraints, dulling the pain while working through it isn't an option.) There is plenty of opportunity for hard work and achievement without forcing others to keep horrors in their heads for the sake of ill-defined authenticity.

Tuesday, May 9, 2017

Alyssa Reads Memory Blunting: Ethical Analysis - cognitive liberty

 I read "Memory Blunting: Ethical Analysis" by the President's Council on Bioethics, excerpted from Beyond Therapy: Biotechnology and the Pursuit of Happiness (2003) and appearing in Neuroethics: An Introduction with Readings, edited by Martha J. Farah. I did so because I am taking a neuroethics class and we're supposed to show that we're thinking about neuroethics stuff at least a bit outside class. Also because I'm super-interested in how neuro-stuff (especially neurodivergence but really all things neuro-) is represented in fiction (especially young adult speculative fiction.) I'm pretty much chucking my notes (drawn parallels, expressions of annoyance, and the occasional "OK that's legitimate") on my blog because as important as a lab notebook is, I like notes that are typed and searchable. I started with some connections to Allegiant, then some thoughts on collective effects of blunting trauma. Now here's cognitive liberty.

First, the concerns about what we might do to minds even if it were an issue of what person X does/chooses for person X, not what we are choosing for others. Cognitive liberty. We don't seem to have a coherent definition of the self, and autonomy is complicated, but there is definitely a thing where a person either is or is not making the decisions about interventions taken (or not taken) on their own mind. Also on how folks define their own "true selves." What about who you are is important to you? Not what's important to me about who you are. Of course, that would stop us from moralizing over which of other people's experiences are real and true vs. somehow fake. Changing one's own cognition by one's own choice isn't as acceptable as I think it should be.
And yet, there may be a great cost to acting compassionately for those who suffer bad memories, if we do so by compromising the truthfulness of how they remember. We risk having them live falsely in order to cope, surviving by whatever means possible. (92)
Again, we do to them. Not, we offer them the option. Do we think we know better than them what's right for them? That way lies all sorts of abuse "for their own good." And ... do we really think everyone would choose to dull the pain of a memory or to forget it entirely? (Remember also that those two things are not the same.) Because I don't think that. I think lots of people would, but not everyone. Despite (because of?) my arguments about cognitive autonomy leaning towards letting people choose to blunt the trauma, I want the right to remember in my relatively unchanged way. It's just that the arguments run towards why everyone needs to be doing it that way, and I don't believe everyone needs to be remembering that way. I think enough people would choose to remember that we'd get whatever collective benefits the memory would provide, even if we let people choose to dull their pain. Not that I think the supposed benefits are nearly as strong as seems to be argued. Intentional ignorance is already a thing.

Thursday, May 4, 2017

Alyssa Reads Memory Blunting: Ethical Analysis - collective effects

I read "Memory Blunting: Ethical Analysis" by the President's Council on Bioethics, excerpted from Beyond Therapy: Biotechnology and the Pursuit of Happiness (2003) and appearing in Neuroethics: An Introduction with Readings, edited by Martha J. Farah. I did so because I am taking a neuroethics class and we're supposed to show that we're thinking about neuroethics stuff at least a bit outside class. Also because I'm super-interested in how neuro-stuff (especially neurodivergence but really all things neuro-) is represented in fiction (especially young adult speculative fiction.) I'm pretty much chucking my notes (drawn parallels, expressions of annoyance, and the occasional "OK that's legitimate") on my blog because as important as a lab notebook is, I like notes that are typed and searchable. I started with some connections to Allegiant. Now here's thoughts about the collective effects of forgetting, as worried about by the authors (and as I tend to think we deal with even without dulling memories pharmacologically.)

I have a concern about this supposed legal argument against using beta blockers or similar medications to reduce the emotional impact or trauma from publicly important events. (The given example was a terrorist attack. I can ... kind of tell this was written not too long after 9/11.)  The idea is that it's important to have some witnesses remember the event accurately. There's a problem: I remember from my introductory neurobiology class that when a memory is super emotional, we feel quite certain of our recollection ... but that we can still be completely wrong in our memory of what happened. Ask people where they were on 9/11, or when the space shuttle exploded, and some will tell you they were listening to or watching other events that didn't happen on those days. Sometimes didn't even happen that time of year. But we are confidently wrong! So as useful as accurate recollection would be for legal purposes, maintaining the traumatic impact on the witnesses doesn't make accurate recall happen anyways. Also, eyewitness testimony is notoriously unreliable to begin with. This is a bad argument because the thing we're claiming to want to preserve already doesn't exist.

On that note, I wish the authors had said something more about the social and personal effects of blunting our collective traumas. I'm not entirely convinced that leg of the argument is going to hold either. After all, I'm a Jewish (and Queer, and Disabled) descendant of Holocaust survivors, and I know how we're never supposed to forget. I'd be a lot more inclined to buy into the value of collectively remembering and the consequences of forgetting if we'd stopped having genocide or deciding that certain religions are inherently more dangerous or lesser. But we didn't. These things all still happen. The things we're claiming to want to prevent already happen with our supposed preventative in place, and that means I don't trust the argument.

The murder witness example actually does concern me. "Yes, I was there. But it wasn't so terrible." (91). We don't want murder to be thought of as not so terrible. I know we don't want that because sometimes it is already considered not so terrible. See also: "mercy" killings of disabled people by the folks who are supposed to take care of them. It already just depends on the choice of victim, and that's terrifying. I don't want the idea of murder as not so terrible spreading any further than it has. I want it gone. I want all murders recognized as being as bad as they are.

I also have issues with the juxtaposition (and sometimes what seems like conflation) of giving a victim relief and medicating away (or relieving, I suppose I should use the same language for each) the guilt of perpetrators. Those are not morally equivalent. Victims and attackers or abusers are not the same. When we're talking about a mutual conflict, as in the case of war (the most talked-about cause of PTSD, but far from the only one), there may not be a clear aggressor or victim. There also may be. It depends on what's going on, really (and remember how often the military is painted as the only way out for people in poverty, at the same time as we remember the atrocities soldiers often commit). Still, when we're talking about accidents and survivors of terrorist attacks, there are clear innocents. (Not "perfect victims" in the sense that they never did anything else even slightly wrong, but innocent in the sense that they didn't choose what happened to cause the trauma.)

Monday, May 1, 2017

Jobs for autistic strengths and "autistic strengths"

Full disclosure: Real Social Skills got me thinking about this with some tweets (first tweet, second tweet, third tweet) and then a blog post, all of which I think you should read. That said, I think my thoughts are parallel rather than identical, and it's still worth my writing my bit.

To me, what she's saying boils down to a few main points:
  • Some models of autistic strengths assume that attention to/liking of detail is one of the strengths.
  • They then assume this means we will enjoy repetitive, detail-oriented jobs most people find mundane.
  • That's still putting us into different sorts of jobs than everyone else (segregation!) but calling it strengths-based and assuming we're all the same.

Since this is May 1 (Blogging Against Disablism Day), I've got some "spot the (dis)abl(e)ism" thoughts. Let's break those down. Here's what I'm reasonably certain isn't ableism:
  • Thinking it's a good idea to play to an autistic person's strengths does not read like ableism to me.
  • Recognizing that some strengths may be statistically common in autistic people does not read like ableism to me.
  •  Understanding that the jobs we find interesting or want to do may be different from what "most people" find interesting or want to do does not read like ableism to me.
Helping an autistic person find a job that's a good fit for them based on their (autistic, since they are autistic and autism is pervasive) strengths would also not read like ableism to me. It would be helping someone find a job for their autistic strengths. Unfortunately, the way programs around finding jobs for "autistic strengths" often run ... does have ableism involved.
  • Assuming that "autistic strengths" means exactly a certain set of (perhaps statistically common) strengths is treating us as a monolith, and therefore ableism. Not all autistic people are detail-oriented, for example. (I appear to be a lot more detail-oriented than I really am thanks to pattern-recognition.)
  • Assuming that a given strength will correspond to a given interest is stereotyping based on interests. If you're only doing this in the presence of an assumed disability, it's ableism. If not ... it's still inaccurate stereotyping but it might not be ableism?
  • Celebrating how we can therefore do these jobs other people find boring and pushing us into those jobs is effectively workplace segregation, definitely stereotyping based on autism, and therefore ableism.
And this is what a lot of autism employment programs seem to be doing. It's not what we need. My jobs? Based on my actual strengths, some of which are a bit stereotypical and some of which are decidedly not. Math? Yeah, I'm good at that and I like it. People tend not to be surprised by that one. Grading? I guess that involves attention to detail, or pattern recognition that makes breaks in expected patterns stand out. Teaching? Seems a bit social, yes? Well, explaining things to people in ways they can understand is absolutely part of my skill set. As a student, I often explain math-heavy neuroscience papers to my non-math classmates in the neuroscience program. As a teacher, it means finding the way to explain a given concept that actually makes sense to my students. I don't think any autism employment program is going to suggest that a person who can't always talk become a teacher, but that's what I do. Editing? I guess it's attention to detail, but it's also language. None of my work has been in areas typically considered "boring," and a lot of the work people consider "boring"? Really wouldn't be a good fit for me. Assuming it must work for me because I'm autistic isn't going to work. I'm an Autistic person, not a machine made of autism stereotypes.