Head: A Short Story

John was convinced that he was a machine. There was input, and there was output. He put food in and faeces came out. He put water in and urine came out. He put LSD-25 in and he had experiences that are dulled by clichés in the telling. He was sold health plans hedged in the rhetoric of the automotive industry. Working order, tune-ups, oil changes. Machines tend to have purposes. He wasn’t entirely sure what his was, but he understood the functions, whatever end they were toward. He wasn’t sure where he was built. He had memories of his early childhood, sure, but he couldn’t remember the moment he was switched on, the moment he attained consciousness, or a very effective simulacrum of consciousness. He tried to look into it but he just found pro-life literature on the topic of souls. If he was ever in a womb, with a soul, he certainly couldn’t recollect it, and if he was ever in a factory, without a soul, because he was pretty sure he didn’t have a soul, he couldn’t recollect that either. There was no command to view uptime. He just didn’t know.

He didn’t dislike being a machine. He reasoned that, for a machine, he had quite a few friends. In his addledness, he was too addled to realise that his friends were addling him with various inputs both chemical and information-based, but they were his friends, he reasoned. They didn’t move when he sat next to them. They answered his phone calls. They remembered his birthday when he didn’t. He discussed things with them and he input the same things they did. One of them turned him on to Huxley and he latched on to the theory that psychedelics don’t actually stimulate activity in certain parts of the brain, but instead dampen activity, and that the dampened areas were what could be referred to as a reality filter, parsing the information from reality and making it sensible to us. His friends introduced him to ways of turning off these filters. DXM, LSD, DMT, THC, 5-HTP. They introduced him to ways of sharpening and understanding these filters. He interpolated the DSM-5 into his own patterns of thinking. He knew that, as a machine, he had levers, circuits, switches, and that there were ways of pulling them, shorting them, flicking them.


-Reality is just electrical signals interpreted by your brain.

Have you been watching The Matrix again? Fucking hell John, I’m tired of this shit and I’m tired of your babble. We’re just trying to have a good time, not pierce the fucking veil.

-But when your brain stops working, reality goes away.

And it comes back when your brain starts working again. It’s still there, for fuck’s sake. You just filter it for a while. Things are still there when you aren’t.

-How do you know?

Oh, christ. You were in my first year philosophy seminars weren’t you? I’d still be studying if it weren’t for you. There’s questioning towards a deeper understanding and there’s questioning to draw an argument out and around, and guess what you’re doing?


-Reality is that which, when you stop believing in it, doesn’t go away.

-Have you ever read any Dick?

-A bit, but I prefer my philosophy peer-reviewed.

-When the electric ant cuts the tape, reality goes away. That tape is in. My. Fucking. Head.

John’s was a very sensitive instrument. He had a habit of letting things get to him. There’s drugs and there’s information and where the twain shall meet you get a lot of mad fancy about MKULTRA and chemtrails. John tried on ideas like he tried drugs. Smell it, swill it around, sense the mouthfeel. You’re supposed to spit it out, though, and he never did. One overarching theory gave way to the next but little bits and pieces that dropped off would begin to snowball and he would become pretty sure that he was a machine and that only he experienced his reality and that with enough experience, lacking the manual, he could change it for himself. His friend had been right, though. He had been that guy in the first year philosophy seminar.

His friends, too, had gained experience with his machinery. They knew that pressing certain buttons would lead to certain undesirable outcomes, say, vibes that would cause a mondo bummer, man, and so they learned how to steer him and how to stop him exploding. Not that the information in his head stopped accumulating and feeding back. Curiosity got the better of one of them, when he first confessed that he believed himself to be a machine.

You mean, you are certain that you are not human?

-Fairly certain.

How did you figure this?

-I have inputs and outputs. I respond to certain stimuli in a consistent manner. I can manipulate what I perceive to be reality by various methods.

You’re human, that’s what humans are like.

-Yes, so we’ve been led to believe. But only I can appreciate reality. I cannot know things except for how I know them. I cannot be empathetic when I cannot forget myself and be someone else.

You can imagine.

-Yes, I, can imagine. That’s socialisation.

Yes, it’s socialisation, we’re humans, we’re socialised animals who experience the symbolic realm. That’s what makes us unique, if anything. Animals do not have these crises and believe they are machines. We do, and that means we’re human, not machines.

-Or we just have a specialised way of parsing and applying information gained through certain stimuli. We understand ourselves differently, but so much of it is filtered. Animals act like machines.

Dude, I’ve seen you swat flies. You hate them. You automatically whap out with your hand and they become mush. Mush, John. Have you ever seen any gears? Any cogs?

Any gears, any cogs? He didn’t know. He couldn’t say. It was possible he had and they had been filtered. If he was a machine, he had programming, and he didn’t know where it had come from. The transposition of certain images? It was possible. He didn’t understand his own set of electrical signals but he knew he could manipulate, amplify or diminish them. He knew so much and it just gave him more questions. He considered the gears and the cogs as he came up on a dose of DXM. In that first year philosophy seminar, he had heard someone say that the truly intelligent know how much they don’t know. He didn’t know how intelligent he had been programmed to be, and he couldn’t know how intelligent anyone else was. He did know how to calculate the dosage for a fourth plateau DXM trip, though, which he had been planning for a while. He was turning off filters. The trick in making humans believe they are humans is making them aware they have a corporeal body. When John crested the fourth plateau, he no longer had a corporeal body. He had a set of experiences encoded in electrical signals that he remembered from a life where he had been led to believe he had a body. Cogs and gears. The measuring of intelligence. First year seminar questions. Can you program a machine more intelligent than yourself? He didn’t know who had programmed him, but he was certain, now, that his manipulation of the aforesaid was fairly strong proof that he was indeed a machine that had achieved some degree of self-knowledge, self-mastery. He couldn’t be sure though. He needed to find cogs, gears.

As he stumbled back down the plateaus, he decided it would be a good idea to conduct a more conclusive experiment. He had dropped out of first year philosophy, too. All that talking, all the thought experiments, no real results, just more questions. Around the point where he was again aware that his body was corporeal, he took some LSD. The DXM didn’t just disable filters, it put up stronger ones so that reality could not get through at all. The LSD would allow him to see it as something like it was, taking things down without raising too much else. This needed to be a conclusive experiment. As he readied the knife that his earthly body had found, the sharpest, most serrated one in the kitchen, it occurred to him that he was about to commit a form of sudoku, before he then realised that the act was actually called seppuku. The sudoku was a stupid joke one of his friends was fond of. He had no control over that word appearing in his head, if it was even in his head that it appeared. Language as virus causing machines to realise that they are in fact, as opposed to. What? He’d forgotten.

The signals that might have been interpreted as pain were not interpreted by John as such, but we cannot know. Reality was his and he was discovering himself as not himself. As his midriff opened, a curled, coiled mass spilled out.
