On the 26th of October I attended a symposium at St. Michael’s College on the University of Toronto campus called Reading Frankenstein: Then, Now, Next. The day was divided into those three parts, corresponding to the novel’s past, present, and future relationship to society. Here are my rough notes pertaining to the Now and Next portions of the day.

Frankenstein Now

Keynote: Josephine Johnston

Josephine Johnston studies human reproduction, psychology, genetics, and neuroscience. From the Otago Peninsula of NZ, she is now at The Hastings Center, an independent research institute founded in 1969 by Dan Callahan, a philosopher and journalist who wanted to work outside of academic institutions to grapple with ethical questions regarding science in an interdisciplinary way. Two goals:

  • Just and compassionate care
  • Wise use of emerging technology

e.g. helping people such as immigrants in detention. Questions about genomics. Tomorrow the Center hosts an event called Frankenstein in the Age of Gene Editing. The “Gene Editing and Human Flourishing” project is funded by The Templeton Foundation (interested in meaning in science). She is the author of “Traumatic Responsibility” in the annotated Frankenstein for scientists, engineers, and creators.

Frankenstein did not anticipate the results of his creation. Victor is responsible for the damage the monster does and also has a responsibility to his creation itself: responsible for and to. Shelley also shows the impact of responsibility on the self through Victor’s self-torture. Readers see in the way the monster is treated precisely the way people treat women, Black and brown people, disabled people, trans and intersex people. Victor LaValle’s graphic novel Destroyer is about a young Black man, killed when mistaken for an adult carrying a gun, who is reanimated by his mother. Her rage is like the rage of Frankenstein’s monster; she reanimates him as a destroyer, but the young man is thoughtful and doesn’t want to harm people. Racial violence in America is reflected in this book. Climate change plays a part in Frankenstein’s story with the ice sheets. Johnston found those themes resonating in the novel, which she had only just read before the conference — she hadn’t even known before that Frankenstein wasn’t the monster.

Frankenstein is a terrible scientist: he uses bad science, warns nobody, abandons his creation, gets no funding, and steals body parts. Real scientists don’t want to talk about Frankenstein; he haunts them as something they could be accused of being: creating in secret something to be unleashed, of which they will then disavow knowledge. She lists many “franken-” things, like biologically engineered mice, etc.

We have mechanisms in science to, if not eradicate, then at least deal with the problems Victor Frankenstein ran into. She does not think that Shelley wrote the novel as a critique of science; that’s just how we have interpreted it. She was writing a great story, and she loved science.

Two papers regarding gene editing, or CRISPR/Cas9. Since the discovery of the DNA double helix there have been ideas about changing genes. Some traits are dependent and some independent. Since mapping the genome, we’ve learned a lot about all the small differences that produce what we are. It was assumed it would “be great” to be able to change genes… but the methods were hard and clumsy and changed one gene at a time. CRISPR is just a much better tool to operationalize all the old ideas which, over decades, have been theorized about what gene changing could do. CRISPR uses a guide RNA to find a matching stretch of DNA, cuts it like scissors, and splices in new material… it could be used in sperm, eggs, babies, old people, just about every animal.
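As a loose programming analogy (and only an analogy: the real mechanism involves a guide RNA, the Cas9 protein, and the cell’s own repair machinery), the find-cut-splice idea can be sketched in a few lines; the sequences here are made up:

```python
# Toy analogy of CRISPR-Cas9 as targeted find, cut, and splice.
# Purely illustrative: DNA as a string, the "guide" as a search pattern.

def edit_gene(genome: str, guide: str, replacement: str) -> str:
    """Find the site matching the guide sequence, cut it out,
    and splice in the replacement."""
    site = genome.find(guide)           # the guide locates the target site
    if site == -1:
        return genome                   # no match: nothing gets cut
    # "Cas9" cuts; the "repair machinery" splices in the new sequence
    return genome[:site] + replacement + genome[site + len(guide):]

before = "ATGCCGTTAGGCTA"
after = edit_gene(before, guide="GTTAGG", replacement="GAAAGG")
print(after)  # ATGCCGAAAGGCTA
```

The analogy even hints at a real issue: if the guide pattern matched more than one site, only the first would be edited here, a toy version of the off-target-edit problem.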

CRISPR-Cas9 was an explosion, revolutionary not just within science but also in the public imagination. It was covered in Wired, Scientific American, MIT Tech Review, and Time.

There are six main concerns regarding CRISPR-Cas9:

Nefarious Use

It could be put to dastardly ends: weaponizing something. The 1918 influenza virus (which killed 50 million) was sequenced, and concerns were raised that it could perhaps be made again and released. The most obvious of the concerns.

Use of Animals

New model organisms for studying disease in new ways: chimeric animals. Animal chimera research is now more powerful — human/animal hybrids, for instance mice with 60% or 80% human brain neurons. One might suspect the discussion is about the ethics of creating little humans, but it will likely be more about animal rights in general.

Funding Priorities

What kind of biomedical research is funded? What is the disparity between the burden of a disease for the world at large and the funding available for researching it? There is much focus today on genetics and what genetics can do for society. She shows the cover of a book about the impact of genetics on education and achievement, i.e. on minorities. There is a big mismatch between what we think we can get out of tech and what solutions would actually be good for addressing social disparities or problems. Thinking the solution for educational disparities is tweaking kids’ genes is crazy compared to pursuing the harder solutions we already know work, e.g. learning styles. Funding glitzy, trendy science as the solution is bad.

Progress or Well-Being

Increases in longevity and GDP and lower morbidity might be the focus as goals. But these ends might be disconnected from, or in conflict with, human notions of what a good life is. An Atlantic article asks “Will Editing Your Baby’s Genes Be Necessary?”. Can/must we “tweak” kids to give them an advantage, or keep them from suffering? Why would any parent not do it, not provide their child every possible advantage? But this conflicts with the norms of parenthood: accepting children for who they are, letting them flourish in their own way, radical acceptance. Today we judge parents who want a boy so they can have a football-star son: the kid might turn out to be the “wrong” gender anyway, and will they still be accepted? We no longer accept that sort of thinking. Yet from a very neo-liberal perspective, what’s better than choice, e.g. of a kid’s eye colour?

Trust in Science

America has a science trust issue now. GMO protest signs. Just as with the arguments for and against GMO in food, how much worse will it be with CRISPR? New Zealand is GE-free, and is eradicating foreign predator species to protect original habitat. Different approaches here: what does it mean to trust science? There is not believing the science itself, and then there are the ethical arguments about trusting scientists themselves.

International Governance

This tech does not respect international boundaries. Gene editing is easy; you can order this stuff on the internet and do it in your garage. The people who first developed this tech raised the alarm that we have no way to manage it. Jennifer Doudna (co-creator of CRISPR, open about what she doesn’t know) contributed to a 2015 article asking for a moratorium on research until an international conversation could be had on the ethics. She created a technology and then immediately asked questions about how it could be governed. The International Summit on Human Gene Editing was held in D.C. in 2015, but with no real concrete results. Nobody is sure what a real system of governance would look like for this stuff.

Kevin Esvelt, Asst. Prof. at the MIT Media Lab, is either Victor Frankenstein or a genius. He wants to release genetically modified mice into the wild on islands to test fighting ticks, and is canvassing various island communities, trying to appeal to them and get their consent before changing their ecosystem. A small-scale example of public deliberation on this issue, of uncertain scalability.

Questions:

The Manhattan Project is an example of how scientists bear the brunt of unforeseen consequences of their work. Jennifer Doudna is hence very vocal about what she doesn’t know and her fears for the application of the tech. It would be a mistake for “us” to blame her for what she’s unleashed on the world.

Why is Victor Frankenstein a bad scientist? No oversight, not well studied. How is he mad? Fevered, working in a manic state, he’s afflicted, if not technically mad.

What does gene-focused education look like? In the imaginary scenario, it’s like learning-style-focused education but genetically informed: put people in different learning environments based on their best learning style. There is nothing innately wrong with a learning-style focus (she pays lots of money for her daughter to go to a private school), but it’s a waste of money to pursue this genetically in education. Robert Plomin is one of the most well-known behavioral geneticists on the planet; these ideas are not being put forward by fringe names.

The problem of participation/representation in public discussion: what about non-human voices (animals) and non-existent voices (the future and unborn)? She doesn’t like having discussions which lack voices, but it’s better than the alternative of having no public discourse. “Deliberative democracy”. New Zealand has “oversampling” of Maori voices due to constitutional commitments: targeting of specific groups who would otherwise not be included due to disenfranchisement or marginalization.

Brain drain from academia into private industry… don’t get funding? Go get a slice of advertisement money. Monsanto employs the guy who writes the open-source software underpinning most climate change research. Facebook is doing brain-interface research. Johnston apologizes if she made it seem like institutions have it going on or haven’t totally dropped the ball. China has more of a grip on its own research, owing to its more authoritarian government. She thinks, coming from law (in NZ), that it’s insane that America can only regulate federally funded research while companies can do anything they want. It makes sense legalistically, regarding the particulars of America’s founding. There is more surveillance in the States, which she finds consoling, and dangerous patents are suppressed. There are mechanisms on a national-security level, but they can’t prevent brain drain, or commercially advantageous yet still horrible research. Women’s prenatal care is being completely undermined by Silicon Valley… sorry, she didn’t mean to come off as Pollyannish in the first place.

Who needs big government when everyone can exercise their own control? Well, the US has its own balance between entrepreneurship, intellectual curiosity, academic freedom, and the pursuit of knowledge as its own good. That argument has a lot of weight in the U.S. compared to most other places… things that wouldn’t fly in Europe. It’s probably more about free thinking leading to advances in commerce than in other areas. The strongest assertion of the primacy of invention is the strong patent system and the export of patents to the world. You could celebrate it as free thought and knowledge, but it’s more a celebration of money. There are two stories coming out of science: scientists committed to the public good, Doudna-type people(?), scientists stepping up saying we want to be open, engaged, don’t want to be left to ourselves, accountable to the public. And then there is the opposite. The current administration of Republicans is taking advantage of its power to rail against the FDA, despite the paucity of scientists complaining about being regulated. “Right to Try” is being framed as patients’ rights blocked by the FDA, but they’re just anti-government ideologues in general.

Roundtable: Mark McCutcheon

It was the McLuhan Centre’s research into EDM and rave culture which got him into academia. He graduated in ’97 under Derrick De Kerckhove. During the moral panics afterward, McLuhan’s insights into 60s culture illuminated the cyberdelic movement of the 90s. Afrofuturism is about taking apart white technology and not putting it back together properly. Multimedia. deadmau5 performs in a cyborg costume. Since the early days of raves and techno, his field of study has moved into technology and Frankenstein. The result is his book The Medium Is the Monster.

Shelley reinvented the word “technology” for English; McLuhan redefined technology as human-made monstrosity. Technology used to mean any art or craft. The modern meaning regarding machines and tools started in the late 18th century; authors such as Bentham, Thomas Love Peacock, and Harvard’s Jacob Bigelow advanced it, largely thanks to Shelley. The novel does not use the term technology, but the term ends up being used in all later discussions of Frankenstein. The novel led to the Frankenmeme. Extensions of man as prostheses or replacements.

The real world continues to turn into science fiction. “The future of the future is the present, and this is something people are horrified of.”

Roundtable: Catherine Stinson

She formerly worked in the A.I. and tech industry… she recognizes Frankenstein in her former job.

Victor is into alchemy, but is told that “the pursuits of men of genius, even when flawed, still contribute something to mankind” (more or less), creating dramatic irony for the rest of the novel. Victor’s education is a very arrogant one; he’s a genius, and everything he puts his mind to is easy to master. This attitude is what she recognizes within the A.I. industry.

In the novel The Big Disruption, Jessica Powell writes about how entering the lobby of an A.I. firm makes one feel intellectually superior to 99.7% of the population. A.I. companies are always explicit about looking for “top minds”, which is not a thing in many other fields (according to a quick-and-dirty Google search). “The best minds”, “the world’s brightest minds”, “top minds”, “the very best minds in our country”, etc., always in A.I. She couldn’t find this kind of thing in math… most fields would be embarrassed to use this sort of language. #NotAllNerds.

Criticisms of this: what are the responsibilities of this sort of genius? Problems are usually seen as technical limitations; solutions are trial and error, rough and ready, rather than prepared in advance through study of the greater field. Diana Forsythe (1993), on the construction of knowledge in A.I., noticed that many people did no prep work and couldn’t do interviews properly. A.I. experts questioning those who worked in other fields expected them to just be able to pull all relevant facts from their brains easily, and then blamed them when the relevant knowledge was not acquired. The interviews were done all wrong.

A paper proposes an algorithm for being fair in algorithms. The Elon Musk quote “Failure is an option here…..” is exemplary of trial and error. Blaming others, or using self-exculpatory language when things go wrong. For instance, excuses like “I’m just an engineer. It’s basic research” have come back regarding tough questions about applying A.I. research to the study of gangs and gang crimes in law enforcement.

How to change the culture of A.I.:

Codes of Ethics

Engineers and doctors have codes of ethics; data scientists should have one too. Suggestions are coming from within the communities instead of inviting ethicists in… but these still must grapple with the hubris of genius devaluing other expertise.

Jobs focused on fairness, bias, transparency

There is an uptick in these jobs, but they’re for computer scientists, not for other fields and the humanities.

Ethics training in CompSci departments

Increasing Diversity in the Field

This seems good, but she is skeptical that these programs are being done in a way which is effective and being evaluated for effectiveness. Girls-can-code programs: okay, you’re training girls to code, and when you let them loose they’ll encounter a wave of everyday harassment and must either develop personal strategies to deal with it or just leave. There is no next step for how to actually change that culture.

Safiya Noble (2018), Algorithms of Oppression, discusses problems in search-engine results. She suggests input into algorithm design from fields like Black studies, ethnic studies, American Indian studies, gender and women’s studies, and Asian American studies. Stinson would add psychologists, and basically anyone who actually knows how to change cultures.

Roundtable: Dav Clarke

He shows a video of classroom students with advanced video tracking of kids’ heads, eyeballs, “attention”. All the kids are being profiled by A.I. He’s constantly evaluating the ethics, trying not to do anything horribly dystopian. Major tech companies and tech capital will do this research anyway, so schools must do this kind of invasive research too in order to keep up. Or else all knowledge will be owned by Mark Zuckerberg… though, full disclosure, Clarke gets funding from Facebook anyway.

In Shelley’s depiction of the monster, (and Asimov’s claimed invention of the computer, originally in humanoid form) what’s important isn’t two arms and two legs, but the “something” that we also have, which can turn on us, take over, harm the people we love. We can treat it cruelly or with compassion. As educators, activists, people of faith… we try to grapple with this monster which might not look like a person, but will still be stitched together from pieces of us.

To understand that you have to understand cybernetic systems. Start with the thermostat. We outsource our feelings of “too hot”/“too cold”, and the decision-making in those regards, to a totally automatic device. We’ve taken part of our mysterious substance out of ourselves… we can still control it for now, but we lose our ability to fend for ourselves. Now Honeywell and G.E., as much larger systems, have the power. How would you stay warm at night if you didn’t have a thermostat? Do you really know? Would you set your couch on fire? Manually turn on your furnace?
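The outsourced decision is tiny: sense, compare to a set-point, act. A minimal sketch of that feedback loop (the set-point, hysteresis band, and command names here are my own illustration, not any real device’s behaviour):

```python
# The thermostat as a cybernetic feedback loop: one sensing cycle
# turns a temperature reading into a furnace command.

def thermostat_step(temperature: float, setpoint: float,
                    hysteresis: float = 0.5) -> str:
    """Return the furnace command for one sensing cycle."""
    if temperature < setpoint - hysteresis:
        return "heat on"    # too cold: the decision we outsourced
    if temperature > setpoint + hysteresis:
        return "heat off"   # too hot
    return "hold"           # within the comfort band: do nothing

print(thermostat_step(18.0, 21.0))  # heat on
print(thermostat_step(22.0, 21.0))  # heat off
```

The hysteresis band is there so the furnace doesn’t flap on and off around the set-point; the whole of our “too hot”/“too cold” judgment reduces to two comparisons.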

There’s a way that, ever more explicitly, everything gets turned into numbers. Some process operates on those numbers and stuff happens.

There is good reason for anxiety, but there are solutions. The idea of labour organization is one; we can’t rely on institutions, but other systems of organizing can arise to bring them into check. How do we organize?

Roundtable Questions:

This is the Now panel… Now is when you can do things — Dav

Questioner: the denial of social constructivism, the bad scientist. If you do your science right then everything will be okay. There is much pushback against the idea that there is something that shapes how technicians do their work. As a social scientist, mocked as a social constructivist, how do I get through to engineering students that I have something to teach them?

#NotAllNerds. My only strategy, Stinson says, is to say: hey, I was one of you; you can’t brush me off as someone who doesn’t know math.

Josephine says that doctors influenced by gifts from drug companies all thought they were independent and perfectly objective. Everyone thinks they are the exception and that it is their colleagues who are influenced by social constructions. Using science to help convince individual doctors that they are being socially constructed is an uphill battle.

Frankenstein Next

Keynote: Jason Scott Robert

Lincoln Chair in Ethics. frankenstein.asu.edu.

Science today is supposed to be not lawless, but open and controlled. How did we get from 1818 to all the Franken-memes and the images of the groaning, dumb Frankenstein? Science fiction does become science fact.

1816, the Year Without a Summer: a large volcanic eruption in Indonesia. Mary Shelley was with Polidori (The Vampyre), Jane Clairmont, Percy, and Lord Byron (Darkness). All stuck inside, they decided to write scary stories in a contest. Shelley obviously won. Victor was an autodidact, worked in secret, studied alchemy.

The uncanny valley concerns human approximations like robots. Fully organic creatures can be uncanny too, like human-pig chimeras or synthetic embryos. The closer the creatures are to humans, the better scientifically for biomedical purposes. But it also gets more and more discomforting, and the response to these creatures grows vehement. Some will object to the sanctity of human nature being infested by the bestial. Some will object that we must treat human chimeras as human. Victor and others had the first response to the creature: shun and run, or fight. Einstein hinted that the world is dangerous because of those who look on at evil and do nothing. Nietzsche said that those who fight monsters must see to it that they don’t become monsters.

Another reaction is possible: love. It’s a challenging case; I draw on responsibility for and responsibility to. Patricia Piccinini sculpts an animal-chimera mother nursing her pups. “The Young Family” is a pig with a human face, repulsive in an uncanny-valley sort of way. Yet “love” is sweetly displayed.

Patricia Piccinini – “The Young Family”

Robert asks us to try to identify the love, intimacy, and communion inherent within the work. “The Long Awaited” is a young boy sleeping on the shoulder of a strange human-porpoise hybrid with toes on its fishtail. Love is more dignified and dignifying. We must be responsible to and responsible for each other no matter who we may be, no matter how uncanny. Love is actually all around. We can reinvent ethics with solidarity toward each other, re-enchant science and engineering with humane values. Will this approach put us in better stead to deal with the results of bio-engineering? That’s my hope; without hope there is nothing.

Changing professional motivations to vocational motivations involves “re-enchantment” toward the work being moral. Why did we get into the work that we do? That personal meaning could be enchanted in a way which engenders a deeper responsibility. I’ll call it love, and people can laugh at me for it, and that’s fine.

Questions:

Dan White mentions Victor saying “no mortal could bear its countenance”, so in the world of the novel that easy answer is not available. Human eyes involuntarily shut at the monster’s presence. Is it possible, in the novel, that Victor could have embraced his hideous 8-foot baby? Jason Scott Robert says that “uncanny” just scratches the surface of how uncomfortable the topics we’re discussing today are. He invokes Foucault’s notion of limit experiences: the more uncomfortable it makes you feel, the more you must pay attention to it.

Josephine Johnston asks, regarding A.I., how do we have a real love for it? [I think she’s implying that to train an A.I. to be benevolent you have to treat love as bi-directional, i.e. for A.I.s to learn love you must love them first.]

The idea of living cosmopolitan-wise. It engenders a deep commitment to ‘building’ (not revealing, since there’s probably nothing to reveal) a shared vision of the world that we can collectively buy into. An ethic of responsibility for knowledge and for knowledge production… not just going about doing what we’re doing unreflectively, but actually being worried about what we’re up to. Allowing what we’re up to to keep us up at night. Back in the old days, worried lab students could leave their worries in detailed notes, elaborated in a take-home lab notebook. Of course, regulations are different now, and the kids can’t take it home and ask deep philosophical questions during the night in their own private log book.

Robert asked students why they use rhesus macaques. “Because that’s the one everyone uses.” When asked why, they had a difficult time figuring it out. Pathways like that are very hard to address. People who work with multiple animals are untenurable. It’s “canalization”: experts get stuck in valleys they can’t get out of. Doing what you do unreflectively, you lose the genuine passionate connection to what you do… this passion must be re-ignited.

Mark Canual says Victor has not just given up on love, but on communication. Someone with loose commitments could do something.

Reading Faces, Reading Minds: the problem is largely sight.

Dav points out Victor’s terrible self-care, which is a powerful part of the story: its abject failure. How do we train scholars to love, and to love themselves? Part of that could be self-care. Part of his program involves going to a retreat at a monastery involving self-care. Who are you? Are you dying a little inside? Or are you thriving? And if you’re dying a little inside, what can you do to fix it? We can’t give it to them, but we can give them room to self-diagnose.

Question: maybe humans are transitional. I’ll die, we’ll all die; maybe we should look at giving over the planet to a new species, and at what we can do for them. “We must expand the circle,” responds Robert. The questioner asks what to do about the instinct to destroy the inheritors of Earth before they come. Can we have love?

Roundtable: Michael Sims

I type; I hide out at home a lot and type all day. When I write (say, Arthur and Sherlock, about what went into the creation of the rational detective archetype), I go out on tour and see things differently than if I were in rural Tennessee, where I grew up. Book tours offer the ability to talk to many different people who read books. Things come out of nowhere… slowly my map of the world has become really a map of literacy.

At my stop at the Merril Collection library there were some quotes from readers that boiled things down. “The bottom line is that Victor Frankenstein was a dick,” said a lady. We spoke for two hours about what it means to be human. Another was that Shelley made life as messy as women who have borne children knew it was. Communication was important, too. Sims’ son began to stop paying attention to people after age 3, as autism became more and more evident. Therapy helps, but my son reminds me of Frankenstein’s monster whenever he stands there struggling for a word, trembling, pent up and about to burst. I keep thinking about Sherlock Holmes and Frankenstein together. The classic texts survived because professors teach them. But, also, very passionate readers still exist and still engage. Sherlock is very protean; there is so much in him. Different actors give different versions, e.g. Cumberbatch gives the autistic version, Downey Jr. the fisticuffs version.

Balloon flights, the discovery of Uranus, the ideas of deep space and deep time were all happening around the time Shelley wrote the novel. Coleridge attended the conference where the word “scientist” was chosen to replace the term “natural philosopher”. “I attend Davy’s lectures to increase my stock of metaphors.” All this stuff inspired literature, poetry, science fiction, sermons — these fantastic tales. If you imaginatively explore science as it is, then you are immediately looking into the future. Cut to Jules Verne describing a submarine and you’re looking into the future. “The Senator’s Daughter”, written in the 1800s, takes place in the distant future of 1937 — sci-fi is more convincing if people are just living their everyday lives. Travel by pneumatic tubes, constant incoming teletype news in every home, and the whole issue is interracial marriage. The same author wrote “The Clock That Went Backward”. Alice Fuller (a pen name) wrote the first romantic robot role. There are still people who care, who are turning toward these stories… they won’t just be cared about by academic institutions, but will exist on their own because of their properties.

Roundtable: Yulia Frumer

The future is shaped by the past. There is a lot of fear regarding artificial humans in the U.S. Japan is totally different: robots are friends, and Americans are weird to be afraid of robots. In America, robots are pathological; in Japan, the fear is pathological.

There is a strong case that feelings about electricity and technology were socially constructed in this way. “Sparks of life” and “galvanism” were clearly on Mary Shelley’s mind. Galvani was the scientist who made frog legs twitch with electrical current. Giovanni Aldini did this on cadavers and executed criminals — there was lots of discourse about electricity and the revival of the dead, perhaps without a soul. “Lola/Without a Soul” (1914) is a film about a revival resulting in a soulless, evil thing. Meanwhile, in Japan, electricity had much better connotations, compared with chi, a vital energy for health, rather than something that animates the dead and creates soulless automata. Also, in the U.S. the electric chair is a symbol of death, while in Japan there was no electrocution as execution.

Historical approaches to the theory of evolution are also interesting. Consider the view of artificial humans as “a next step”. In the West there is “survival of the fittest”: you must beat out others to survive; those who live do so while all the rest die, a zero-sum game. In Japan, botanists and biologists did not accept this: yes, there is struggle, but there is primarily symbiosis and mutual aid.

Fiction in the West is frightening, with terrible artificial humans: R.U.R. See Lang’s Metropolis (1927), the chair infusing the robot with life while it sucks life from a real human being, who dies once the transfer is complete. In the 1931 Frankenstein we have all the ominous scary things about lightning, very important for the social construction of fear regarding the creation of life. Frankenstein cries.

A Japanese artist saw this, said it was wrong, not science, not biology, and made his own robot. It was affable and had rubber skin. We don’t make it a slave; we don’t want it to be a slave. The robot holds a star and is about helping people. If you make a robot in the form of a human it can’t be a slave. Beauty is a social thing — the artist said he didn’t want to build something beautiful to just one society, but something made of all the races of the world so that all are represented.

Astro Boy, the robot son. The doctor is still mad. The scene of creation is mad, like Frankenstein, but not scary: Beethoven’s Fifth plays as the maestro mad scientist creates his new robot son. Mori Masahiro, now 91, was in engineering school at the end of the war. In the 50s he worked on prostheses, getting back to disfigured human bodies (especially after the war and the bombings of Tokyo). What’s important is how we construct the future: in the West we remember the “uncanny” part of his article on artificial limbs, but its intent was normalization, the affinity with our limbs. He was influenced by technological systems. You can’t just think about the goal of design or the kinds of technologies; you have to think of the whole system of production and consumption that this entails. How to build robots that don’t trigger us, but instead are something we feel at ease with.

Questions:

Avery Slater quotes George Dyson’s “Darwin Among the Machines”: “I’m on the side of nature, but nature is probably on the side of the machines.” Who will invent the future? Blade Runner: it will be corporatism, and the artificial humans must blend in. Also, how do we love a complex surveillance system? (Ava in Ex Machina.) Think about people who work on how to turn A.I. off. In A.I. we see more golem than Frankenstein’s monster. Golem: can we turn it off? Frankenstein: can we turn it on? Footage of insects driving robots: a silk moth patched into a little turtle-like robot which it steers around, trying to get the robot to understand the moth and vice versa.

Yulia: in Japan there is no dichotomy between man and machine, or nature and machine; it has a linguistic origin. Japan had no word for nature in the Western sense, so the Japanese translation of the English word “nature” was a truncated Japanese word that included the concept of machines too.

Sims is interested in the Turing conundrum: not being able to read the robots anymore. Yulia mentions that training A.I.s means getting them to read the environment, and humans are the hardest things for A.I.s to learn.

Jason Robert doesn’t want us to love surveillance systems; rather, re-enchanting technology is the way forward.