The economics are so familiar that nobody actually gives them much thought, any more than people give serious thought to explaining what food is, why bathrooms are necessary, and how those two things are linked. Explaining money, resources and scarcity is no more necessary than going into gravity or breathing.
But if all a space opera does is translate modern problems into a setting with spaceships and rayguns, is there really any point? A Song of Ice and Fire could be translated to a space opera easily enough. Star Wars could be recast as a fantasy. The choice of setting is little more than aesthetic. But sometimes there's a compelling reason to pick a genre. Lord of the Rings would feel different if the magic was just sufficiently advanced technology. Frankenstein's monster could have been a homunculus or golem but those would have been monsters of the occult, old and familiar. The Creature must be a product of modern science, a magic not stolen from the gods but of man's own devising. Gepetto making Pinocchio out of wood relegates it to fairy tale but Noonian Soong making Data out of a positronic brain and blinky doodads makes it science fiction, somehow more plausible.
I'd like to have conflicts that remain believable but require a scifi setting. Star-crossed lovers? That could be in Verona or LA. Two brothers struggling for control of the family business? That could be Memphis, either Egypt or Tennessee. But the lovers might not be of different classes or races but different species. A freeborn prince of the financial empire falls for a genetically-engineered pleasure slave? Different. The brothers are actually a series of younger clones and their "father" pits them against one another to see who is the worthy successor? A little more interesting.
So, what are the conflicts?
Practical
It comes down to something that makes sense. Nobody has to take a lot of time explaining it.
Economic. You want a new market to sell your products to, access to raw materials, or transit through a region to get there. Someone stands in your way. Or you don't feel you're getting a fair shake and you can't settle your differences in the marketplace.
Territorial. They have land you want. Access to markets isn't enough, you want it all.
Practical Politics. Who is in charge, who calls the shots? I subscribe to von Clausewitz's suggestion that war is politics by other means. "To jaw-jaw is always better than to war-war," as Churchill said, but sometimes the only way to get what you want is to take it. This could lead to war between polities or civil strife within a polity. Power struggles often turn ugly.
Doctrinal
This is not going to be a necessary conflict, though the people involved may feel differently.
Religious. Something about what the other guy believes is so repellent they must be disabused of it by any means necessary. Could be an understandable moral failing like slavery or something silly like sexual practices.
Philosophical. Presented as more reasonable than a religious belief. Slavery can be condemned as evil in the light of reason, but the same righteous reasoning can be applied to a conflict over eating toast butter-side-up or butter-side-down, something that is ultimately quite silly. And while one person will argue that capitalism vs. communism carries the same weight as slavery, another will argue it's buttered toast. I would also call this impractical politics.
The practical stuff is all familiar. Hitler wanted another country for lebensraum, Space Hitler wants a planet. Same difference. Imperial Japan wanted access to oil and raw materials, Space Japan wants access to antimatter and magnetic monopoles. The United States sends Nixon to China to normalize relations and open markets, Space America sends Bat Durston to the Empire of Space Amazons. It's all familiar. Maybe Space Germany is flooding the market with cheap automation machinery that ruins the value of labor on your planet. That last one is called the luddite fallacy by economists who insist new jobs will always open up for the displaced. We're entering an era of structural unemployment where there simply aren't enough jobs and entire classes of people will be shut out of the economy. It's actually going to be a very immediate problem and not something for the realm of scifi.
I think that Doctrinal disputes will be the avenue for the most esoterically scifi of conflicts. We can see culture shocks and conflicts where ideas are seen as poisonous. What happens if biological immortality is discovered? What if brain backups and clones allow multiple copies of the same personality operating in a society? What happens if a post-scarcity society exists in the same geopolitical space as a scarcity society? We flip out over polygamy, homosexuality, incest, certain sex acts, cannibalism, etc. Blasphemy and apostasy are hot buttons for other contemporary cultures.
Could transhumanism cause the same level of disgust as transsexualism does today? Last and First Men brought up the idea of creating successors to our own humanity, and Brave New World had not just ubermenschen but untermenschen created to serve society. While it may be worrisome to imagine designing a superman, crippling a human mind to make a better service animal feels as repugnant as foot-binding. Dune gave us a jihad against thinking machines. When what it means to be human becomes fluid and open to debate, some might decide to say "NO! It's not up for debate!" and tell us what the answer is. This sort of thing was postulated in the Night's Dawn Trilogy, where those who embraced advanced biotechnology along with consciousness and memory transfer came to be known as Edenists and those who rejected it were Adamists.
There's thinking that certain scifi technologies could be considered too dangerous such as causality-violation weapons, certain kinds of nanotech and bio-chem weapons. David Langford created the concept of an image that could hard-crash a human mind just by looking at it, something he called a basilisk. Other ideas that have been floated are perfected brainwashing techniques that could be every bit as effective as love potions from fairy tales. Simulation as Lotus Eater Machine and virtual reality = the ultimate drug have come up before. Red Dwarf's take on the fatally addictive game Better Than Life is a personal favorite. It's easy to imagine a Women's Christian Temperance Union going after VR saloons.
So, this is not new, the idea of things that are taboo because they are too inherently dangerous, morally corrosive, or distasteful to be tolerated. But this is the future. Can we think of really good new ones? Or bring up old ones that have been forgotten?
Did some more thinking about the transhumanism thing. I think that could be the real ticket if played not just for body horror (graphic destruction, disfigurement, rearrangement, transformation) but for mind horror. I mean mind horror as something different from psychological horror, which is an exploration of the fears and vulnerabilities shared by humans; I mean the fear of being transformed into something other than human, mentally.
If we subscribe to the religious notion of morality then what is human is what we are, some deity made us and gave us a book, it's been decided. If we sweep all that away, we realize that our morality is for us to decide and thus we can choose what we call human, what we call right and wrong.
So right now we have a body modification sub-community that uses crude methodologies to transform the body. Tattooing, scarification, piercings, voluntary amputations, bifurcations, genital modification, castration, gender nullification, it's all pretty nasty. But up to this point the only mental changes are cultural. Everyone is working with the human brain 1.0, just adding different social paradigms. Some people may have deliberately turned themselves into freaks but be compos mentis, at least as far as the courts would judge. Zombie Boy (the guy with a full-body skeleton tattoo), Lizardman (the guy who's surgically altered himself enough to pass for a bumpy forehead alien in Star Trek), these people aren't diagnosed with any mental illness though most people think they're out of their minds. Someone who pop psychologists might consider to be mentally ill like Michael Jackson can radically transform his outward appearance but the knives can't touch his mind.
The next step in body horror is adding functional new parts. What if the Lizardman can graft on a functional, living tail? What if some Indian popstar can alter her skeletal structure to add extra arms? It would actually require modification of the brain to allow for independent control of each new limb. And if you can make brain modifications like that, you can get into proper mind horror.
In "The Call of Cthulhu," Lovecraft presented the possibility of us becoming like the Great Old Ones, beyond good and evil, killing and ravening in delight. And we have the biblical example of the Tree of Knowledge and the Tree of Life. Having eaten the fruit of the one tree, we have gained an awareness of the world that separates us from the other animals. If we are made in God's image, we have now stolen for ourselves the knowledge of good and evil, we have moral agency. Animals are seen as incapable of sinning. We have the same knowledge as God but not his wisdom so we know better but still do wrong. If we ate of the Tree of Life, we would not only have immortality but possibly the power of God. We wouldn't just be children playing with matches but children playing with nuclear bombs.
We have the cultural horror of the Strawman Agenda. Muslims are going to put us under Shariah law. The communists are going to take over and make us a vassal state. The Hollywood liberals want to turn our daughters into sluts. The gays want to recruit our sons. But the thing about heavy-handed doctrines is that people who grow up under them can still reject them. Many kids who grow up in fundamentalist Christian families reject the ideas. Kids from fundamentalist Muslim families find western liberalism far more engaging.
(comment continued)
But what if we're not just talking about male and female genital mutilation and banning sex ed and science? What if the very structure of the brain is rewired?
Let's avoid pissing contests between monotheistic religions. Let's look at Buddhist horror. This is the divide between individualism and collectivism. Western religions like the idea of a personal afterlife, a preservation of the ego through all eternity. The thought of losing individual identity in Nirvana is nightmare fuel for any adherent of the Abrahamic religions. We get freaked out by cults where people dress and act alike but we know that brainwashing techniques can't truly reshape the human mind, not yet.
Now some might say "Well, Jolly, this is no new epiphany of yours. You're just coming around to the Borg from a different route. And you already mentioned Brave New World. What's your point?"
My point is we can already have human cultures with the same mental hardware feeling very alien to our own way of being, our own sensibilities. If we not only fundamentally disagree over what makes us human but can change the very platform humanity operates under, not only how we think about the world but what we use to think with, things can only get more alien and potential for conflict can only multiply.
The one caveat I'll put in is that I really dislike putting strongly superhuman intelligence into conflict with bog-standard humans because you're either talking about man getting crushed like an ant (man v. Cthulhu) or cheesy, deus ex machina victories that feel like cheats (Picard v. Borg, Man v. Martians). I don't like author cop-outs.
One other thought based on something Charlie Stross wrote.
"I'd like to take an optimistic view of the future. Perceiving it as dominated by the bankrupt ideologies of the present is unpalatable. But so is bloody-handed revolution. As for attempts to redefine humanity ... that's all well and good, but an irreducible hard core of actually-existing humans don't want to be redefined, and acts of redefinition involving razor wire and guard towers always tend to end badly."
Which is true. So one way of looking at it is that the radical transhumanists pushed out to the stars to put some distance between themselves and their mainstream culture, distance both literal and ideological. And perhaps it was a good deal of distance indeed. But gradual improvements in technology over long time scales could make space a smaller place and put those radically incompatible world views back into closer contact.
So, who is the aggressor? Are the transhumanist pilgrims returning to evangelize to and convert the people they left? Or are they content to peacefully preach by example while respecting boundaries, and the leaders of the mainstream society find even that an intolerable threat?
So long as one side of the fight is human, I think most of the motivations behind war can be found in the seven deadly sins. The interesting question is how these motivations are passed from leaders to the nation as a whole ("was it the wicked leaders who led innocent populations to slaughter, or was it wicked populations who chose leaders after their own hearts? On the face of it, it seemed unlikely that one Leader could force a million Englishmen against their will..." The Once and Future King).
One thing which could make an interesting twist is the rise of super-empowered individuals with access to advanced technologies. Consider the anthrax attacks on Washington DC after 9/11. No group ever claimed responsibility and it is possible that the attack was the act of a single person (a sort of bio-Unabomber). Aum Shinrikyo also carried out CBRN attacks on Tokyo with a very small group. Now instead of war as we traditionally think of it (a contest between States), it is people against the State.
Even farther out, when the power of the State is no longer sufficient to ensure some disgruntled individual cannot inflict havoc for whatever cause interests him/her, then "war" might become contests between individuals (oddly, something like the superhero comic books transformed into movies), and motivations for fighting might revert to the ones we see in the Old Testament or the Iliad.
The future will be stranger than we can imagine.
I hear what you're saying about individual super-powers. I've seen others point out just how much power has been placed in the hands of fewer and fewer individuals over the years. While individuals could commit acts of political murder since the dawn of time, computers and biotech really raise the potential for bad juju. Also considering Jon's Law about interesting starship drives doubling as WMD's.
In the here and now, private citizens don't own WMD's. They certainly don't own atomics. The idea of family atomics, as in Dune, seems a bit outsized. Then again, in Dune the various houses individually wielded more power by various measures than the two superpowers did at the time Herbert was writing. Hell, the NYPD headcount has been as high as 41k. In some eras, that's bigger than an army. All that for an entity that isn't even really a city-state.
With the idea of a self-replicating industrial base, it does seem plausible that what we mistake for an entity is a collection of sovereign individuals.
The one other thought is that it only takes one side to declare unilateral war. If you have six major powers that get along and a seventh that declares general war, you've still got a war. It doesn't matter whether or not that seventh power's rationale for war makes any kind of sense, doesn't matter whether or not they can actually win the fight. All that matters is they have a reason they believe in and they're doing it.
I need to give Blindsight a read to get a handle on how the alien races are presented: "a universe in which sapience (that is, self-awareness, sentience, and the empathy that goes with it) is unnecessary for advanced intelligence and creative thinking. In fact, it's inefficient, tending to lead to solipsism and wasting resources on pointless endeavours like art. Apparently most other species in the Blindsight universe may not be sentient at all, despite possessing vast intelligence and the ability to travel the distances between stars." Since I haven't read the book I'm having trouble with what they're getting at.
I can buy the starfish alien concept where their ways of thinking are so alien that we cannot meaningfully understand their point of view, model their thinking process, or communicate. If they don't obviously use tools and create technology then we might not even recognize them as such.
But non-sapient, unconscious, and unaware are used in ways that don't quite seem to fit.
If a human society spawned an entity like this and it's hostile towards all the other human powers, it's pretty much a Berserker race. And that's pretty much a variant of bug war/robot war.
Wow, many interesting things to answer here. In no particular order:
If we subscribe to the religious notion of morality then what is human is what we are, some deity made us and gave us a book, it's been decided. If we sweep all that away, we realize that our morality is for us to decide and thus we can choose what we call human, what we call right and wrong.
'Morality' is the set of rules in a society that define what is right or wrong. There is no such thing as a 'personal morality'. Every society has a morality; those that we would see as 'amoral' in fact have a very different (and incompatible) morality from ours. (For example, 'Might makes Right' can be a morality.) People, on the other hand, have ethics, which are personal.
Religion is one of the ways morality can be defined and enforced. It is especially effective at it, as it has a superior guarantor (be it God, a pantheon...). It also means that religion-based societies tend to be more stable, but also to adapt less well to change. That's a trend, not a rule, of course. It is also why people are less prone to try to change their society's morality in a religious society.
But in a liberal enough society, it can still happen despite religion, while in a society with a strong enough ideology, or a conservative enough one, people will be prevented from trying.
What is possible is that future societies will be more liberal, and thus more prone to change according to individuals' ethics. Lowered scarcity could explain that, for example. On the other hand, we could see new societies with extreme ideologies (religious or not - look at totalitarian states), which would prevent those changes.
IMO, while technical details will change, many human motivations will be there as long as there are humans. That's why I said in the previous thread that the Thirty Years' War could be the basis for a great epic tragedy in a space-opera setting.
The technical details would change (the appearance of firearms, 2D battles...), but the human motivations could be carried over with little change (war fuelled by ideology, many struggling powers, pillage of civilians by warring armies, use of mercenaries).
If you don't want it to feel 'copied' from History, some points can be changed. (Adding or removing some factions, who is the neighbour of whom...) But the tragedy of an ultimately avoidable conflict, fuelled by the greed of some and the madness of many, which caused so much destruction, is probably a timeless story.
Compare the various Shakespeare plays that continue to be adapted to other times: King Lear, adapted to feudal Japan in 'Ran', or the modernized versions of Hamlet or Romeo & Juliet... Those stories are about humans, and will be timeless as long as humans are around.
Now, for new stories made possible by operatic settings.
The idea of mind horror is quite interesting. I remember a book with loose conversion viruses, used by religious cults, propaganda machines and such, that make you feel religious/patriotic fervour, dream propaganda dreams... So you can't even trust your own feelings.
To go further, means of slowly changing someone's mind open many interesting avenues of storytelling, whether the person wants it or not.
You can have memetic weapons: pieces of information that merely knowing may change how you act. It's already how disinformation works (particularly during wars), but it may become an entire technical field, making them far more dangerous (a text making you suicidally depressed, an image driving you mad...)
The human-transhuman conflicts are also full of possibilities, because one side may not be human anymore, and as such may not act like a human opponent. Scary Dogmatic Transhumans can be far scarier and more dogmatic than dogmatic humans, because they may use alien logic.
If they became a hive mind, for example, their psychology may become completely different. They may see individuals as an aberration, or as drones. That said, there are very few examples of hive minds done right in SF, AFAIK.
Lowered scarcity may also let new ideologies appear. The right not to work may become a fundamental right for some. Or the right to information (some countries have already declared that cutting someone's access to the internet is unconstitutional).
Terrorism (people vs. states) may become a bigger problem, as Thucydides points out. On the other hand, new ways of preventing it may appear: mind-reading devices, or even brain implants to control people, omni-surveillance... The consequences of those, and their slippery slopes, can be explored. On the other hand, it may feel like an allegory for, or extrapolation of, current problems, which may or may not be what you want.
Another point is that information transmission is no longer instantaneous. This will have consequences, though I'm not sure which ones exactly. There is also the fact that, basically, there is near-infinite territory to be explored and/or settled. And factions will have more inhabitants than today's entire mankind.
What I tend to do is place humans with human motivations in the setting and 'look at what they are doing'. It's a little trickier with alien transhumans, so I'd tend to push them into secondary characters/forces of nature, but exploring their psychology, while tricky, can also be interesting. (Getting a hive mind right is one of my long-term goals.)
Multiple comment reply....
'Morality' is the rules in a society that define what is right or wrong. There is no such thing as a 'personal morality'.
Well, when I said "we" I meant collectively as a society.
But it is possible to have nested sets of morality. We can all agree not to rape and murder. That's basic morality for the whole society.
A religious subset might find drinking, gambling, dancing and fornication to be objectionable and hold themselves to that standard while allowing others to do as they will. Conflict comes when they apply their personal standards to the rest of society.
Personally, I find graphic depictions of sex perfectly fine within the context of porn and erotica but find advertising offensive and the use of sex in advertising to be obscenity. I find advertising directed at children to be mental molestation, ethically no different from physical molestation. I'm hard-line on this and understand few people would agree with me.
You can gather this set of ideas under morality/ethics/philosophy/way of living. The definitions can get a bit hazy with people using different words to mean the same thing.
For those who make the argument of "We need a moral basis for society and, if not from these holy books, then from where?" there is an answer: the Science of Morality.
If an action is not illegal, then it's personal choice. I can choose to dress like a biker with a beard and tats, or I can choose to dress like a Republican. Then people object to how the bads are classified. Homosexuality was a crime just like miscegenation, incest, pedophilia, necrophilia and cannibalism. The gays say that's not right and are fighting to be treated as normal people. The same goes for mixed-race couples. Then NAMBLA says "We're just like that!" and the gays say "Like hell! Children can't consent. Don't act like you're just misunderstood."
You can get into a whole snarly argument about the age of consent. In America we're used to the idea of 18 being legal, and the idea of dropping below that seems immoral even though it's an arbitrarily drawn line. Why 18? Why not 21, like with alcohol? It's a conversation that never ends well.
Your points about tragic, avoidable conflicts are well-taken. I think in the abstract the really good conflicts will be between ideological fellow travelers who agree on the big picture but not on who is in charge, and between rival ideologies whose entire world views are mutually unintelligible.
So for example, the protagonist's star empire has standard humans who idealize the human mind and form, have exacting standards on what humanity is (even though we might disagree from our 21st century perspective!) and reject transhuman or posthuman concepts as immoral and evil with the same sort of revulsion we have now for raping, killing, and eating a child. The thing is, these are the guys we'll identify with the most!
The alternative would be groups who embrace going weird. How weird? Up for debate. While I don't want to go with mind uploads in this setting, that's a prime example. Making duplicates, clones, running parallel minds, death of self, maybe not full-blown Borg but going weird? Sticking human brains in vats and letting them run starships, modifying minds so that truly alien thinking patterns develop, radical new body plans, social structures, etc.
So within the Human Idealization society, we could see power struggles we're familiar with. Protagonist has intellectual and political rivals, romantic entanglements, stuff we can understand. Internal struggles.
There can also be ideological conflict with rival sects. We've seen with religion that even when we have a set of books handed down from the gods, human interpretation of what is divine with curses directed at any who alter the texts will still give us schisms. So, we agree that the Human Ideal must be preserved? Well, our branch says don't modify the body of humanity and you say don't modify the mind. Heretics! Pew! Pew!
Between Human Idealization and strongly post-human societies, you're looking at radical disconnects, black and white morality running into blue and orange morality. And that's where mind and body horror come into play. You may disagree with your brother about the proper course for the family business but you love him as family. You may disagree with a political rival's plans for the economy but respect his mind. You may find body modification repellent but the practitioners only allow it after the age of consent and never modify the mind. You do not condone but you do not condemn. But all of you, brother, rival, and body mod freaks, all of you find the Other Guys reprehensible because they perform nonconsensual mind modifications on their own kind, removing free will and self-determination. They don't even understand such things. You call them Human Desecration societies. They call themselves Perfectors. Pew! Pew! usually follows.
Memetic weapons are cool but I think they're highly susceptible to wanking. Personal definition of wank: (1) major self-indulgence. (2) Pushing a pet idea so hard it overwhelms the setting. Nano-wank, bio-wank, singularity-wank.
The thing is, calling wank is subjective. One man's brilliant insight is another man's wank.
A really, really cool story about a lethal image: comp.basilisk FAQ.
Scary, Dogmatic Transhumans. Yes, that could be frightful. Same with Scary, Dogmatic Human Idealizationists. The protagonists could be caught in the middle of a fight between these competing orthodoxies.
Regarding hive minds: Yeah, they're one of the cliches where I'm not sure how to do them differently/better. It's kind of like rebellious robots which are just slave rebellions with shiny, metal asses... It's been done.
It's kind of like with the Matrix, the implied backstory would resemble the Terminator machine war but you only think of that after the movie's over. Thus the first one feels like a really interesting alternate take to man/machine conflict. With the two sequels, the machine war is played straight and you're left feeling "So yeah, this is pretty much exactly like Terminator only not as good." Sorry, flying squids cannot compete with killer skeleton robots.
The only possible way to make a robot war interesting is a variant on the Agent Smith explanation. Rather than giving the humans a crappy world to be miserable in, the AI's that take over have done so seamlessly and incontestably, but realize humans would hate being kept as pets. The humans are therefore set up with a G-rated, GI Joe war where android mooks can be gunned down bloodlessly, the heroes always have time to bail out of their damaged vehicles, and the robots keep getting defeated only to come back with a new plan. Sure, the occasional death might happen, but it serves the human need for self-sacrifice and helps to define what life is about in contrast.
You are correct in that I don't want to make this story Allegories About Today's Headlines in SPAAACE!!! But the idea of constraining what individuals can do in a highly empowered society remains relevant.
More areas to explore, more inhabitants: This is actually the brain-fault I'm running into. Post-scarcity societies imply little need for workers to keep things running. That's the point I've made earlier: nobles only keep peasants around because they're useful. If land and capital generate wealth without the need for much labor, why keep them around? A planet that goes from scarcity to a post-scarcity economy may be stuck with its population, but a post-scarcity colony set up in a brand-new star system, what would that even look like? We're not terraforming a planet to Earth-standard just so we can grow grain to ship back to Trantor. "Ok, smart guy, then what are we doing?" I'm at a loss. More thinking required.
Setting humans in the setting: Yeah, that's what I'm trying to do but I don't want "ants scurrying around the feet of the gods" here.
In the Orion's Arm setting, it takes the Singularity all the way to the end-game. Hyper-powerful AI gods, designer worlds, vast wars over incomprehensible conflicts.
Why are there still bog-standard humans? It's explained like this. Multi-cellular life did not displace unicellular life. There's room for algae and protozoans and plants and animals and people. We all serve a role in the ecology.
More specifically, the higher AI's see an inflexibility in their thinking based on how they arose. While they may try to compensate for those intellectual blind spots, that's the sort of thing you have trouble seeing, even when you look for it. Therefore the lower orders of intelligence represent entirely new solution sets whenever they uplift, providing new insights the existing AI's might not have.
I'm sure good stories can be told in that setting but they're not the ones I'm thinking of.
One other thought for highly transhuman motivation: the trend in physics is toward moving experiments off the lab bench and into increasingly expensive particle accelerators. If the trend continues, meaningful experiments a thousand years from now could require equipment on the scale of an orbital mega-structure.
Kardashev-scale civilizations, I think, would imply a ridiculously superhuman level of intelligence and organization. And some of the craziest, mind-bending stuff, like making wormholes and potential causality-violation machines, requires cosmic engineering like this.
But I don't even know where to begin to write a story from inside such a civilization, just from us ants on the outside. The most I can say is "Well, they're doing this because it furthers a goal of theirs." Wow, I'm a goddamn genius. :D
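For a sense of the gulf involved: Carl Sagan proposed a continuous interpolation of the Kardashev scale, and plugging in a few round numbers makes it concrete. A minimal sketch (the formula is Sagan's; the sample power figures are rough, illustrative assumptions):

```python
import math

def kardashev(power_watts: float) -> float:
    """Sagan's interpolation of the Kardashev scale:
    K = (log10(P) - 6) / 10, with P in watts.
    Type I is ~1e16 W (a planet's worth), Type II ~1e26 W (a star's)."""
    return (math.log10(power_watts) - 6.0) / 10.0

# Present-day humanity, very roughly 2e13 W of total power use:
print(round(kardashev(2e13), 2))   # ~0.73
# A civilization tapping its star's whole output (~4e26 W for the Sun):
print(round(kardashev(4e26), 2))   # ~2.06
```

Even "mere" Type II sits some thirteen orders of magnitude in power above us, which is roughly the ant's-eye view the paragraph above is stuck with.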
A couple other thoughts for limiting vastly superhuman intelligences...
1. The Blade Runner Replicant Effect: the light that burns twice as bright burns half as long.
2. A techno-babble limit to the complexity of thought. If consciousness is the result of quantum this-and-that in organic thinking material, there's only so far you can scale a mind before it becomes unstable. Therefore you can only get to an IQ of a thousand, tops, before the mind is dangerously unstable. The primary drawback with this idea is that it's pure, uncut, handwaved BS as far as I know. Psychohistory is the same sort of thing, but so are morphogenetic fields. Maybe I'm on safer ground inventing my own pseudo-science rather than using existing pseudo-science, since I won't come across as a promoter.
3. If we allow for the quantum limit of consciousness, we could apply it to AI, too. Any substrate capable of supporting the quantum effects that give rise to consciousness (organic, inorganic) is still operating within that same limit.
4. Immortality. As the Vorlon said, we are not ready for it. So if cellular death and old age are removed, and humans given the treatment will not die barring accidents, what then? I'm thinking that immortality can bring a sense of ennui and world-weariness that is often lethal. The obvious drawback for one immortal man is that he would outlive friends, family, children, and the world he is accustomed to. If everyone is immortal, losing lifelong friends is less frequent but can still happen. Potentially, older memories drop away if the brain can't handle multiple lifetimes of experience, so a 500-year-old is a different person from his 100-year-old self. Bored immortals might willingly risk their lives to bring back a sense of meaning. Those who don't would cloister themselves away from all threats for fear of death. And there will always be those who just kill themselves because they cannot cope anymore. So potentially, if a strong polity embraced immortality, it could see its power fade in time against the teeming multitudes of mortals. This is sort of cribbing the Passing of the Elves from Lord of the Rings by a different route, but it's not a diminishing due to magic fading from the world; it's simply the weakness of old blood, with no new blood ever coming in to invigorate the old polity. It would also take a very exceptional immortal not to be completely hidebound and set in his ways.
Lots of things to think about, but a few quick thoughts:
1. War in SPAAAAACE is essentially a naval contest if we accept the various tropes presented in Rocketpunk, Atomic Rockets, etc. In military theory this can be seen as 2GW (war of attrition) or 3GW (war of manoeuvre). There are lots of historical examples going back to the Thirty Years' War (prior to that we are talking 1GW, which is essentially contests between agrarian warrior societies), so both readers and writers are usually well versed in the various conventions, even if only through a lifetime of exposure to movies, TV shows, books and video games. OTOH, while naval combat is relatively "pure", the situation on the home planet may possibly be much messier, especially if 4GW is being employed. 4GW aims to attack the will of the decision makers and the population through all available channels, and is messy, protracted and has very fuzzy boundaries; political agitation and organized crime can be part of the mix (along with many other things).
Space fighter jock Starbuck may be uneasy as he sets off on patrol knowing his wife and family may be at risk from the opposing faction. His family might be attacked; they might become smothered under an oppressive government security regime, he might be recalled or have his orders changed multiple times as the government becomes unclear as to what goals they need to reach and so on.
2. Post-humans will have different motivations almost by definition. The most common motivation for conflict will probably be access to resources; ultimately they might get into conflict with humans over access to the roughly 174 PW of solar energy the Earth receives (displacing us and much of the biosphere with whatever floats their boat). Warfare might be a battle between human farmers and "triffids": genetically modified plants that aggressively populate the soil for whatever purposes the post-humans devised them for. Livestock and wildlife might be in the same boat; they can't eat "triffids" and they are being crowded out by rapidly reproducing organisms the post-humans introduced to the biosphere.
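For scale, the total solar power the Earth intercepts can be checked from the measured solar constant; a quick back-of-envelope sketch (using the standard values of ~1361 W/m² for the solar constant and ~6371 km for Earth's radius):

```python
import math

SOLAR_CONSTANT = 1361.0   # W/m^2 at the top of the atmosphere (measured average)
EARTH_RADIUS = 6.371e6    # m

# Earth intercepts sunlight over its cross-sectional disc, not its full surface.
cross_section = math.pi * EARTH_RADIUS ** 2          # m^2
intercepted_power = SOLAR_CONSTANT * cross_section   # W

print(f"{intercepted_power / 1e15:.0f} PW")  # roughly 174 PW
```

So the prize on the table for any energy-hungry post-human faction is on the order of 170-200 petawatts, dwarfing all of present human civilization's consumption (~20 TW) by four orders of magnitude.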
3. Immortals may be considered "post-human light", since their motivations will be somewhat alien to the mundane population. They will have long-term goals that span normal generations, use compound interest to gather resources on a scale we can scarcely imagine and (unless they choose not to) crowd out normal humans through population increases and demographic pressures. Luckily, immortals will have a statistical average lifespan of about 1,000 years due to fatal accidents (even falling down the stairs), and it is thought that the human brain will also reach full capacity by then.
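The compound-interest point is easy to underestimate; even a modest real return compounds absurdly over immortal timescales. A minimal sketch (the principal and 2% rate are purely illustrative numbers, not anything from actuarial data):

```python
def compound(principal: float, annual_rate: float, years: int) -> float:
    """Value of principal after compounding annually at annual_rate."""
    return principal * (1 + annual_rate) ** years

# A patient immortal parks $10,000 at a 2% real (inflation-adjusted) return.
for years in (50, 200, 500, 1000):
    print(f"{years:>5} years: ${compound(10_000, 0.02, years):,.0f}")
```

At 2% real, the stake roughly doubles every 35 years, so by the 1,000-year accident-limited lifespan mentioned above, a five-figure nest egg has grown into the trillions: an immortal doesn't need superhuman intellect to end up owning a planet, just patience.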
1. I think that 4GW is just a trendy repackaging of old ideas. We've long had fears of saboteurs, infiltrators, and Fifth Columnists. This seems plausible with the presence of foreign ethnic groups in your territory, a "sea" for enemy agents to swim in. The Germans tried it in WWII and it was a total failure. The biblical account of the Battle of Jericho isn't history, of course, but does feature a traitor, Rahab, who betrays her city for her own safety.
But with the previously mentioned growing trend in individual power, and especially if technology does allow for practical brainwashing, defeat from within becomes plausible. And what really makes it great is if it actually isn't happening even though it's technically feasible! A paranoid enemy could cripple his military and industry hunting for spies who don't really exist. Not being able to find them just means they're more clever than he feared!
You are right that showing life on the home front does make the story feel more real.
2. Access to resources seems like a no-brainer. The only problem is: what constitutes a resource? The solar energy issue made a lot of sense when talking about rogue AI in the next 100 years. They want solar power, and so it seems reasonable enough to start covering the planet to soak up all the rays. But pesky flesh beings are ruining our arrays. EXTERMINATE! EXTERMINATE! But if the AI is in space, no problem, right? Maybe there could be an argument that there's only so much space in the solar system. But if you've got interstellar travel, why fight someone for access to the same system?
We oxy-nitro breathers might fight other oxy-nitro breathers for the same planet but would have zero interest in a chlorine-breather's paradise. AIs interested in solar energy could be sitting in free-flying orbitals, happy as digital clams. The only conflict there is if they decide to go the exponential-growth route and start turning the entire asteroid belt into new nodes.
The triffid argument is a good one and could be a good "100 years in the future without space travel" sort of conflict. But, if the post-humans have space travel, why are they fighting over a single planet? We're back to the problem of "the aliens aren't invading Earth for our water." Of course not. But they have to be invading for something or else we don't have a story.
The only answer I'm coming up with is that they're not after a material resource and the locals are getting in the way (why is our oil under their land?) but the people are the resource. The attackers are evangelists.
(continued)
3. Immortals could well be playing a long game on a board so big we can't even tell which square we're on. The cop-out for the writer is to imply that's what's going on and otherwise leave their actions cryptic and inscrutable.
One idea I had is that decadent immortals could become sadistic aesthetes. One of the major villains from the Hellsing manga had a speech. "You should be aware, fräulein, that there are some people in this world, some irredeemable louts, for whom the means do not require an end. I speak of course of myself." The war does not advance any cause, the war is the cause. "Gentlemen, all I ask for is war. A war so grand as to make Hell itself tremble. Gentlemen, I ask you as fellow brothers in arms, what is it you really want? Do you wish for further war as I do? Do you wish for a merciless, bloody war? A war whose fury is built with iron, and lightning, and fire? Do you ask for war to sweep in like a tempest, leaving not even ravens to scavenge from this Earth?!"
So for these aesthetes, the only goal is the experiencing of the torture and destruction of sentient life. Why attack someone else's territory to do it? Possibly because it's more fun for them than just creating their own sandboxes to smash. Cosmic griefers.
AIs seeking the total power output of the Sun would be even more dangerous to us when operating in space. Their solar arrays could be upsetting the cycles of living beings through light pollution (disrupting the day/night cycles) and even thermal pollution (so many solar arrays in Earth orbit that the black-body temperature of space rises noticeably in the local region).
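The thermal-pollution worry can be roughed out with the Stefan-Boltzmann law: a passive body's equilibrium temperature is set by the total flux it absorbs, so extra waste-heat flux from a swarm of nearby arrays nudges that temperature up. A hedged sketch (the 5% extra flux from hypothetical orbital arrays is an invented illustration, not a ServerSky figure):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def equilibrium_temp(incident_flux: float, albedo: float = 0.0) -> float:
    """Equilibrium temperature (K) of a rapidly rotating black body:
    it absorbs over its disc but radiates over its sphere, hence flux / 4."""
    return ((incident_flux * (1 - albedo)) / (4 * SIGMA)) ** 0.25

solar = 1361.0                        # W/m^2 at 1 AU
baseline = equilibrium_temp(solar)    # ~278 K for an idealized black body
# Hypothetical: orbital arrays add 5% extra thermal flux to the neighbourhood.
polluted = equilibrium_temp(solar * 1.05)
print(f"{baseline:.1f} K -> {polluted:.1f} K")
```

Even a few percent of extra flux shifts the equilibrium by about 3 K, which is on the scale of serious climate disruption, so the AIs wouldn't need to blot out the Sun to wreck the biosphere below.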
Post-humans may not be interested in space for whatever reason, or dominating the Earth may be the springboard for their next step. The "ultimate" motivation may always be to extract the maximum possible energy from the biosphere, but "how" they go about it may be quite obscure.
WRT 4GW, the argument is mostly that this technique has grown to dominate military discourse and operations, rather than being employed as an auxiliary to the main force actions. In its evolved form, 4GW can be used (for example) to create political conditions that totally prevent the United States from deploying a carrier battle group to the South China Sea while the Chinese use diplomatic and commercial levers to get what they want without deploying the PLAN.
I think that if the AI's are trapped on Earth, there's conflict. But if they can get out into space, why stay in Earth orbit?
I'd have to do a lot of thinking on this before I can come around to a reasonable conflict for a story. The strong case with Singularity is techno-rapture but the weak case simply says "beyond this point, we can't predict much at all." Post-humans fit squarely into the weak case here.
Per 4GW, one could also posit that cloak and dagger spy stuff displaced overt military operations in much of the Cold War because we could never afford the risk of open warfare directly with Warsaw Pact powers.
It makes me think of the details of mob wars. Destructive struggles are bad for business and mob bosses tend to be more exposed to danger than heads of state.
What's interesting is that the mythologizing and practical matters went hand in hand during the American mob's history. There was a specific anecdote concerning a mob figure going to see a picture about the "real" mob, its traditions and practices. He was impressed. "Boys, write this down. We should be doing this shit."
Films like the Godfather romanticized the life and other films like Goodfellas ruthlessly tore down the myths.
The history behind the mob wars is pretty fascinating. What scares a wiseguy into biding his time? What makes him think he can take a shot? Does he think he can survive blowback if it fails or does he have nothing to lose?
Brings to mind a bit of military strategy: always allow your opponent a way out. If you have the strength to defeat him in detail, do so. But if you do not and you cut off his escape, he will fight for his life and you may lose much to secure a victory that could have been had for far cheaper.
You can gather this set of ideas under morality/ethics/philosophy/way of living. The definitions can get a bit hazy with people using different words to mean the same thing.
That's why most modern philosophical works begin by re-defining the words they use. Still, I insist on separating morality and ethics this way because they are often confused, sometimes voluntarily, sometimes by simple incompetence, and often by journalists (who are responsible for most of the mess with words, AFAICT). And those two notions are very different, confusing them (and not just the words) is something common but very problematic.
For those who make the argument of "We need a moral basis for society and, if not from these holy books, then from where?" This is an answer: Science of Morality.
Science of morality isn't incompatible with religion. Again, let's be wary of this Science vs Religion fallacy. Science of morality is based, like all sciences, on axioms. It is just saying, 'if we want X respected, then we should do Y'. You still have the problem of choosing X.
You cited rape and murder before, for example. Seems pretty obvious, doesn't it? But Genghis Khan's armies would disagree with us.
I'm not saying that holy books are the only source for those axioms, of course. They are an efficient one, as they have a superior force guaranteeing them. But any ideology can define those axioms, sometimes by using other methods, sometimes by using near the same, for example following the words of previous figures (be it Mao or the Founding Fathers).
Also, from what I can tell, the science of morality is just putting a name on, and formalizing, what most societies are already doing.
The examples you give are interesting. They show how there are conflicts both about the methods (choosing the age of consent, for 'sex is only for adults') and about the axioms (should gay or child sex be allowed).
But it gets more complicated, as we can also choose more fundamental axioms, for example 'what is best for people'; then the above axioms become methods. In societies like ours, where we can question those, it tends to happen more often. The problem is that by doing so, mistakes can be more dangerous. What if someone demonstrates that sex isn't bad for children? If he's wrong but convincing, then you end up deeply traumatizing an entire generation...
On the other hand, in more restrictive societies where the axioms can't be questioned, it's harder to evolve toward a better set of axioms.
But I digress...
Axioms also change with time. In the Middle Ages, people reached the age of majority/consent in their early teens; being married and Lord of the estate at 14 would not be considered strange by the locals, the other Lords or your 12-year-old wife. 100 years from now, people might be amused/horrified that people had to wait until they were 18 years old to vote, sign contracts or go to a bar (or they might be outraged that anyone would let someone so young do such things...)
The entire point of 4GW theory isn't to say that indirect approaches were never used before, only that they have reached a point of primacy in modern operations. The SOE conducting sabotage and assassination could never have won the European campaign in WWII; Maoist "Revolutionary War" wore down much larger and more powerful opponents, and now evolved 4GW has the potential to strike the political center of gravity without firing a shot (the First Intifada literally used children throwing stones at tanks to ultimately change global opinion and convince the Israeli government to trade land for peace. The fact that these political decisions were not honoured by subsequent governments, or that Hamas changed tactics to suicide bombing and rocket attacks, in no way negates the achievement of the initial leaders of the First Intifada).
WRT the dangers of unconstrained activities in space, look up "ServerSky", which suggests the upper limits of solar energy collection around the Earth and that a fully developed "server Sky" would need to be spaced out around the orbit of Uranus in order to maintain the local black body temperature of space around the Earth.
http://server-sky.com/
There can also be ideological conflict with rival sects. We've seen with religion that even when we have a set of books handed down from the gods, human interpretation of what is divine with curses directed at any who alter the texts will still give us schisms.
Again, this is not limited to religion. Any ideology is prone to this. Look at today's US politics. Most if not all are claiming to follow the spirit of the Founding Fathers. Yet you see pretty big differences between parties. Then you have communism, based on the big and detailed work of Karl Marx. Yet the 20th century showed us how hostile communist countries could be to each other.
As those things have to be interpreted, there will be differences of interpretation, then divisions, even in cases where the founding texts are not themselves questioned.
AFAIK, there is one guy who tried to subvert this. Jacques Lacan, a French psychoanalyst, who basically brought back and modernized Freud's work (most of today's Freudian psychoanalysts are in fact following this guy's ideas). He tried very hard to not become a 'master' whose words would be 'religiously' followed.
It didn't work well. While some really do try to develop a sound opinion based on his and others' works, several 'Lacanian schools' arguing with each other appeared right after his death.
And this guy was a genius and a human mind specialist. So I guess it's part of us.
Memetic weapons are cool but I think are highly susceptible to wanking.
Didn't think about that. That's quite a risk with those kinds of ideas, yes. The only way I see to avoid wanks is to try to define what a tech can do, then come up with a story. But even when defining what it can do, there are risks of wanking it...
With the two sequels...
What sequels? The Matrix didn't have sequels, only big-budget fanfiction movies. *slap* Sorry, instinctive reaction.
More areas to explore, more inhabitants: This is actually the brain-fault I'm running into. Post-scarcity societies imply little need for workers to keep things running.
Then go for low-scarcity instead of post-scarcity. There are fully automated car factories, but there are still people who design cars. There are super-advanced virtual testing systems, but there are still engineers to develop new engines. There are moon-sized Higgs beam projectors, but there are still scientists arguing whether the 6 extra dimensions are required or not to explain this new particle variation.
Thus, people are still needed. Not as many as before, so you have plenty of room for artists, dreamers, politicians, soldiers, explorers, part-time workers and the unemployed (though having no activity at all is deleterious, so people may be encouraged to do something).
Thus, you also have something to fight for. Purely ideological conflicts can happen, but they are not the majority, and in history there is almost always something behind them.
And there you have resources to fight for: first, people. Valuable workers are still needed, and thus are still worth fighting over. Their skills, but also their knowledge, can be valuable.
But industrial bases are also important. This automated factory? Sure, you could develop a new one, but it would take time and effort from people, and it would take a while before it was fully efficient. Thus, stealing one from someone else would be far easier. Particularly if the someone else is your enemy and the factory is a warship factory.
Everything else flows from there.
Then you can keep all the old reasons for conflict; the resources fought over are just advanced instead of raw. (Though you can throw in some unobtainium if you do want fights over raw resources.)
It also gives a reason to expand (in addition to the instinctive need to do so): you may not want to develop things next to a big imperialistic neighbour who will want to capture them.
Immortality. As the Vorlon said, we are not ready for it.
Interesting points. The bored immortals were touched upon in Chasm City (they end up creating a business of professional hitmen chasing the clients, to relieve them of their boredom).
The immortal becoming a cruel aesthete is also a quite interesting one, though it could depend on societies.
What would happen to their minds as they grow ageless? Would they forget little by little? Would they grow senile? Would they periodically wipe their minds to prevent that?
I would describe them as forgetting their older, less important memories, like any adult does. After 300 years, they would not remember their first twenty birthdays, but maybe still their first explosive decompression accident as a child. They would become keener, more experienced, but also have a hard time learning new things. They would end up with a deep practical intelligence in their domains. They would also grow wiser, more thoughtful, thinking longer-term rather than in the immediate, but also more wary of change and the unknown, with less initiative, less will to intervene and change things, or to change their opinions.
This would be the average, of course. Some would probably continue to be foolish (though those may statistically tend to die sooner). Some, on the other hand, may keep an iron will and a will to adapt. Though ancient enough immortals would be like geniuses in their field, the 'natural' geniuses may either diversify, be simply more precocious than the others, or become deep thinkers like the world never knew before the immortals.
So the world wouldn't be very different, there would just be smarter people, and more smart people.
This can be combined with all the other problems you described for more fun, though.
Then what happens when an aberrant, Napoleon-like figure emerges (assuming they survive their eventful lives)? What would this kind of person become with centuries before them? They would break any suspension of disbelief in half in any serious fiction.
Pertaining to morality: It's pretty much going to be whatever the dominant group claims it to be and is willing to enforce. If people believe human sacrifice is required to appease the gods, then doing so is necessary to keep life going. Willing sacrifices like corn kings are doing a great good and nobody sees this as murder. If the priest were to kill anyone other than the sacrificial victim, that probably would be treated as murder. Most people do not like the idea that right and wrong are constructs and that core beliefs they feel define them would differ if they grew up in another society. There's an understandable reflex for believing in natural law.
Even with my own morality, there's subjectivity. I believe in "do unto others as you would have them do unto you" and "And it harm none, do what thou wilt." Your own person is included in that one. People can weasel around rules and creatively reinterpret them to do what they wanted all along.
So for your Genghis example, the Mongols played by different rules. If you disagreed with them, you'd better have an army to back up your argument.
You are right that the science of morality is trying to formalize what societies have been doing but it's remarkable how many of the rules seem to be shared. It argues that this morality was not transmitted like language but is something that spontaneously arises like the religious instinct. Every culture, no matter how remote, has their own concept of gods and spirits. We've never encountered tribes of aboriginal atheists.
This is really good from the storytelling perspective because it gives us the opportunity to have tremendous culture clashes. That's why I brought up our own taboos. What would posthumans consider anathema? What could they get up to that would elicit a similar response from baseline humans?
As for the ongoing debate, you also bring up a good point. While our child-rearing concepts haven't quite gone towards "sex with kids is great!", we have had plenty of pseudo-scientific fads that have proven harmful. Dr. Spock is no longer considered authoritative, the tough love doctrine promoted by Christian family experts is considered abusive, the idea of women feeding with formula instead of free breast milk was promoted by food companies selling a product and directly contradicts scientific recommendations, etc.
This is the sort of thing I call the authoritative scientific fuckup. Scientific racism was one of those. Sounded really convincing to the layman because the guys with the serious glasses and respectable hair used big words. It was bunk.
On the other hand, I'm always curious about the prospects for what I call scientific heresies. Many things I believe are biased personal opinions that happily are backed up by fact. I hate smoking. I hate the smell, I hate the litter, I'm repulsed. Science doesn't care about that but it does prove that smoking will kill you. But what if science proved smoking is no more harmful than spitting? I can still consider it disgusting but it's now relegated to personal opinion. If I'm completely honest, I'll accept that and get on with my life.
But what if science proved something awful? What if DNA didn't prove we're all pretty much the same but instead established that there are at least four distinct human species capable of breeding? Even worse, what if science proved black people were clearly the inferior race with poor cognition skills? And just as white people start puffing out their chests and opining on natural superiority, the scientists say the smartest, most advanced thinkers are Han Chinese? Oh, shit! There's a master race and it ain't us!
Thuc and drifting axioms: More good points. I'd opined on possible cultural drifts between spacers and planet-dwellers over on Rocketpunk.
I can *easily* imagine some near-future culture clashes once we've seen people out in space colonies for a while. And I can see both sides feeling completely vindicated.
Spacers see planet-dwellers as ignorant and soft yet barbaric. They live beneath bare atmosphere open to space. They are subject to the random whims of nature, with so few things under control. They eat living animals, as opposed to the cultured vat proteins civilized people eat. (Heck, maybe even their veggies come in indistinct patties, too. Eating a plant would seem very strange to them.) Planet-dwellers live with too little order in their lives, no structure. They live useless, unproductive lives.
The spacers, by comparison, are seen as alien to sensible planet-folk. They start having sex whenever they feel like it with whomever they'd like, male or female, old or young. All breeding is selective and the computers match the parents -- their fertility implants will only allow conception with the matched pair and will prevent any other unwanted pregnancy. The children are raised in communal creches and there are no proper families. And without that, and with the fertility implants removing the risk of accidental pregnancy, there are no incest taboos. Immoral, disgusting degenerates living like rutting animals rather than god-fearing human beings! And there's no property rights up there, everyone's got a common share in their ship, in their habs. It stinks of dirty hippie socialism. They're socialist drone people with filthy sex habits. (I'm just imagining the sort of culture that'd be in a Heinlein novel if he were still writing today, mainly the sex bits. *grin*)
I figure you don't really need aliens in most stories when you consider just how alien human cultures could get when separated from the mainstream. Even if they remain in cultural contact with Earth, a Jovian colony is going to very quickly become a lot different from the mainstream of Earth, especially if there's not a lot of swapping of personnel -- if these are people living out there for keeps instead of working an assignment.
4GW: Right, I think you hit on the point: achieving the political outcome with scratch forces that used to take professional armies. That's the newer thing. The American Revolution was started with ad hoc militias, but it took the training of a proper cadre of European-style musket troops by professional officers, with financial support from France, to win the war. Vietnam could not have fended off the capitalists without major external support.
Gandhi had a quote about being able to use an enemy's shame and decency against him if he had the right kind of culture. He said what he did against the British could not have worked against the Nazis. Harry Turtledove wrote a story exploring exactly that, starting with a Nazi victory in WWII and assumption of control over British colonies. Passive resistance was met with machine guns and tanks rolling into and over crowds.
@Eth and ideological fragmentation: Rival sects could mean religions or politics. I'd opine that the moment religion moves from individual to organized it's politics with a different funny hat. :) Harris made a good point that supposedly atheistic political beliefs from the 20th century had little to do with logic and science and more to do with adherence to blind faith, i.e. religion by another name. We manage to screw everything up.
Suggested reasons for future conflict: yeah, I'm still having my ongoing debate on that. I just don't quite feel it in my bones yet. I'm going to read a few more post-singularity or post-scarcity writers and see if they've come up with anything interesting. The problem I've had with some of the more esoteric authors in these realms is they can't manage the trick of making a story interesting.
In a post scarcity world, the only real limitations are time and attention (AKA bandwidth). Something akin to the DNSchanger malware that diverts your attention from what interests you to whatever interests "them" would be an act of war, as would sophisticated attempts to waste your time so you are not working on the project or goals you have set for yourself.
Once again this is almost a "superhero" world where superempowered individuals can make their own destiny (at the price of leveling a large section of downtown Gotham) and warfare might have devolved towards vigilante justice and "clan" battles (although the definition of clan may have little to do with genetic kinship in the future; clans of shared interests could lead to bizarre scenarios like hordes of Jane Austen fans wreaking havoc on fanfic writers and followers of the Brontë sisters in a truly apocalyptic showdown...)
There's also the possibility that the post-scarcity technology is fragile and sufficient disasters can wreck the cornucopia machines. Therefore vast changes could have been made via the seemingly magic technology and then paradise is lost. Such a disaster could put the story squarely back in a scarcity setting with familiar economics even as dreamers and visionaries try to reacquire the lost knowledge.
In such a situation, knowledge remains power as much as ever and citizens of high-technology cultures retain a significant advantage over fallen cultures.
The key issue there is that once you have one cornucopia machine, you can use it to make another one. Indeed, one of the trade-offs in making the jump to post-scarcity economics might be deciding at what point you start letting the cornucopia machines make something else (if you allow for geometric doubling, you vastly increase your potential output with each generation you double).
I would think that once you reach 1,024 machines (10 generations of doubling from a single seed machine) you have sufficient numbers to take half offline and put them to work creating goods and services (each subsequent generation produces another 512 machines, so you will still be able to grow your economy at a rapid rate).
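The doubling arithmetic is easy to tabulate; a minimal sketch, assuming pure self-replication from a single seed machine (note that 1,024 = 2^10, i.e. ten doublings):

```python
def replicator_schedule(generations: int, start: int = 1):
    """Yield (generation, machine_count) for machines that each copy
    themselves once per generation, i.e. the fleet doubles every step."""
    count = start
    for gen in range(generations + 1):
        yield gen, count
        count *= 2

for gen, count in replicator_schedule(10):
    print(f"generation {gen:>2}: {count:>5} machines")

# Once the fleet hits 1,024, diverting half to production still leaves
# 512 builders, adding 512 new machines per generation thereafter.
```

The striking feature is how cheap patience is: waiting just one more generation before switching to production permanently doubles both the builder fleet and the production fleet.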
Of course if the machines themselves are black boxes that cannot be directly figured out by human engineers then you have the potential for large problems (and not even worrying about if they stop working; how do you ensure they are producing what you want them to? This relates more to your latest post than this one).
Well, the other bit with the cornucopia machines is that they will have to operate within the limits of physics. How long does it take to fabricate a motorcycle? It takes roughly four years for a horse to go from zygote to old enough to ride. How would a cornucopia machine compare with a traditional factory mass-producing goods? Is it only advantageous when you don't have a pre-existing industrial base? What about feedstock? Probably the simplest approach would be sticking a hose in the water and filtering out the necessary elements. To keep things balanced, if you have a gold mine it will be quicker to mine and refine a pound of gold than to wait for it to be filtered out of the water. But if you don't have a gold mine, filtering is your best option.
I'm specifically excluding the possibility of a cornucopia machine also being able to transmute elements at will.
One potential danger is the idea of weapons targeted at higher technology. We're familiar with the dangers of EMP. We're familiar with the idea of viruses that can either brick hardware or actually cause it to damage itself. But what about a designer fungus that eats modern metamaterials, say the magic dust inside the cornucopia machine? Just as we have to worry about citrus canker destroying our cash crop in Florida, perhaps something like this could be deployed as a threat against higher technology. A serious war could therefore leave both sides no longer capable of designing advanced technology; it might take a serious effort from outside to decontaminate the planet.
One idea used in a previous story was a weaponized tsetse fly carrying a boosted version of sleeping sickness. The flies weren't supposed to be able to reproduce, and the boosted disease wasn't supposed to be transmissible except via the weaponized flies. Neither belief proved correct, so large swaths of Africa have been depopulated by the bioweapon. Nobody is sure which of the many warring nations was responsible, but there's no one left to take the blame at any rate.
I suspect that the trope of nature being broad-spectrum and human engineering being narrow will apply for centuries to come. The human ecology (technosphere, noosphere, or whatever term you wish) is narrow and highly specialized, so viruses, malware and other "things" will be similarly narrow. Fighting specialized or highly adapted parasites may be the best model: difficult to do but not impossible.
To give you a great example: if the human race were to go extinct, cockroaches and several varieties of lice would rapidly follow (having become highly adapted to living off humans and their wastes). Defenders will be akin to medical technologists and epidemiologists, taking samples and devising countermeasures, then releasing "antibodies" into the environment and perhaps producing and releasing counter-offensive malware of their own.
We can also borrow from nature to power the defense: subtle mutations can be built into the control packages so a specific piece of malware might not affect every device. Devices might also be allowed to "mate", with each generation becoming more differentiated (the downside being that you no longer have 100% control of the devices; the black box problem again).
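The "subtle mutations" idea is essentially the software-diversity defense, and its benefit is easy to quantify with a toy model. Everything here (fleet size, variant count, the notion of a variant-specific exploit) is an invented illustration, not a real system:

```python
import random

def deploy_fleet(n_devices, n_variants, seed=0):
    """Assign each device a randomly chosen control-package variant
    (the built-in 'mutation'), so no single exploit fits every device."""
    rng = random.Random(seed)
    return [rng.randrange(n_variants) for _ in range(n_devices)]

def exploit_impact(fleet, targeted_variant):
    """Fraction of the fleet that a variant-specific exploit can touch."""
    hits = sum(1 for v in fleet if v == targeted_variant)
    return hits / len(fleet)
```

With 10,000 devices spread uniformly across 50 variants, any exploit keyed to a single variant reaches only about 2% of the fleet instead of all of it; the attacker must crack many variants to do society-wide damage, which is exactly the tradeoff described above (more resilience, less uniform control).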
Cornucopia machines will start filling specialized niches in the economy (high-end, bespoke devices and prototypes), and the industrial ecology will adapt to their introduction. First they will inhabit the R&D facilities; then they will move to the factory floor and the mine shafts (one thing a cornucopia machine might be able to do is process raw materials more quickly and efficiently, especially in small operations or with low-grade ores). The cornucopia machines will eventually form an industrial ecology of their own, perhaps making large and complex items (EADS is experimenting with using a huge 3D printer to make airplane wings) while small widgets continue to be made the old way (screws and rivets, for example).
The future of industry may well be a form of batch processing and artisanship: you place your order (or rub the magic lamp) and things kick into gear, from deep-sea mining barges in the Pacific Ocean to shale-oil rigs in Utah. Various cornucopia machines package the raw materials and ship them to the big assembly rig nearest your house, where the rest of the process happens until the FedEx robot arrives at your door and asks you to place your DNA sample here....