Ok here’s a dark little thought about post-1982 Flynn.
So, on the Discord we were discussing something I’ve mentioned a bunch of times before: the idea of copying a program to another system.
The reason I believe Legacy Tron was a copy is basically just this: in real-life computers, unless a program is running off a removable disk, copying is the only way to move it from one computer to another. If you move a program to a new computer, the original will, by default, still be on the computer you moved it from. You’d have to actively choose to delete it if you don’t want it to be there.
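That copy-then-delete behavior is exactly how it works in real code, too. Here’s a minimal Python sketch (the filenames and the `move_program` helper are made up for illustration) showing that between two separate “systems” — simulated here as two directories — the actual transfer is always a copy, and the original survives unless something actively deletes it:

```python
import os
import shutil
import tempfile

def move_program(src_path, dest_dir, delete_original=False):
    """'Move' a file to another system. The transfer itself is a copy;
    the original only disappears if we explicitly remove it."""
    dest_path = os.path.join(dest_dir, os.path.basename(src_path))
    shutil.copy2(src_path, dest_path)   # the actual transfer is a copy
    if delete_original:
        os.remove(src_path)             # the "move" illusion: delete after copying
    return dest_path

# Simulate two systems as temporary directories.
system_a = tempfile.mkdtemp(prefix="encom_")
system_b = tempfile.mkdtemp(prefix="grid_")

tron = os.path.join(system_a, "tron.bin")
with open(tron, "w") as f:
    f.write("fights for the users")

copy_path = move_program(tron, system_b)   # default: copy only, no delete

print(os.path.exists(tron))        # the original is still on system A
print(os.path.exists(copy_path))   # and a duplicate now exists on system B
```

Unless `delete_original=True` is passed, both files exist afterward — which is the whole premise: “moving” a program leaves a duplicate behind by default.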
And a few more ideas that came up were:
- What if this happens when moving a program out into the “real world,” too? Like Quorra, at the end of Legacy? Does it leave behind an in-system copy, who can’t ever fully leave the system, except by being deleted?
- What if this happens to digitized Users, too? What if Flynn left behind a program version of himself, when he got back out of the Encom system, that first time?
- What if it happens every time? What if Flynn went back in and out of the Encom system a bunch of times, and every time it created a new Program Flynn?
- What would happen to all these duplicates? Would they simply be stuck there forever unless they were deleted? Could they be merged together?
And since then… my mind has extrapolated this to the backstory of Legacy, in some really disturbing ways.
What if, by the time he makes the new Grid, he’s figured out a way to stop these proliferating copies from happening every time… But he still knows how to make it happen on command, if he wants to.
And what if this is how he created Clu2.
This would explain why Clu seems to be made in his image in a way other programs aren’t– seems to be a duplicate of him on some deep level, perhaps even sharing his memories.
And it would also explain why he’s so sure that neither he nor Clu would survive reintegration.
Because that would mean he’s done it before.
Not on himself – but on his earlier copies, in his attempts to reduce their numbers without outright killing them.
He found that his copies never survived being merged back together. And based on his knowledge of how it works, he sees no way his “real” self could survive it either.
Maybe he doesn’t know for sure what else would happen, if his real self were to merge with a copy. But he has plenty of experience to suggest that neither one would survive.
And the fact that he, prior to all those “zen” years of becoming more enlightened, was someone who did all that to his programs…
…might also explain part of why Clu turned out the way he did.
To expand a little bit on that last part:
I don’t really want to believe this about Flynn, and in my own fic I try to find ways around it… but the fact remains, canon has a disconcerting amount of evidence that Flynn doesn’t take the life and sentience of Programs very seriously.
It’s not quite so pronounced in 1982. We do pretty often see him putting his own desires ahead of any concern for the Programs, and we see him recover and go back to his happy-go-lucky self disturbingly fast after each time he sees someone get derezzed. But that may be explained as him just being desperate to get back to normal, and/or repressing bad feelings out of necessity.
We see worse behavior in Legacy and Uprising and the various games. He seems to continue being oblivious to the needs of his creations– which becomes worse when he’s in charge of their whole world.
(See: the obviously impossible “perfect system” directive he gave Clu. And his disregard for Dyson’s injury.)
(Also, separating Tron from his user, his original purpose, and perhaps from Yori too? …But then we don’t see the exact circumstances of that, so whatever.)
And then there’s his “ooh! shiny distraction!” problem. If I found out that the programs in my work computer were all sentient, I would put some serious effort into trying to make things better for those programs. Life as office software has gotta SUCK, especially if none of your bosses even know you’re alive!
But Flynn’s response seems to have just been, “Oh COOL! Look what I can experience by going inside a computer like that! I’m gonna go make my OWN setup for doing the same thing!”
Again, we can’t be certain, because canon never really shows much of anything he did in the Encom computer post-’82. And again, there are ways around this, for when we want to show him sympathetically in fanfic.
But on the surface, it looks as if his “cool-new-project” inspiration just grabbed his attention away from actually caring about any of the friends he made in there.
And this happened again with the ISOs. They were new and cool, and he was more interested in them than in helping his old programs with their problems.
He also seems to have said (in some of the videogame material, I think?) that the ISOs were special because they “had free will,” and the regular programs didn’t.
Now, I have no idea just how Flynn defined “free will,” or how he thought he could possibly know who has it and who doesn’t. Personally, I think the whole concept of “free will” is too subjective, too impossible to prove, too impossible to agree on a definition for, to be of any use in real life.
But… that concept, whatever it means, is pretty often thrown around as part of the definition of being a sentient person.
Which makes me concerned about how Flynn viewed the basic programs. Did he not actually believe they were sentient? Did he never really value their lives on the level of human or ISO lives?
So, if this is really what Flynn was like (and again, I’d prefer to think it isn’t!) …imagine what would happen if any of this callousness came through in his creation of Clu.
(Of course, if it did, it wasn’t directed the same as it was in the original Flynn. If Flynn was callous only toward the Basic programs, Clu was callous toward almost everyone, including, especially, ISOs and humans.)
(But then– the habits we learn from our parents, mentors, etc. don’t always get focused on the same targets. Can’t really predict how they’ll be focused. And even if Clu was at one point an exact copy, rather than being like Flynn’s son or student… well, there’s still a lot of ways experiences can change you over time. Deep drives are harder to change. But the way they’re directed… I think that change can happen in more sudden, volatile ways.)