In practice, uploading a mind would almost certainly mean creating a copy, or mindclone, of that mind, but the two terms differ in two very important respects, outlined below as two (overly) simplified definitions.

The following is speculative, part of a group of fantastical writings exploring humanity and philosophy through futurism and realistic science fiction.

Definitions:

  • Uploading - Accurately capturing the pre-existing biological matter (the brain) and its one-to-one connections, producing a complete, direct copy of the mind and its biological (i.e. natural) processes, with or without conversion to a computational medium.
  • Cloning - A simulation of a mind that allows for non-natural modifications, without needing either a backward-compatibility layer for natural processes or a recreation of the biological structures and components in order to function.

The following is an excerpt from a discussion I had with another person on the topic of 'uploading' and 'cloning' a mind.

Before we could even start working on the herculean task of fully copying a mind, we'd first need to completely understand how the brain works. We're not even close enough to that point to speculate how far away it might be.

I disagree. While I think understanding the mind is helpful, I don't think relying on it as a basis for recreation is required. We wouldn't recreate the way a tree grows just to have more paper; instead, we would figure out how to make the pulp without growing the tree. The brain carries millions of years of cruft and evolutionary dead ends, and is painfully less efficient than it could be because of them. Tech debt and brain debt both derive from the same formulaic iterative processes, and both are inefficient.

What even is a mind? We know that neurons are widely distributed throughout the body, with an estimated 500 million in the gut alone (recently revised up from 100 million; in truth, we don't really know yet), and that experience is not produced solely in the brain. There are also hormones and other neurotransmitters to consider. All of these variables evolve and change throughout life, alongside the brain in our skull. It could turn out that once we figured out how to tease all the pieces of self and memory out of the brain, we'd still be missing vital parts of the person's sense of "I".

What is an "I"? The "I" of myself five minutes ago is different from the "I" of myself five years ago, and certainly different from the "I" of myself now. I don't subscribe to the belief that there is a distinct thing called consciousness; we are all just an in-depth simulacrum of what we perceive ourselves to be, a method for understanding the madness of the chaotic causality of the mind. Reality is subjective to the user's experience: my green is your yellow, my blue is your red. We can't determine what an "I" is if we can't determine what true reality is on its own.

The brain has built a wonderful mechanism of SDRs (sparse distributed representations), where what we see and hear are negotiated with ourselves, and what we perceive is essentially a tarball compressed so aggressively that the true values are effectively encrypted even from ourselves, accessible only through simpler calls to generalizations.
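The flavor of that compression can be sketched in a toy model. This is purely illustrative, not a claim about how brains actually encode anything: each "feature" of an experience is hashed to a small set of active bits (the names `feature_code`, `encode`, and the width/sparsity constants are all invented for this sketch). Whole experiences become unions of those sparse codes, so shared features produce overlapping patterns, while the original moments are unrecoverable from the bits alone; only the generalizations survive.

```python
import hashlib

N = 2048    # width of the representation
BITS = 40   # active bits per feature (~2% sparsity)

def feature_code(feature: str) -> set[int]:
    """Deterministically map one feature to a small set of active bit positions."""
    bits: set[int] = set()
    i = 0
    while len(bits) < BITS:
        h = hashlib.sha256(f"{feature}:{i}".encode()).digest()
        bits.add(int.from_bytes(h[:4], "big") % N)
        i += 1
    return bits

def encode(features: list[str]) -> set[int]:
    """An experience is the lossy union of its features' codes."""
    code: set[int] = set()
    for f in features:
        code |= feature_code(f)
    return code

def overlap(a: set[int], b: set[int]) -> float:
    """Jaccard similarity between two sparse codes."""
    return len(a & b) / len(a | b)

morning = encode(["coffee", "sunlight", "birdsong"])
evening = encode(["coffee", "lamplight", "crickets"])
unrelated = encode(["granite", "thunder", "cinnamon"])
# Experiences sharing a feature overlap far more than unrelated ones,
# yet neither the feature list nor the raw moment can be read back out.
```

The design point is the lossiness: the shared "coffee" bits make the two mornings feel related without storing either morning, which is the tarball-encrypted-from-ourselves idea in miniature.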

I don't believe we will ever upload a mind, but we will clone them. Cloning would mean less evolutionary cruft, more enhancement, and faster iteration than evolution could ever manage for us. It means crafting a simulacrum similar enough to our existing mindscape, a method of transferring the core of our personality, experiences, opinions, and decisions, but not the 1:1 connections. Brains are fallible, prone to error on the smallest of things, with no real ability to change or upgrade (on the contrary, they tend to degrade steadily and rapidly).

Uploading one's mind isn't, or shouldn't be, the goal; there are too many problems at home. Cloning a mind? Creating the ability to build decision matrices from prior experiences at a 75-90% similarity rate to the human subject? That is the goal. If you placed two of yourselves at a table at this moment and asked them a question, would they answer the same? Or would their answers differ in minor ways? Likely the latter, where the minor changes come from the different positions they are resting in, whether they are to the left or right of "you", their distance to the nearest wall or table, or even a breeze that they feel, but "you" do not.

The goal of cloning a mind is to create a decision tree similar enough that at inception the two are the same, with every moment afterward becoming rapidly different, as a feature, not a bug.
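That identical-at-inception, diverging-by-design behavior can be illustrated with a toy model (again purely illustrative; the logistic-map update stands in for "decision dynamics" and is not a claim about neural mechanisms): two copies of the same deterministic rule, identical except for a one-in-a-million perturbation, the breeze only one of them feels, stay effectively the same mind for the first few moments and then diverge unrecoverably.

```python
def step(x: float) -> float:
    """One tick of a toy chaotic 'decision dynamics' (logistic map, r = 3.9)."""
    return 3.9 * x * (1.0 - x)

def run(initial: float, ticks: int) -> list[float]:
    """Iterate the dynamics from a starting state, recording every state."""
    states = [initial]
    for _ in range(ticks):
        states.append(step(states[-1]))
    return states

# Two clones, the same at inception except for an imperceptible difference:
# one part in a million, the breeze only one of them feels.
original = run(0.4, 100)
clone = run(0.4 + 1e-6, 100)

early_gap = abs(original[5] - clone[5])  # still effectively the same mind
late_gap = max(abs(a - b) for a, b in zip(original[50:], clone[50:]))
```

Because the update amplifies small differences exponentially, `early_gap` stays tiny while `late_gap` grows to the full range of the system: the clones agree at the table at first, then become rapidly different, as a feature, not a bug.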