This is a question I found myself thinking about a lot when I was working on a relatively simple C++ program for my (now ex) girlfriend.
Basically, I was getting to grips with the basics of programming when I started thinking of ways to create a fully competent chat bot, one that could work from information it acquires through conversations and actually account for context beyond the immediate text input.
Before I'd even got round to testing some of the concepts I'd come up with, I'd grown annoyed with how my now ex would basically throw a fit if I wasn't texting her during every waking hour, and I jokingly thought about writing a bot just to give myself a break...
It was while thinking about this and looking back through old conversations that I realised most of the conversations I'd had online were almost painfully predictable; I tended to give the same responses to the same questions, and I saw the same pattern from others, which made me wonder whether anyone (online) would even notice if I was replaced with a chat bot.
Following this I decided to start working on an incredibly simple chat bot to which I could easily add inputs and responses by hand, with the only limit on the number of conversation trees being the space on the hard drive. At this point I'd told my ex about it, but to her it was just a silly app she could use if she missed me; I had a much more serious goal in mind.
I basically got her to try it out while it logged any inputs it didn't have a response to, and by the next time I saw her I would have written a tree for the conversation I predicted we would have.
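To give an idea of how little code the core actually needs, here's a rough sketch of the idea. This isn't my real program: the file names and the one-line input|response format are just placeholders, and it does a flat exact-match lookup rather than proper conversation trees.

    // A minimal sketch of the approach, assuming a "responses.txt" file with
    // one "input|response" pair per line, written by hand, and an
    // "unknown_inputs.txt" file that collects anything the bot can't answer.
    #include <fstream>
    #include <iostream>
    #include <map>
    #include <string>

    int main() {
        std::map<std::string, std::string> responses;

        // Load the hand-written pairs.
        std::ifstream in("responses.txt");
        std::string line;
        while (std::getline(in, line)) {
            auto sep = line.find('|');
            if (sep != std::string::npos)
                responses[line.substr(0, sep)] = line.substr(sep + 1);
        }

        // Anything without a scripted reply gets appended here,
        // so a new branch can be written for it later.
        std::ofstream unknown("unknown_inputs.txt", std::ios::app);

        std::string input;
        std::cout << "> ";
        while (std::getline(std::cin, input)) {
            auto it = responses.find(input);
            if (it != responses.end()) {
                std::cout << it->second << "\n";
            } else {
                unknown << input << "\n";
                std::cout << "I don't know what to say to that yet.\n";
            }
            std::cout << "> ";
        }
        return 0;
    }

The important part for me is the unknown-input log; that's the file I write the new trees from each time.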
So I'll just get to the point: I think the end goal of seeing this project through would be to end up with a digital, interactive version of myself that could almost be immortalised (I now realise how narcissistic that sounds). It isn't achieved through particularly sophisticated algorithms, but through a small commitment of time, writing in a little more dialogue each day until I no longer can. After a couple of weeks of work it could hold a conversation for a good ten minutes; surely, with a lifetime of consistent development, it would contain a relatively complete version of what my personality was?
How would you feel about consolidating your personality into a computer program? Is there anyone you wish you could talk to as a program like this? There are people who are no longer with us whom I would love to have even a remnant of to speak to. It's such a basic concept technically, but I feel it could have quite meaningful consequences if applied consistently...
I think a similar idea was in either Snow Crash or Neuromancer; I forget which. I remember the simulated hacker requested that he be deleted! In one of the dragonrun games, disappeared hackers/deckers post on BBSs but seem to be imposters; the personality was incorrectly implemented.
I'll have to look them up; I find the whole concept quite interesting!
As for the novels, both those mentioned are worthy reads...
I don't know why I keep calling Shadowrun Dragonfall 'dragonrun'. I must stop!
I think it's definitely possible to create a chatbot that can respond as we might, but one having our whole personality... hmm, I am not so sure. Although most learning algorithms are far simpler than the minds they try to mimic, the databases required for them to reproduce our personality (literally every decision we ever make, or at least a large enough sample that we can be predicted!) would require far more computing power than we currently have. A chatbot that acts as we do and answers as we do is definitely possible, but one that actually has our opinions and personality? In the future, for sure, but not with our current technology.
Very interesting and informative article! I can't wait to hear where your program goes (please keep us updated! :D).
Having a background in programming, I find your idea very interesting. This reminds me of a story somebody told me... this guy was a sysadmin, and he set up some cron jobs to send random apology texts to his boss for when he was late for work (the trigger was not being logged in by a certain time).