I was just contemplating what we were talking about in class today, particularly the comments on the Chinese room, where a person is shown a symbol and has to react in whatever manner the rules require for that symbol. We had said that even though the person was acting according to the rules set out for each symbol, no actual learning (at least of the language) was occurring. I thought about that, and it could potentially be wrong. If, say, the symbol was 青 (ao - blue), and the action for that symbol was to push a blue button, then the system would be indirectly teaching the person what that color is called in the language. Even if they did not know how to say the word, they could look at the symbol and say "ok, this is blue," even outside the box. Now, whether that's actually the case I don't know, but I would say that either result is plausible.
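To make the scenario concrete, the room can be pictured as nothing more than a symbol-to-action lookup table. This is my own illustration, not part of the original thought experiment, and all the names in it are hypothetical:

```python
# The room's rulebook: each symbol maps to a required action.
# The operator follows the rules without "understanding" the symbols.
RULES = {
    "青": "push_blue_button",   # ao - blue
    "赤": "push_red_button",    # aka - red
}

def operate(symbol: str) -> str:
    """Look up the rule for a symbol and return the action to perform."""
    return RULES.get(symbol, "no_rule")

print(operate("青"))  # push_blue_button
```

The open question from the paragraph above is whether repeatedly performing `operate("青")` and pressing a blue button eventually teaches the operator the association, even though nothing in the rulebook explains what 青 means.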
This goes further with the concept of machines learning, or even being able to act in a human-like manner. The machine cannot do anything which it is not programmed to do, so, in theory, the machine cannot lie unless it is programmed to do so. Even then, the lies it could tell would be limited to what it was told it could lie about. If it was programmed to answer the question "are you male or female?" only with "I'm female" or "I'm male," then it couldn't say "I am a dog" (and if any programmer were dumb enough to put that option in there...). Basically, no matter how intelligent it may seem, it can do no more than it was programmed to do. In a sense, the coding of the robot/machine/computer is what it has "learned," and only by adding more information can it learn more.
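The point about lies being bounded by programming can be sketched as a machine whose every possible answer, truthful or not, is drawn from a fixed table. This is a hypothetical illustration of the argument, not a real system:

```python
import random

# Every answer the machine can ever give is listed here ahead of time.
# Even a "lie" must be one of these programmed options.
PROGRAMMED_ANSWERS = {
    "are you male or female?": ["I'm female", "I'm male"],
}

def respond(question: str) -> str:
    """Answer only from the programmed options; nothing else is possible."""
    options = PROGRAMMED_ANSWERS.get(question)
    if options is None:
        return "I don't understand"
    # It may pick either answer (and so may "lie"), but it can
    # never say "I am a dog" -- that string was never programmed in.
    return random.choice(options)
```

Adding a new answer to `PROGRAMMED_ANSWERS` is the only way the machine "learns" a new response, which mirrors the paragraph's claim that only added information extends what it can do.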
I might have mentioned this earlier, but even transferring a person's brain into code, and then implanting that code into a robot, would not really prove anything. If the code were a sort of backup of the person's brain, then if the shell and the brain/central intelligence were destroyed, the only thing that could be recovered would be whatever was last backed up. Anything after that would be lost, barring some other circumstance. Whether a 'being' like this would be capable of learning is questionable at best.
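The backup idea works like any snapshot-and-restore scheme: restoring recovers only what existed at backup time, and everything learned afterward is gone. A minimal sketch, with entirely hypothetical names:

```python
import copy

class Mind:
    """Toy stand-in for a 'brain as code' that accumulates memories."""

    def __init__(self):
        self.memories = []

    def learn(self, fact: str) -> None:
        self.memories.append(fact)

    def backup(self) -> list:
        # Deep-copy so later learning doesn't leak into the snapshot.
        return copy.deepcopy(self.memories)

    def restore(self, snapshot: list) -> None:
        self.memories = copy.deepcopy(snapshot)

mind = Mind()
mind.learn("the symbol 青 means blue")
snapshot = mind.backup()
mind.learn("something learned after the backup")
mind.restore(snapshot)  # the shell is destroyed and rebuilt from the backup
print(mind.memories)    # only the pre-backup memory survives
```

The restore discards the post-backup memory, which is the paragraph's point: whatever such a being "learned" after its last backup would not survive its destruction.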