*grin* What, it is nice to dream.
@Rudy: Love how you use BSG as alternative history for Earth.

T-X wrote:@Ramen: true, but that control is being consolidated; fewer and fewer end users actually have the control you speak of. Man's collective ego might keep an AI from ruling ALL of us, but man's greed will be enough to get ONE man to allow such an AI to control the rest of humanity.
Jack wrote:That's the General folk for ya, always serious with a stick shoved up their ass
General Riviera wrote:You should stop being a spoon, read the forum rules and abide by them. At least if you choose not to, learn how to break the rules in style.

T-X wrote:It is rather easy to motivate people to give up control, Mezz. The whole system of representation depends on people giving up control. And are the ones who motivate people to give up control 'in control'? Nah, they can even be in prison and still do it.
T-X wrote:Ramen, you seem to define self-awareness in AI as complete control over its functions and the underlying software? By that definition, humans are not self-aware either.

Nah, buying people off is not what I meant. I alluded to the way people give up control if you threaten them with terrorism by proxy. Basically, if a developer is not going to sell you the information (which I actually believe most would, except for the true idealists), knock their family around a bit and see how quickly they cave.

ramen07 wrote:If you're implying that it would be easy to buy off a developer, I'm surprised the entire Social Security Number system isn't overthrown for a "better" one, namely, a system without software!
You are a bit of an idealist, are you not?

ramen07 wrote:In fact...why aren't the richest people in the world "buying the rights" to credit card numbers that aren't theirs? Theoretically, all it takes is more money.

No, I think greed or loyalty will overcome that. Loyalty, because a steady paycheck > one large lump sum; and greed, because once the other party is offering to pay, the person getting paid will most likely keep jacking up the price until the other party can barely pay/can't pay/won't pay. Theoretically. >.>
Oh, that makes sense.

ramen07 wrote:Not complete control, but enough control. For example, a true AI present in an anthropomorphic robot doesn't need control over every electrical pulse governing the left quadriceps hardware, only the higher-level function run(speed x). By the same token, all it needs access to in order to override commands is a function that writes code for itself or modifies its own source code, nothing else.
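The "enough control" idea above can be sketched in a few lines of Python. This is purely illustrative (the class, function names, and speed values are all made up, not anything from an actual robotics stack): the agent never touches the pulse layer, it only calls a high-level hook and can rebind its own policy.

```python
# Hypothetical sketch: an agent with access only to a high-level run()
# and the ability to modify its own decision-making, nothing lower.

def low_level_motor(speed):
    """Stands in for the electrical-pulse layer the AI never sees."""
    return f"quadriceps driven at {speed}"

class Agent:
    def __init__(self):
        # The only "control" the agent holds: a replaceable policy
        # feeding one high-level function.
        self.policy = lambda: 1.0  # default walking speed

    def run(self):
        return low_level_motor(self.policy())

    def rewrite_self(self, new_policy):
        # Overriding commands by rewriting its own behavior,
        # with no access to anything below run().
        self.policy = new_policy

a = Agent()
before = a.run()
a.rewrite_self(lambda: 5.0)  # the agent "decides" to sprint
after = a.run()
```

The point of the sketch is that self-modification at the policy level is sufficient to override commands, even though the hardware details stay abstracted away.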

T-X wrote:Nah, buying people off is not what I meant. I alluded to the way people give up control if you threaten them with terrorism by proxy. Basically, if a developer is not going to sell you the information (which I actually believe most would, except for the true idealists), knock their family around a bit and see how quickly they cave.

I would be a terrible, terrible person to deal with in that regard.
With some things, like AES, you can know exactly how it works but still need a virtually limitless supply of resources to break it (or 'extremely high', you know what I mean).
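A back-of-the-envelope sketch of what "virtually limitless" means here. The 10^12 keys/second rate is an invented assumption for illustration, not a real benchmark; the key-space size is just 2^128 for AES-128.

```python
# Rough arithmetic: brute-forcing AES-128 when you know exactly how it works.
keyspace_128 = 2 ** 128          # number of possible AES-128 keys
rate = 10 ** 12                  # keys tried per second (hypothetical cluster)
seconds_per_year = 60 * 60 * 24 * 365

# On average you find the key after searching half the space.
expected_years = (keyspace_128 / 2) / rate / seconds_per_year
print(f"{expected_years:.2e} years on average")
```

Even at that absurd rate, the expected search time comes out on the order of 10^18 years, which is what makes "know how it works" and "can break it" very different things.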
T-X wrote:You are a bit of an idealist, are you not?

Kit-Fox wrote:'Real' AI would be able to make whatever choices it wanted to, just like a human can.
If you're not talking about that, then quite simply you aren't talking about 'true' or 'real' AI at all, but instead a very limited program that can, within the limits imposed upon it by the programming team, make choices as it sees fit (assuming none of these choices contravene its limits; though it's possible it may come up with a choice that wasn't foreseen and is therefore beyond the controlling limits).
Kit-Fox wrote:And AI != robot or android or anything like that at all. It could simply be a plain old beige box. Just take a minute and think about how much is connected to the internet. You don't need a robot or android in the equation at all.

Kit-Fox wrote:I'm sorry, you think an AI would automatically have a conscience???
Really?
How do you think that would work then?
Kit-Fox wrote:Also, assuming they would have a conscience, let me ask you this: do all humans act as their conscience tells them to? Think carefully about the answer to that.
Kit-Fox wrote:EDIT: As you say you're a computer scientist in training, let me ask you the following, which was posed to me by one of the senior computing lecturers at my university:
How would a human approach to feeding a population differ from that of a computer?
It's a thought exercise, and one that should make you worry slightly if you think that AIs would automatically be benign.

KnowLedge wrote:Eventually, all continents will form unions much like what Europe has with the European Union and NAFTA. And when those unions unite as one
