atomic_fungus (atomic_fungus) wrote,

#2900: One hundred to go a go go

*sigh*

Last night I decided I had a hankering for brownies, so I got up from the computer and hied myself out to the kitchen.

Well, the saucepan that I use for making the brownies was in the dishwasher, being washed, so I decided I'd make chocolate chip cookies instead.

...had the margarine out and softening in the microwave when the dishwasher switched to "dry" mode. I'd already started the cookies, so it was going to be cookies!

And dang, are they good. I ended up getting 47 cookies out of the batch. I made them with milk chocolate chips and pecans, and I can't stop eating them.

I'm taking a bag with me to Bible study tonight. Heh.

* * *

Last night I was listening to David Arkenstone while playing WoW. Arkenstone did some of the music for the game (particularly the music which plays when you're aboard ships or in certain taverns) and I had a hankering to hear some of his other music, so I threw in first "In the Wake of the Wind" (vintage 1990 as I recall) and then "Valley in the Clouds" which comes to us across the mists of time from 1987.

I have a couple tracks from the latter work on my MP3 player. I'm not sure when I converted the tracks; the file date is 9/27/2003 but I recall that those MP3 files came from a CD I had burned a couple years earlier...so God alone knows.

What I do know is that I was hearing stuff in the original CD that I don't hear coming from my MP3 player; and I concluded that it might be time to grab the CDs and re-rip the MP3s using a slightly more modern codec than I had in 2003.

Jeeze louise.

* * *

Incidentally: "Is Rick Perry dumb?" No, he speaks quite well. Unlike a certain sitting President who can't give a three-minute announcement without a teleprompter....

* * *

I was sitting here, playing Freecell, and thinking about Terminators.

Computers do what they're programmed to do. In Terminator 2: Judgment Day the T-101 has been reprogrammed by John Connor (future) to protect John Connor (present)...and the young John realizes gleefully, "You have to do whatever I say!"

So then he's trying to explain to the machine that it can't kill people.

"Why?"

"Because you can't!"

"Why?"

"Because you can't, okay? You just can't!"

...being--what, twelve?--he doesn't realize that he could simply say, "Because I said so!" and the machine would accept that as a valid reason. It doesn't care about morality, but it does care that it's programmed to accept all orders from John Connor as valid, and it doesn't matter if he's 13 or 30.

It got me to thinking about intelligent machinery in general--I don't think I need to make the case that the T-101 would pass the Turing test--and I came to realize that as long as your sentient machine is programmed properly, there will be none of this nonsense about the machine refusing to comply or going nuts and murdering everyone.

In Clarke's 2001, HAL-9000 is given conflicting orders and it ends up killing everyone aboard Discovery, except for Dave Bowman, who deactivates the computer and escapes into the monolith.

In practice, though, the computer would just throw an error code: "Hey, numb nuts! The order you just gave me contradicts this other order I've got. Straighten it out or I'm not going to work."

...if it were truly able to pass a Turing test, it would be smart enough to do that rather than accept both orders, try to execute both orders, and behave erratically.
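That "throw an error code" idea can be sketched in a few lines. This is purely a toy illustration of the point above, not anything from the film or the novel; the class names and the conflict rule are made up for the example.

```python
# Toy sketch: a command queue that rejects contradictory orders up
# front, instead of accepting both and behaving erratically.

class ConflictingOrders(Exception):
    """Raised when a new order contradicts one already accepted."""

def conflicts(a, b):
    # Hypothetical rule for the demo: "X" conflicts with "don't X".
    return a == "don't " + b or b == "don't " + a

class Computer:
    def __init__(self):
        self.orders = []

    def accept(self, order):
        for existing in self.orders:
            if conflicts(order, existing):
                # Complain and refuse, rather than try to do both.
                raise ConflictingOrders(
                    f"{order!r} contradicts {existing!r}; "
                    "straighten it out or I'm not going to work."
                )
        self.orders.append(order)

hal = Computer()
hal.accept("relay accurate data to the crew")
try:
    hal.accept("don't relay accurate data to the crew")
except ConflictingOrders as err:
    print("ERROR:", err)
```

The second order never makes it into the queue; the machine just reports the contradiction back to whoever gave it.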

Most of the stories in Asimov's I, Robot revolve around conflicts that arise due to the Three Laws of Robotics and various interactions between humans and robots. It works, though, because the robots are not very smart compared with humans--even ordinary humans--and the conflicts arise due to unforeseen circumstances, and manifest in odd ways.

So a robot is told to go collect some selenium from a pool on the day side of Mercury (story written when they still thought Mercury was tidally locked with the sun) and it turns into a prankster because it's dangerous to the robot to go near the pool of selenium. It gets close to the pool and the third law (self-preservation) kicks in; when the robot moves away from the pool the second law (obey humans) kicks in...and the thing oscillates back and forth, wandering lazily around the pool while babbling nonsense to itself.

(I think it was selenium. Whatever it was.)

Contrast that with the T-101 in T2; its mission is to protect the younger John Connor...and that includes allowing its own destruction in order to prevent a future which includes the T-101.

And the T-101 most assuredly is not programmed with the Three Laws of Robotics; I mean come on....

Somewhere in all this meandering is a point, and I'm not sure what it was. What I do know is that I haven't dwelled very much on the issues of sentient machinery in my SF world, despite the fact that it exists. So I may have to remedy that, somehow.