Frank Fiore – Novelist & Screenwriter

May 23, 2010

CYBERKILL: The Lost Chapters – Part Two

Filed under: CyberKill — Frank Fiore @ 9:48 AM

They say that an author has to be brutal with his or her edits.

The first few drafts of CyberKill were filled with quite a lot of technology and other information pertaining to government programs, in an attempt to make the novel as informative and pertinent as possible.

But in the course of some fine advice, editing and story polishing that I received, some of that material was deleted. It is interesting nonetheless.

So, here are some of the lost chapters of CyberKill that contained some very interesting information and technology for discussion.

BTW: These are the raw unedited and unpolished chapters.

This week – Just smart or intelligent? Top-down programming or bottom-up?

“So how’d it go?” Taylor asked as Cole walked into the team’s office.

“Worked like a charm,” Cole replied. “The programming worked flawlessly. Got you and Dallas to thank for that.” Catching himself, he said, “Oh, sorry. You too, KC.”

Stone looked up from the book he was reading and said, “Huh?”

“I said thank you for helping Dallas with the dust’s programming.”

“Oh, sure,” he said. “No problem.”

“What are you reading, KC?” asked Cole.

“Prey, by a guy named Crichton.” Looking up at Cole, he asked, “Travis, could that nano-dust in Unit 14 get intelligent? Like in this book?”

Cole just smiled and said, “KC, that’s fiction. What we’re doing here is reality. The dust is smart – not intelligent.” Adding in a self-mocking tone, “My programming is not that good.” Looking around the office, Cole said, “Where is Dallas, by the way?”

“Playing with that metal monster of his in the conference room,” Taylor replied.

“Jeez. I promised Bartley I’d talk to Dallas about Isaac and those battlebots of his. I better go find him.”

“Better you than me. Besides, that walking trash can gives me the creeps,” Taylor said. As Cole began to leave, she asked, “By the way, did you get caught up in that snafu with DirecTV?”

Her question brought back the whole dismal morning that Cole wanted to forget. “Yeah, and then some,” he said.

“What do you mean?” asked Taylor.

He told Taylor the litany of his woes that took place that morning.

“Sweet Jesus,” she said.

“Sounds like you’re having a Salmon Day,” said Dallas jokingly from the doorway.

“A what?” asked Cole.

“You know,” said Dallas, “the kind where you swim upstream all day, only to get screwed in the end.”

Taylor and Cole just rolled their eyes.

“Your car too?” said Stone.

“Yeah,” Cole replied. “Which reminds me, I need a lift to the lot this afternoon to pick it up. Can you give me a ride?”

“Not in her clunker,” Dallas said as he walked into the office, followed by his metal companion. “Every time she gasses up that heap, it doubles its value.”

Taylor turned, glared at Dallas and said, “A sharp tongue doesn’t mean you have a keen mind, asshole. And if you added your brains to your metallic fiend’s there, the both of you wouldn’t have the IQ of room temperature.”

“My friend, as you call him,” replied Dallas indignantly, “has a name. His name is Isaac and he is a modified Honda ASIMO. Modified by me, of course.”

Turning his back to Taylor, Dallas said, “Why don’t you just have the Colonel’s driver drop you off in his car? Ask him. He’s a good Joe.”

“Good idea,” Cole said. “And, why don’t you two learn to play nice, nice together?”

“So, how did the test go?” asked Dallas, changing the subject and his tone.

“Perfect,” replied Cole. “Look, I’ve been meaning to talk to you about Isaac there,” Cole said, pointing to the shiny white robot standing by Dallas’ desk. “Bartley wants him and your battlebots out of the Lab.”

“Oh come on, Cole. Can’t you stall him a little more? I have a big bout coming up. I still have some more work to do on Isaac and my battlebots.”

“What does Isaac have to do with your battlebots?” asked Cole.

“Oh, you haven’t seen the latest yet, have you?” Dallas said. “I’ve programmed him to help me build my battlebots.” Pointing to Isaac, he said proudly, “I’ve reconstructed his hands and programmed him for fine motor movements.” Picking up the remote control he had brought with him into the office, he said, “Watch. I’ll show you.”

As Dallas moved the little joystick on the remote with his thumb, Isaac moved toward Taylor. She quickly stood up from her chair and said, “Dallas, you idiot, what are you doing?” Isaac walked closer and closer, lifting his robotic legs one at a time in deliberate movements toward Taylor. “Quit it, Dallas! Stop it! That’s not funny!” she yelled, but Isaac moved relentlessly forward, forcing Taylor’s back to the office wall. When he had her cornered, Isaac reached up to her face, extended a cold metallic arm – and very gently grasped her hair in his fingers and stroked it.

Taylor just stood there motionless, the back of her head against the wall.

“See,” said Dallas. “Gentle as a baby.”

“Impressive,” smiled Dallas. “Are you impressed, Taylor?”

Taylor eased herself away from Isaac and his grip, and returned to her desk to regain her composure. “That thing could have killed me, you jerk!”

“Nope. No way. Not in his basic programming,” said Dallas.

“What do you mean?” asked Taylor in a belligerent voice.

“He’s an ASIMO. This new version is programmed with Isaac Asimov’s Three Laws of Robotics.”

Cole, being an AI professional and a Golden Age sci-fi buff, was quite familiar with the Laws. But Taylor was not.

“And what are those?” Taylor asked guardedly.

“Yeah,” said Stone taking his face out of the book. “I’d like to know, too.”

In an exaggerated manner, Dallas straightened up his tall frame, cleared his throat, and began his lecture. “The First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm. The Second Law: A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. The Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.” Looking around at his audience, he said, “Impressive, huh?”

“Yeah,” she said sarcastically, “impressive.” Under her breath, she mumbled, “Jerk.”

“So you see, uber-grunge girl, you were in no danger at any time. Isaac is incapable of harming a human being.”

“I don’t care,” she said. “The damn thing still creeps me out.”

“Hey,” Dallas said, “you might hurt Isaac’s feelings!”

Taylor made a grunting noise, shrugged her shoulders, and walked to the spot in the office furthest from Isaac.

“We can assume that’s why you named it Isaac?” Cole asked. “After Asimov?”

Dallas nodded and put down the remote. Pausing for effect, he said, “And now, in the words of the late-night commercials – but wait, there’s more! Not only does Isaac help me build my battlebots, he controls them in competition, too.”

“Controls them?” Stone asked.

“Yeah,” said Dallas. “You need a gimmick these days. The battlebot bouts are looking more like World Wrestling Federation matches – with bimbos and all.” Staring at Taylor’s ample bosom, he added, “That could be a nice career path for you, Taylor.”

“Pig!” replied Taylor.

“I can build on that, sister,” he said, winking at Taylor. “But, I digress. My gimmick is to have Isaac here control my battlebots through infrared commands – not me.”

“What kind of gimmick is that?” asked Stone skeptically. “You control Isaac with your infrared remote and he controls the battlebots with his. So what?”

Dallas paused a moment, again for effect, looked at Cole and said, “That’s not entirely true. You see, I don’t control Isaac with a remote. I programmed him to command the battlebots himself.” He continued, “Your common everyday battlebots are just glorified RC cars that use off-the-shelf RC controllers. Unfortunately, this makes them very difficult to drive. Things get hot and heavy in the ring and you have to make decisions fast; most times too fast for a human to make. So, I programmed Isaac to make those decisions for me and control the battlebots. You see, Isaac is programmed to learn – and learn quickly.”

“Does it work?” asked Stone.

“In tests, yes. But I won’t know for sure until I try him out at the next match.”

“So you programmed him to do this, huh?” asked Cole.

“No, not quite,” Dallas said in a sheepish voice. Looking at Cole, he said, “You did. In a manner of speaking.”

“I did?” Cole said in a surprised voice.

“Yes,” said Dallas. “I took the CA programming we used for the dust and, well, sorta adapted it to Isaac.”

“Took my programming?” Cole replied in a stunned voice.

“Uh huh. Now, by watching the bout and responding to a competitor’s moves, Isaac can learn and re-learn the moves necessary to defeat any opponent. Seems to work like a dream!”

Cole looked at Dallas in amazement and said, “Dallas, you probably broke several national security laws and…”

“And a few laws of nature,” Taylor piped in.

“You’re not going to tell ‘Colonel Sanders’ about this are you, Cole?” asked Dallas.

Cole just shook his head, but inside, he smiled at the ingenuity of his nonconformist teammate.

“OK,” said Cole, “But let’s keep this under our hats. Everyone agree?”

Dallas and Stone quickly nodded yes. Taylor was still thinking. “Taylor?” Cole said. “Agree?”

Taylor reluctantly nodded yes.

Stone, who was no longer reading his book but looking at Isaac, said, “You said that thing can learn and think on its own?”

“Yes,” replied Dallas.

“I didn’t like that thing before,” said Taylor. “Now I like it even less.”

“Whatcha griping about, Taylor?” Dallas said. “Toys and games have been intelligent for years. The smart toys and games have added enough complexity to their behavior that their actions are no longer predictable. We accept that as part of the fun. Hell, unpredictability has been a part of games since Peek-a-Boo. With AIBO, Furby, and the like, we surrender our desire to predict the actions of our technologies in exchange for more ‘entertaining’ behavior.”

“Like TiVo,” chimed in Stone. “It surprises us with its accuracy in choosing programs that we’ll like.”

“Right!” said Dallas. “And take the Roomba vacuum. It cleans your carpet and knows when you’re out of the room. Or modern jumbo jets – they have ‘fly-by-wire’ controls that prevent pilots from doing things that would be dangerous.”

“I agree that these behaviors make for a better experience,” said Taylor, “but at the cost of predictability. What if you don’t want to watch every cop show? What if the pilot really needs to do something that the software designers never thought of? Sorry, give me that ol’ time predictable control any day.”

“Spoken like a true top-down programmer,” Cole replied.

Turning to Cole, Taylor asked, “Speaking of programming, Travis, we know about your CA programming and such. But you never did tell us what you were doing with it at MIT.”

“Yeah,” Stone added. “What kind of project were you involved in up there?”

Cole pulled his chair closer to the three and said, “OK. If you really want to know.”

“Yes,” replied Dallas. “What were you doing there at the AI Lab that got the warheads here so interested in you? Inquiring minds want to know.”

“The short version is pretty simple,” Cole replied. “I created a very large, complex and interconnected region of cyberspace that I inoculated with digital organisms that were allowed to evolve freely through natural selection. That inoculation was my CA programming.”

“How can that simple CA programming you wrote for the dust produce complicated digital organisms?” asked Stone.

“The CA programming I used in my research project at MIT was a lot more complicated,” replied Cole. “As simple as CAs are, they come in an amazing variety of flavors. Some of the programs produce very simple, easy-to-predict output; some produce results that look, to the naked eye, completely random. And others, such as the Game of Life, are somewhere in between, producing results that display visible structure and yet are extremely difficult to predict.”
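BTW: for the technically curious, the “flavors” Cole describes are easy to see for yourself with Wolfram-style elementary cellular automata: one-dimensional CAs whose behavior is set entirely by a single 8-bit rule number. Here’s a minimal Python sketch of my own to illustrate (the rule numbers below are illustrative picks, not anything from the novel):

    def ca_step(row, rule):
        """One CA generation: each cell's next state is the bit of `rule`
        indexed by its (left, self, right) neighborhood; the edges wrap."""
        n = len(row)
        return [(rule >> (4 * row[i - 1] + 2 * row[i] + row[(i + 1) % n])) & 1
                for i in range(n)]

    def show(rule, steps=16, width=33):
        """Print the evolution of a single live cell under a given rule."""
        row = [0] * (width // 2) + [1] + [0] * (width // 2)
        print(f"Rule {rule}:")
        for _ in range(steps):
            print("".join(" #"[c] for c in row))
            row = ca_step(row, rule)

    show(250)  # simple, easy to predict: a repetitive expanding pattern
    show(30)   # looks, to the naked eye, completely random
    show(110)  # in between: visible structure, yet very hard to predict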

“Game of Life?” asked Stone.

“It’s an artificial life simulation invented by the Cambridge mathematician John Conway,” replied Taylor. “Correct me if I’m wrong, Travis, but doesn’t it consist of a collection of cells which, based on a few mathematical rules, can live, die or multiply? And depending on the initial conditions, the cells form various patterns throughout the course of the game.”

“That’s right, Taylor,” said Cole. “At any rate, my CAs were agent based.”
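Taylor has the rules right, and they really are that simple. Here is a minimal sketch of the Game of Life in Python, assuming a small wrap-around grid and seeded with the classic “glider” pattern:

    def life_step(grid):
        """Advance Conway's Game of Life one generation (wrap-around grid)."""
        rows, cols = len(grid), len(grid[0])
        new = [[0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                # Count the eight neighbors, wrapping at the edges.
                n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                        for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                        if (dr, dc) != (0, 0))
                # Live, die or multiply: a live cell survives with 2 or 3
                # neighbors; a dead cell with exactly 3 comes to life.
                new[r][c] = 1 if n == 3 or (n == 2 and grid[r][c]) else 0
        return new

    # The glider: five cells that crawl across the grid, generation by generation.
    grid = [[0] * 10 for _ in range(10)]
    for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
        grid[r][c] = 1
    for _ in range(4):
        print("\n".join("".join(".#"[cell] for cell in row) for row in grid), "\n")
        grid = life_step(grid)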

“What does that mean?” asked Stone.

“Well, you have two different approaches to creating artificial intelligence. The first approach, or let’s say school of thought, is the top-down programmers. They believe that to achieve a truly artificially designed intelligence, you need to program it first with everything it needs to know. Top-down AI programs, like flight simulators, tend to be complex, striving to recreate every detail of reality.”

“That’s what I agree with,” chimed in Taylor.

“I know,” replied Cole.

“And what school of thought are you from, Travis?” asked Stone.

“The second one, and the one in direct opposition to Taylor’s. Mine is what you would call the bottom-up approach. In bottom-up models, the rules are simple. They are intended as tools for thought experiments, rather than as fully accurate models of the real world. In my approach, you program the AI entity with a ‘skill set’ that allows it to learn and evolve. The CA does just one thing. You give it an input, and it produces an output. Give that output back to the program as input, and you’ve got a new output. And so on. Cellular automata, as you’ve seen, are typically simple enough to be implemented in a line or two of code, yet from these simple instructions, repeated thousands of times, they can produce very complex behavior.”
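And Cole’s “line or two of code” claim holds up. Here is the whole input-to-output-to-input loop, compressed (Rule 110 picked arbitrarily for this sketch):

    # One CA generation in a single expression; feeding each output back in
    # as the next input is the entire bottom-up simulation loop.
    step = lambda row: [(110 >> (4 * row[i - 1] + 2 * row[i] + row[(i + 1) % len(row)])) & 1
                        for i in range(len(row))]

    row = [0] * 30 + [1] + [0] * 30
    for _ in range(20):
        print("".join(" #"[c] for c in row))
        row = step(row)  # yesterday's output is today's input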

“Sounds more biological than logical,” said Stone.

“You’re right,” replied Cole. “My approach is influenced more by biological structures than by logical ones. Bottom-uppers don’t bother trying to write down the rules of thought. Instead we try to conjure thought up by building lots of small, simple programs and encourage them to interact.”

“And this is what you did at MIT?” asked Stone.

“Yes. You might say that I’m sort of a digital zoologist. I spent the better part of my Ph.D. program tracking the lives and deaths of digital organisms through millions of generations, and painstakingly classifying which species are best at certain kinds of tasks, before I started the research project at MIT.”

“What was the objective of the research project?” asked Dallas.

“I wanted to create a set of circumstances where I could set off a digital analogue to the biological Cambrian explosion of diversity, in which multi-cellular digital organisms would spontaneously increase in diversity and complexity.” Adding, “Creating, I hoped, an intelligence that is very different from our own.”

“Using parallel MIMD processes?” asked Taylor.

“Correct.”

“And what was the nature of this region of cyberspace?” asked Stone.

“It was basically a private peer-to-peer network on the web,” Cole replied. “It consisted of dozens of volunteer MIT students running my cyber-diversity reserve on their P2P machines. That allowed the multi-cellular digital organisms I let loose to move around the private P2P network. The objective, like I said, was to give them the opportunity to ‘learn’ and increase in diversity and complexity.”

“A cyber-diversity reserve?” questioned Dallas.

“That’s what I called it. In reality, it was nothing more than the volunteers’ PCs running the peer-to-peer client software on their machines.”

“You mean a file-sharing program like Gnutella or Morpheus?”

“Correct,” replied Cole.

“Is your project still running up there?” asked Stone.

“No. I had to terminate all my projects when I took the job here at the Lab. I really felt bad about it after putting all that work into it.”

“What did you do?” asked Taylor.

“I shut down the reserve by pulling the P2P network and sending out cancelbots to terminate the digital organisms.”

“Cool!” said Dallas, “You committed cybercide!”

“Well, that’s one way of looking at it,” said Cole rolling his eyes. “I wouldn’t put it in those terms. They were just digital programs.” Adding thoughtfully, “Would have liked to have seen the outcome, though.”

“Travis,” asked Taylor. “How do you know you cancelled all the entities?”

“What do you mean how do I know? I did,” replied Cole. “I closed down the reserve. If I missed any, they would have ‘expired’ because their environment disappeared.”

“Could any of the digital organisms escape the reserve?” asked Taylor.

“No way. It was a closed system.”

“Just asking. I mean, the reserve being a P2P network, running file-sharing software, on student machines most likely running other file-sharing programs like Gnutella or Morpheus…”

“And your point is?” asked Cole.

“Never mind,” Taylor said, seeing Cole getting testy. “Forget it. Doesn’t matter anyway.”
