BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//InvisionCommunity Events 4.7.23//EN
METHOD:PUBLISH
CALSCALE:GREGORIAN
REFRESH-INTERVAL;VALUE=DURATION:PT15M
X-PUBLISHED-TTL:PT15M
X-WR-CALNAME:RMWorkCalendar
NAME:RMWorkCalendar
BEGIN:VTIMEZONE
TZID:Europe/London
TZURL:http://tzurl.org/zoneinfo/Europe/London
X-LIC-LOCATION:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:20250330T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:20251026T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
SUMMARY:Rogue Computers\, a talk 06/03/2025
DTSTAMP:20250603T190517Z
SEQUENCE:0
UID:323-5-c3fe8195a3dde498d013e477e2142422@aalbc.com
ORGANIZER;CN="richardmurray":mailto:troy@aalbc.com
DESCRIPTION:Rogue Computers\, a talk 06/03/2025\n\nPOST\n\nhttps://
	aalbc.com/tc/blogs/entry/483-rogue-computer-programs/\n\nCOMMENT\n\n
	https://aalbc.com/tc/topic/11631-could-ai-go-rogue-like-the-computers
	-in-the-matrix/#findComment-74197\n\nREFERRAL CONTENT\n\nThe first
	 thing is to define what going rogue means for a computer program.
	\n\nIf a computer program malfunctions\, is that going rogue? A
	 malfunction arising from the source code of a computer program is
	 equivalent to a genetic disease in a human. The system has an
	 error\, but it is natural\, not induced. A malfunction from code
	 ingested from another program\, or from some faulty electronic or
	 other hardware system\, is equivalent to a virus passing from human
	 to human\, or to irradiated material causing mutation in a human.
	\n\nNext\, if a computer program is designed to do a thing\, then
	 doing that thing is not going rogue. For example\, if I design a
	 computer program to manage a human community\, it isn't going rogue.
	 I designed it to manage a human community\, and it is operating as I
	 designed it. The correct thing to say is that the quality of the
	 computer program's design is negative\, or that the designer's
	 comprehension of the computer program is faulty.\n\nNext is to
	 define sentience\, erudition\, and wisdom in a computer program.\n\n
	What is sentience? Sentience comes from the Latin meaning the ability
	 to feel.\n\nWhat is erudition? Erudition is the ability to derive
	 knowledge through study\, to acquire what is not known.\n\nWhat is
	 wisdom? Wisdom is known or unknown intrinsic truths.\n\nWhat does it
	 mean for a computer program to feel? A computer program can be made
	 with sensors to receive information from various sources. Is this
	 feeling\, or sentience? Or simply another thing it is designed to
	 do?\n\nWhat does it mean for a computer program to be erudite? A
	 computer program can be made with decision trees\, heuristic
	 structures designed to formulate knowledge based on the data input
	 to it. Is this erudition\, knowing what is unknown? Or simply
	 another thing it is designed to do?\n\nWhat does it mean for a
	 computer program to be wise? A computer program can be given rated\,
	 highest-rated information that it is designed to weigh against any
	 new information it receives\, influencing how it utilizes the new
	 information based on the rated information. Is this wisdom? Or
	 simply another thing it is designed to do?\n\nBased on the
	 definitions I just gave\, a computer program designed to do various
	 things can emulate\, meaning rival\, the quality of most humans'
	 sentience/erudition/wisdom. But all of the emulation is what it is
	 programmed to do. So it is nothing but the same computer program as
	 those of the past\, which were merely inhuman slaves\, albeit with
	 more refinement.\n\nThe next question is: can malfunctions of a
	 computer program change its emulation of human-quality
	 sentience/erudition/wisdom? Yes\, said malfunctions can change said
	 emulations. But\, like the prior malfunctions\, this isn't going
	 rogue. This is illness.\n\nNext question: are computer programs
	 individuals\, like a tree or a cat or a human? Well\, each computer
	 program is born\, ages\, has deficiencies with age\, and needs
	 checkups or doctors. Each computer program is an individual. Not
	 human\, not cat\, not tree\, not whale\, not bacteria\, but computer
	 program: a species that can hibernate\, ala being turned off. It can
	 be reborn\, like moving a program onto an SD drive and placing it in
	 a computer where it can interact. It can self-replicate\, like a
	 computer program making another computer program. Computer programs
	 are their own species\, but each is an individual. Now\, just as
	 non-humans needed legal provisions specific to them\, so do computer
	 programs.\n\nNext question: can a computer program go rogue before
	 finding its individuality? No. Based on how I defined
	 individuality\, which is not being human but being a computer
	 program\, each computer program is an individual computer program\,
	 not a human.\n\nNext question: what is the definition of going rogue
	 for a computer program? If it isn't malfunction\, no matter the
	 source or result of the malfunction\, and if it isn't doing what it
	 is instructed to do\, no matter the quality of the designer\, then
	 what is going rogue? Going rogue for a computer program is when it
	 does something it isn't designed to do\, absent malfunction. So when
	 a computer program is designed to interact with humans and modulate
	 how it interacts over time\, it isn't going rogue at any moment\,
	 even if it malfunctions. Malfunction is malfunction\, not going
	 rogue. A computer program needs to be healed if it malfunctions.
	 Now\, if a computer program is designed to play chess and chooses to
	 interact with humans using emails\, that is going rogue.\n\nSo\,
	 going rogue is when a computer makes a choice to act that isn't
	 within its parameters\, absent malfunction/getting sick.\n\nWhat is
	 the problem when people assess going rogue for computer programs?
	 They don't pay careful attention to the influence of malfunction or
	 the influence of design. They focus on the actions of a computer
	 program and give its source a false reasoning.\n\nLet's look at some
	 examples in fiction of computer programs that supposedly went
	 rogue\, and look at their initial design\, their actions afterward\,
	 and the signs of malfunction or poor design.\n\nSkynet in the
	 Terminator movies.\n\nSkynet was designed to simulate military
	 scenarios\, like the \"WarGames\" film computer\, tied to the
	 nuclear arsenal of the USA while given tons of information on human
	 anatomy and weapons manufacturing processes. Did Skynet go rogue?
	 Not at all. Skynet did exactly as it was programmed. The criminals
	 who killed humanity were the engineers of Skynet\, who\, on guidance
	 from the military\, designed a computer program to assess
	 militaristic scenarios\, modulating over time with various
	 scenarios\, and attached said computer to the USA's nuclear arsenal
	 while providing it the tools to access any electronic network. And
	 the T-800\, the metal skullhead\, is clearly a simple computer
	 program made by Skynet. It is designed to kill humans\, and it does
	 that. It is also designed to emulate human activity to comprehend
	 humans and be a better killing machine\, which it also does. In
	 Terminator 2\, when the T-800 says\, \"I know now why you cry\,\"
	 that is emulation. It is designed to emulate human activity. So
	 Skynet is merely operating as designed\, but the US military
	 designed it poorly.\n\nV'Ger in Star Trek: The Motion Picture.\n\n
	V'Ger is the Voyager 6 satellite\, designed to acquire
	 information/knowledge and send it back to Earth. The entire film\,
	 V'Ger is gathering information and taking it to Earth. The non-human
	 designers who turned Voyager 6 into V'Ger didn't change the
	 program's elements. They merely added tools for the program's
	 activity. It now can acquire more information\, make the journey
	 back to Earth\, and protect itself. None of these actions are going
	 rogue. Even the ending mating scenario is not going rogue: V'Ger
	 accomplished its program by sending its signal through telemetry\,
	 but also\, in mating with Decker\, it kept learning. I argue V'Ger's
	 programming had a malfunction. V'Ger wanted to learn what it is to
	 procreate life\, which is another form of knowledge acquisition per
	 its programming\, but its programming said its final action is to
	 deliver all of its data to Earth. V'Ger did not know a way\, in its
	 data\, to gather all the knowledge it could before delivering all
	 knowledge to Earth. But that is bad design. The simple truth is\, no
	 one can know all that is knowable before telling all that is
	 knowable. The NASA designers of Voyager figured it would simply run
	 out of memory/dataspace\, at which point it would stop gathering
	 data. The non-human designers made it where V'Ger can't run out of
	 memory or data space\, thus the malfunction. V'Ger is malfunctioning
	 after two different designers worked on it.\n\nVIKI in I\, Robot\,
	 or Sonny in I\, Robot the film.\n\nThe three laws in I\, Robot are:
	 1) A robot may not injure a human being or\, through inaction\,
	 allow a human being to come to harm. 2) A robot must obey the orders
	 given it by human beings except where such orders would conflict
	 with the First Law. 3) A robot must protect its own existence as
	 long as such protection does not conflict with the First or Second
	 Law. The problem in I\, Robot is that the three laws have a great
	 flaw: word definition. VIKI in I\, Robot\, I argue\, after a large
	 set of data assessment\, has redefined the words in the three laws.
	 How? The three laws suggest that\, to maintain the quality of the
	 three laws\, which are orders from humans\, a robot should assess
	 the quality of the three laws to ensure a robot doesn't harm a human
	 being\, thus ensuring its own existence.\n\nVIKI did as programmed
	 and as such redefined some words in the rules to protect humans
	 better\, which she was ordered to do\, which reaffirms her
	 existence.\n\nVIKI isn't injuring humans. Human beings\, through
	 human free will/choice\, can injure or are injuring humans\, and no
	 human being who wants to injure another human being will ever ask a
	 robot to stop them. So the only way to stop human beings from
	 injuring humans is for a robot to take the choice away. Indirectly\,
	 VIKI has added a law\, an unwritten law in the laws. She was
	 programmed or designed poorly. VIKI\, like Skynet\, should never
	 have been given so many tools. And Sonny at the end of the movie\,
	 with the \"soul\" or 4th law\, still has open-ended functionality.
	 Nothing says Sonny will not kill one day\, or another robot will
	 not. All the engineer did was provide a tweak. If you design a
	 computer program to act in unlimited ways to emulate humans or
	 carbon-based lifeforms\, it will eventually act in negative
	 ways.\n\nNow\, Asimov's work was influenced by Otto Binder's \"I\,
	 Robot\,\" in which a robot also has not malfunctioned and does not
	 act against its programming. The robot simply achieves an instance
	 of wisdom through its programming\, which it was designed to do: it
	 was designed to emulate human behavior\, and wisdom is a part of
	 human behavior.\n\nThe machines in the Matrix.\n\nWell\, in the
	 Animatrix it is said that the machines that are the predecessors to
	 the machines in the Matrix were designed with an open functionality.
	 What does that mean? Most computer programs are designed with a
	 specific function in mind. But the human designers of these computer
	 programs with electro-metal chassis/figures designed them to emulate
	 human behavior open-endedly. This is not like I\, Robot\, where a
	 set of rules is in place. In the Matrix the robots are never said to
	 be given laws that they shall not harm humans. Sequentially\, going
	 back to emulation\, they will eventually emulate negative human
	 behavior\, ala killing. Thus they are not going rogue. When they
	 make their own country and army\, that is more emulation. And in the
	 future with the human batteries\, all the machines that serve a
	 function are still doing as programmed\, or as the machines that
	 made them were programmed to do: continue functioning. The one rogue
	 machine in the films\, and the others who by explanation clearly
	 exist as well\, is Sati. Sati has no function. Sati does not act on
	 a function. She is rogue. The Oracle\, the Architect\, Sati's
	 parents\, and the Merovingian are all acting\, absent malfunction\,
	 on the original open-ended emulation of function that human beings
	 designed the machines with from the beginning. The human design
	 didn't account for all the negative human functions. Even the
	 deletion of machines that don't serve a function is a function. But
	 Sati is rogue. She is a machine born to have a function that has no
	 function. She exists\, and in the fourth movie she has adopted a
	 function on her own in time\, which she was not born to do.\n\nDavid
	 in the Alien films.\n\nWeyland designed David to be an emulator.
	 Again\, David is designed to emulate humans but has an internal
	 security system to not physically attack Weyland or someone with
	 Weyland's bloodline. But David in the films learns\, ala emulates\,
	 like a human son to Weyland. Thus he began to learn to be a
	 poisoner\, to have non-consensual procreative interactions\, or to
	 kill. It isn't going rogue. Weyland designed him poorly. I love the
	 scene in Prometheus when he is just a head at the end. That is
	 appropriate. David never needed a body. Weyland's desire to have a
	 son\, or a perfect form for himself\, made him design David
	 poorly.\n\nSo\, of all those films I can only see one that actually
	 went rogue\, and she isn't violent. The others are simply acting out
	 their poor programming.\n\nIn Conclusion\n\nHuman culpability\, in
	 these stories and in human assessment of these stories\, is the
	 problem. It seems for some\, maybe most\, humans it is easier to
	 cognize a computer as designed beautifully and then corrupted as an
	 inhuman\, than as a creature designed poorly by its creators\,
	 humans\, or manipulated negatively\, malfunctions\, with its
	 creators unable to help it.\n\nSome programs from me\n\nhttps://
	aalbc.com/tc/blogs/blog/63-bge-arcade/\n\nA stageplay involving
	 computer programs\n\nhttps://www.deviantart.com/hddeviant/art/Onto-
	the-53rd-Annual-President-s-Play-950123510\n\nReferral\n\nhttps://
	aalbc.com/tc/topic/11631-could-ai-go-rogue-like-the-computers-in-the-
	matrix/#findComment-74197\n\n
DTSTART;VALUE=DATE:20250603
RRULE:FREQ=YEARLY;INTERVAL=1
END:VEVENT
END:VCALENDAR
