catless.ncl.ac.uk/Risks/7.46.html#subj2
The Risks Digest
Forum on Risks to the Public in Computers and Related Systems
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  MOD development standards (Lorenzo Strigini)
  Vincennes: Rules of engagement violated by AI heuristic?

----------------------------------------------------------------------

MOD development standards

Can anyone give a first-hand account of that lecture, or a more complete
citation, or somehow shed more light on the issue?

Lorenzo Strigini

----------------------------------------------------------------------

Vincennes: Rules of engagement violated by AI heuristic?
EDU> Wed, 7 Sep 88 00:00:32 PDT

A recent contribution noted that the Airbus shot down by the Vincennes had
been within binocular range of the ship, and inferred that binoculars were
superior to the Aegis system.

He jumps to his feet and says 'possible comair,' for commercial aircraft, to
the ship's commanding officer, Capt.

It was not the Aegis giving bad data; it was the Aegis giving a procedurally
*conclusive* categorization that, together with the duty-imposed rules of
engagement, caused what the military now boasts was a "prudent," albeit
automatic, killing of 290 civilians. Thus: from the moment of take-off, the
plane was formally characterized as hostile merely because the airfield was
not wholly civilian, and this characterization would remain definitively
"correct" until disproven by the flight's obeying the ship's radioed
warnings. The Aegis did its job and the Captain his mandated duty, and they
conclusively saved the Vincennes from the risk posed by a lumbering Iranian
Airbus that would not immediately respond to radioed warnings.

JCS Chairman Crowe explained that all fault lay with Iran, because it was
"unconscionable" for the Iranians to permit a civilian airliner to take off
amid hostilities (which the air controllers are simply presumed to have
known about) and to ignore warnings.
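The rebuttable presumption described above, hostile from take-off because the airfield was not wholly civilian, and "correct" until disproven by obedience to warnings, can be sketched as a toy rule. This is purely illustrative: the names and structure are invented for this sketch and do not reflect any actual Aegis logic.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A radar track, reduced to the two facts the rule consults
    (both attribute names are invented for this sketch)."""
    airfield_wholly_civilian: bool  # did the flight leave a purely civilian field?
    responded_to_warnings: bool     # has it obeyed the ship's radioed warnings?

def presumed_hostile(track: Track) -> bool:
    """The rebuttable presumption: a flight from a mixed military/civilian
    airfield is categorized hostile by default, and only a response to
    warnings can rebut that categorization."""
    if track.airfield_wholly_civilian:
        return False
    return not track.responded_to_warnings

# A civilian airliner that cannot or does not answer the warnings is
# classified exactly as an attacker would be:
airbus = Track(airfield_wholly_civilian=False, responded_to_warnings=False)
print(presumed_hostile(airbus))  # True
```

The risk the article identifies lives in the default branch: absence of evidence of innocence is treated as conclusive evidence of hostility.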
According to the NYT, Crowe asserted that the plane would have been shot
down IN ANY CASE, given the lack of proof that it was not hostile. Such
"shoot-on-suspicion" rules of engagement Crowe claimed to be wise policy.
That is, the shoot-on-suspicion rule of engagement described above was in
violation of the declared Rules of Engagement.

One natural question, not commented on in the Pentagon's report, is the
applicability of the word "panic," although the report notes: "At every
opportunity when the ship's internal communication link is silent, an
officer known as the tactical information co-ordinator calls the attention
of the other officers to his belief that the plane is accelerating and
descending."

----------------------------------------------------------------------

Interest has been expressed in the numerical/logical algorithms whereby
computerized sensors declare a detection as hostile. It provides a
comprehensive table of techniques, which includes Bayesian, frequentist,
maximum likelihood, evidential, pattern-matching, associative, syntactic,
and heuristic methodologies. A basic division is into "hard" sensors, which
declare an attack in binary form (yes/no), and "soft" sensors, which provide
a probability estimate that a detection is hostile.

I am confident that I quoted Cullyer, Leveson and others accurately.
However, I should emphasize that the skeptical comments regarding
statistical reliability estimation were limited to the context of *a priori
predictions* of the reliability of *software* - that is, predictions of
software reliability made prior to experience in the field. Regarding their
opinions on statistical reliability estimation and life in general, I cannot
say. I did note that Cullyer and others remarked that a priori estimates
could be useful for *hardware* systems, where failure histories for the
components were known.

It is necessary to distinguish *validation* from *certification*. Validation
is the technical process of determining whether a product conforms to its
requirements.
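The "hard"/"soft" sensor division mentioned above can be made concrete with a minimal sketch. One hedge up front: the fusion schemes below (k-out-of-n voting for hard sensors, a naive Bayesian odds update for soft ones) are generic textbook techniques chosen for illustration, not anything drawn from an actual Aegis-class system.

```python
def fuse_hard(votes: list[bool], k: int) -> bool:
    """'Hard' sensors emit only a binary attack/no-attack verdict; a common
    way to combine them is k-out-of-n voting."""
    return sum(votes) >= k

def fuse_soft(prior: float, likelihood_ratios: list[float]) -> float:
    """'Soft' sensors emit a graded reading.  Modelling each reading as a
    likelihood ratio P(reading | hostile) / P(reading | benign), a naive
    Bayesian update multiplies the prior odds by each ratio in turn."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# 2-out-of-3 voting over three hard sensors:
print(fuse_hard([True, True, False], k=2))    # True
# Two soft sensors, each twice as likely given 'hostile', lift a 10% prior:
print(round(fuse_soft(0.10, [2.0, 2.0]), 2))  # 0.31
```

Note what the hard form throws away: the binary verdict discards exactly the graded uncertainty a human would need in order to judge how conclusive the categorization really is.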
Nobody at COMPASS claimed that any validation technique was perfect,
although people did claim that some techniques were better than others.
Certification is the administrative act of releasing a potentially hazardous
product for sale or use. The necessity of basing a yes-no decision on
less-than-totally-conclusive technical information is the certifier's
dilemma.

----------------------------------------------------------------------

Burleson faces up to 10 years in jail and a $5,000 fine if convicted in the
trial, a first for the computer industry. Burleson was indicted on charges
of burglary and "harmful access" [sic] to a computer in connection with
computer damage at a securities firm, said Nell Garrison, clerk of the state
criminal district court in Fort Worth. Through his lawyer, Jack Beech,
Burleson denies the charges but has declined further comment. The firm has
been awarded $12,000 in a civil lawsuit against Burleson. Pretrial motions
were scheduled to be heard today, followed by jury selection, Garrison said.

Burleson is accused of planting a piece of computer software known as a
virus in the computer system at USPA&IRA Co. A virus is a computer program,
often hidden in apparently normal computer software, that instructs the
computer to change or destroy information at a given time or after a certain
sequence of commands. USPA officials claim Burleson went into the company's
offices one night and planted a virus in its computer records that would
wipe out sales commission records every month. The virus was discovered two
days later, after it had eliminated 168,000 records.

----------------------------------------------------------------------

I was about ready to nod off (again), but someone was knocking rather rudely
on the door. Someone had called 911; in fact, they had called 911 three
times in a row. I assured them that I hadn't called, but they wanted to look
around and make sure I didn't have any dead bodies lying around, so I ran in
and put some pants on and unhooked the chain on the door. They checked out
the living room, then headed to the bedrooms.
One bedroom is a bedroom and one is a computer center, radio room (ham), and
electronic scrap room (my play room). Why did they have their guns out? I
had forgotten that I had 2 UZI water guns hanging on the wall in my play
room; that, along with the radio, flashing lights, and other
terrifying-looking electronic gizmos in the room, must have spooked them a
little. Once they finally figured out that the guns were plastic and that I
didn't have any real bombs in the room, they put away their guns.

I have no phone on the line, so it must have been the computer calling
someone. I call a site with a phone number of 891-11xx, and from the logfile
I had called the site 3 times a short time before the police arrived. It
looked like Ma Bell had taken a little too long to give dialtone, and the
first digit was dropped.

We have had a variety of cases just like this in the past. But it serves as
another reminder of how easily it can happen.

----------------------------------------------------------------------

EDU> Tue, 6 Sep 88 22:41 MDT

Our local county government just worked a deal whereby, for a small fee
added to each customer's phone bill, the county's centralized 911 emergency
switchboard would be provided with a display of all incoming phone numbers
and addresses. I'm rather glad that the next time I call 911 all that
information will be communicated automatically (but I hope it will still be
verified orally whenever possible). However, I suppose that once we pay for
the installation of the necessary technology, the local telco will be able
to sell it as a service to other businesses. As previous notes have
suggested, there are many privacy issues to consider here, but there are
benefits to weigh as well.

----------------------------------------------------------------------

NET> Tue, 6 Sep 88 11:00:22 PDT

This discussion has gotten pretty far from RISKS. Consider a world in which,
when you wander into a shop with an idle question, the shopkeeper can,
without your permission, divine your identity. There's a world of difference
between "Good afternoon, what's your name?