Topic 33: Privacy, Anonymity, and Authentication
By: Sunday working group (teluride) on Sun, Jul 25, '93
5 responses so far

Privacy, Anonymity, and Authentication

This conference was devoted to discussion of the matters of assuring privacy in electronic communication and data management, the authentication of users in the network, and the capability of assuming an anonymous persona while involved with data communication.

5 responses total.

Topic 33: Privacy, Anonymity, and Authentication
# 1: Shulom Kurtz (teluride) Sun, Jul 25, '93 (09:48) 10 lines

Key issue: protection from threats such as surveillance and masquerading on the part of others, control of authorization to access communications, prevention of unauthorized modification of communication, and maintenance of data integrity.

Points discussed covered the matter of confidentiality through encryption of data (e.g., the Clipper chip, proposed for a hardware-based scheme which would permit limited access through escrowed keys held in different locations, and the Data Encryption Standard [DES] issued through the National Institute of Standards and Technology, formerly the National Bureau of Standards).

Topic 33: Privacy, Anonymity, and Authentication
# 2: Shulom Kurtz (teluride) Sun, Jul 25, '93 (12:56) 27 lines

Key issue: Networking is permitting the aggregation and distribution of highly personal data. This can, in many ways, violate an individual's right to privacy because no permission is asked or granted for such action. Examples are the sale by credit report agencies of credit history; by credit card issuers of amounts, kinds of purchases, frequency of purchases, etc.; by telephone companies of new connects; by hospitals of patient information!

Lotus Development Corporation proposed, at one time, to gather and combine such information into a single (inter)national database and to make that aggregation available for purchase by all comers. When that plan was made public, a huge public outcry against such an activity caused Lotus to "cancel" (postpone?) the project. The question mark on the parenthesized word originates from a belief that, at some future date, the public will become so inured to the "inevitable" that the plan may be resurrected.

Another obvious example is the fact that many classified government documents are aggregations of information available in the public domain. Author Tom Clancy's books dealing with many intelligence activities are based on such public resources, and he has been "interviewed" repeatedly with questions about how he obtained the classified information contained in his books.

Should some sort of control be instituted to prevent aggregation of data, either about individuals or governmental/industrial secrets? If so, how can that control be limited under constitutional constraints?

Topic 33: Privacy, Anonymity, and Authentication
# 3: Shulom Kurtz (teluride) Sun, Jul 25, '93 (13:00) 8 lines

Key issue: provision needs to be made for (non)repudiation of message traffic -- i.e., assurance that a message has/has not been sent, and on the flip side, assurance that a message sent has/has not been delivered or received.

This issue addresses those vital messages which must be transmitted in a timely manner and/or received and read by the addressee to assure completion of the communication loop.
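A minimal latter-day sketch of how such (non)repudiation can be provided with public-key signatures: the sender signs the message (proof of origin), and the recipient returns a signed receipt (proof of delivery). The use of the Python "cryptography" package and Ed25519 keys is an illustrative assumption, not something specified in the session.

    # Sketch: non-repudiation of origin and of receipt via digital signatures.
    # Assumes the third-party "cryptography" package; the key type (Ed25519)
    # and message contents are illustrative only.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Origin: only the holder of the sender's private key can produce this
    # signature, so the sender cannot later deny having sent the message.
    sender_key = Ed25519PrivateKey.generate()
    message = b"Please confirm the meeting time."
    signature = sender_key.sign(message)

    # Anyone holding the sender's public key can verify the origin.
    try:
        sender_key.public_key().verify(signature, message)
        print("origin verified")
    except InvalidSignature:
        print("signature does not match message")

    # The flip side: the recipient signs a receipt over the same message,
    # giving the sender proof of delivery that cannot later be repudiated.
    recipient_key = Ed25519PrivateKey.generate()
    receipt = recipient_key.sign(b"RECEIVED: " + message)
    recipient_key.public_key().verify(receipt, b"RECEIVED: " + message)
    print("receipt verified")

In practice a receipt would likely carry a timestamp and a digest of the message rather than the full text; that detail is omitted here for brevity.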
Topic 33: Privacy, Anonymity, and Authentication
# 4: Shulom Kurtz (teluride) Sun, Jul 25, '93 (13:12) 26 lines

Key issue: Need for verifiability of senders of messages, analogous to Caller ID available in the telephone network.

This is an issue of avoiding invasion of privacy, otherwise considered as a means of exercising control over uninvited interruptions of one's personal activity. The technology should provide for either or both of (a) a "bozo" filter to assure non-delivery of messages from specific individuals or of messages containing specific types of material, or (b) a "non-bozo" filter which would allow delivery only of messages from a selected group of individuals and reject all others (a sketch of both options appears after the participant list below). The (b) option has the disadvantage of potentially rejecting messages from unknown individuals whose content might nevertheless be of great interest to the recipient. The (a) option has the disadvantage of requiring identification of all "bozo" senders or "bozo" content so that the filter can be expanded to accommodate each new rejection criterion.

Telemarketers in the telephone community are a group of entrepreneurs who have elicited great antipathy on the part of those who are the object of their sales efforts. What will happen to the electronic data highway and the community of networkers when the marketers become more aware, technologically, of the potential for use of the highway? How can the user be protected from these (perhaps) unwanted intrusions, and how can the filter be set up to allow certain types of marketing efforts and reject others? There already exist "anonymous remailers" on the network which can mask the real source of such messages. How will the technology deal with this development?

Topic 33: Privacy, Anonymity, and Authentication
# 5: Shulom Kurtz (teluride) Sun, Jul 25, '93 (13:19) 16 lines

Participants in this session include:

Paul Lambert       Motorola, Inc., Scottsdale, AZ (session leader)
Shulom Kurtz       Hewlett-Packard Co., Englewood, CO (session recorder)
Janet Monroe       Wayne State University, Michigan
Lannie Boyd        Retired Professor, Colorado State University, Ft. Collins, CO
David Fodel        ArtLab (Non-Profit Organization), Boulder, CO
A. Spiros          Taos, NM
Andrew Currie      Software Developer, Boulder, CO
Maxine Kurtz       Management Consultant, Attorney, Denver, CO
Paul Ginsparg      LANL
Salye Stein        Telluride Resident
Peter Lent         Telluride Resident
Ann Branscomb      Program on Information Resources Policy, Harvard University
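Returning to the filter options (a) and (b) described in # 4, the sketch below shows one minimal way they might be realized. The message fields, example addresses, and phrases are illustrative assumptions and do not come from the session.

    # Sketch of the two filtering options from # 4: (a) a "bozo" filter that
    # rejects listed senders or content, and (b) a "non-bozo" filter that
    # delivers only from an approved list. All names here are illustrative.
    from dataclasses import dataclass

    @dataclass
    class Message:
        sender: str
        subject: str
        body: str

    def bozo_filter(msg, blocked_senders, blocked_phrases):
        """Option (a): deliver everything except listed senders or content."""
        if msg.sender in blocked_senders:
            return False  # reject known "bozo" sender
        text = (msg.subject + " " + msg.body).lower()
        if any(phrase in text for phrase in blocked_phrases):
            return False  # reject on content
        return True  # deliver

    def non_bozo_filter(msg, approved_senders):
        """Option (b): deliver only from approved senders, reject all others."""
        return msg.sender in approved_senders

    msg = Message("marketer@example.com", "Special offer", "Act now!")
    print(bozo_filter(msg, {"pest@example.com"}, {"special offer"}))   # False
    print(non_bozo_filter(msg, {"friend@example.com"}))                # False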