David S. Stodolsky
Automation of Contagion Vigilance

Abstract

The very long latency between HIV infection and the appearance of AIDS imposes extensive information processing requirements on partner notification efforts. The apparently contradictory needs of maintaining the right to privacy of infected persons, while simultaneously providing information to persons at risk of infection, impose severe security requirements. These requirements can be satisfied by a Contagion Management System based upon networked personal computers of a kind now becoming available. Security of information is based upon cryptographic protocols that implement anonymous partner notification (contact tracing) and Privacy-Preserving Negotiation. The proposed scheme has the following properties: (a) Contact tracing is automated, (b) contacts remain anonymous, (c) sensitive information is kept private, and (d) risk-conscious users act as if sensitive information was public. Optimal health protection can thus be obtained while securing informational rights.

Key terms: Preventative health services, patient data privacy, real-time systems, distributed data bases, epidemiology.



Stodolsky, D. S. (1997). Automation of Contagion Vigilance. Methods of Information in Medicine, 36(3), 220-232.





Few would disagree with the statement in the "Global AIDS [Acquired Immune Deficiency Syndrome] Strategy" that, "the key to preventing HIV [Human Immunodeficiency Virus] infection is information and education [1, p. 393]." Information technology is already playing an important role in the global AIDS campaign. Personalized information provided at the moment it is needed, however, can affect behavior more surely than general information targeted to populations or groups.

Efforts to provide personalized and timely information, however, have already had to confront the dilemma of how to maintain the right to privacy of infected persons while simultaneously providing information to persons at risk of infection. Problems associated with partner notification, also called contact tracing, have generated some of the most divisive debates in the formulation of public policy on AIDS. Osborn [2, p. 38] has commented, "Perhaps the most heated political debates thus far have swirled around confidentiality and anonymity. . . . Concerns about confidentiality have become much more focused recently, as corollary issues of contact tracing and/or mandatory partner notification have entered the discussion.... Some members of groups at high risk of developing AIDS might avoid being tested or seeing physicians at all if they feared release of information about their infection to their partners."

Appropriately-structured secure information systems can help resolve the dilemma created by conflicting demands of privacy rights of HIV-infected persons and of rights to information of others, thereby reducing the need for mandatory measures and their potential counter-productive side effects. The system proposed here offers two innovations: (a) On the organizational level, a communication mechanism is outlined which ensures that when a person is diagnosed as having an infectious disease, all personal contacts are automatically notified, without revealing the identity of the diagnosed person; (b) on the personal level, a Privacy-Preserving Negotiation mechanism ensures that when a person wishes to make a contact, a decision to avoid the contact can be reached, without revealing whether this decision is based upon infection risk or lack of desire for that contact. This negotiation mechanism motivates at risk individuals to participate in the management of infectious agents, thereby improving contagion vigilance. A system with these inherent motivational components is called a Contagion Management System.

Structure of this Document

The objective of this document is to outline an information system for the tracing of infectious agents. First, terms will be defined and the objectives for such an information system will be discussed. Next, the relation between individual rights and protection of the public, and the impact of the proposed system within this context, will be briefly mentioned. General properties of transmissible agents will be discussed to highlight the overwhelming information processing requirements that can accompany partner notification efforts. An anonymous partner notification system using distributed databases will then be specified. Security problems with this system will be mentioned, and a secure partner notification system, that is, one which does not require full cooperation and complete honesty of all participants, will be introduced. Finally, a secure and anonymous partner notification system will be described. Possible applications will be mentioned, and reasons for choosing them as initial environments for system development will be presented.

Definitions

Partner notification is also called contact tracing. The term contact can be ambiguous, however, referring either to a partner or to a single event (transaction) during which infection can be transmitted. The anonymous partner notification method discussed below corresponds most closely to the provider referral type of partner notification, "The approach by which health care providers or other health workers notify an HIV-infected person's partners. . . . [1, p. 394]."

People are not likely to make sensitive information available unless it is securely protected. The proposed Contagion Management System uses both anonymity and privacy for information protection [3]. Anonymously disclosed information cannot be traced to its source, but message content is known. Private information, while having a known source, has its content hidden (Table 1) [4]. We claim that by appropriately combining anonymous and private messages, one can construct a secured informational environment that has the behavioral properties of a completely public informational environment. In particular, the proposed Contagion Management System is meant to permit people to behave as if there was completely open exchange of risk information, when there is not. A system which appropriately combines anonymity and privacy can, in effect, yield secrecy in terms of disclosed information (Table 1).

Table 1 and Figure 1 (PDF format)  

Secrecy, the primary information protection mechanism currently used, becomes unnecessary if other protections are available. This is desirable because secret information, which hides both message source and content, obviously cannot play a direct role in risk avoidance.

Confidentiality is based on a privileged relationship, which may be defined by a code of conduct. A code may specify a doctor to be a trusted party in a relationship with a patient. Within such a relationship, information is transmitted without any form of protection. The person receiving the information is typically required, however, to protect the source. This trusted party may then transmit information to others, for partner notification or statistical purposes. Because changes in public policy may force disclosure of information revealed in confidence, and because human error or misjudgment may occur, this form of protection is inherently weaker than those not requiring a trusted party. Also, the limited availability of such trusted persons may make this form of protection impractical. Finally, such relationships may not be appropriate where enhancement of responsible behavior is critical. The proposed system does not require trusted parties for information protection.

We should also note that, however desirable having a trusted party may be, many individuals have great difficulty in establishing a climate of trust, and in certain social environments doing so may be virtually impossible. It is precisely in these environments that the system proposed here provides a solution, one which permits the reestablishment of trust.

The above terms, as defined in this document, have a more precise meaning than they do in common usage. In fact, they may even appear at first to be used in a manner that is in conflict with common usage, in some instances. The precise meanings of terms are specified by the procedures of the Contagion Management System. In this document, the focus is upon management of biological agents transmitted by direct contact. However, the model can also be applied to other types of contagion. The spread of computer viruses and of rumors, the transmission of habits, and the adoption of technical innovation, have all been modeled as contagion processes. Thus, for example, the model might be applicable in ensuring that proprietary information does not leak from one company to another in a joint venture, where companies may be both cooperating and competing with each other, depending upon which departments are communicating. The scope of application will become more obvious as it becomes apparent that it is actually risk that is being directly managed by the system.

If people heed risk information, then contagion barriers can be said to result, that is, barriers created by being informed. Just as a condom, if used correctly, creates a physical barrier to transmission of infection, the Contagion Management System, if used correctly, creates a virtual barrier to the transmission of infection. This virtual barrier is created by an automated exchange of information.

Individual Rights and Public Health

The impact of the specified system on individual rights is a question of key importance. "The 'Global AIDS Strategy' emphasizes the need to protect the rights and dignity of HIV-infected persons [1, p. 393]." Information rights are an important element of the overall rights question, particularly in the AIDS pandemic. Bayer [5, p. 96] comments, "Protection of the public's health is dependent on respect for privacy."

It is somewhat fashionable to focus on personal privacy at the expense of other human rights and aspects of dignified treatment of other persons. Consequently, we stress the following: Individuals have a right to keep sensitive information private, but also a right to avoid infection and an obligation to prevent transmission of infection to others. Within the framework of the Contagion Management System, protection of personal data and of public health are mutually supportive objectives. The information system specified here is in the spirit of free and open societies and would dramatically enhance the protection of citizens' rights. However, I leave it as an exercise for the reader to consider such impacts. The major objective of this document is to define contact tracing using a secure distributed database. It specifies the anonymous transmission of partner notification information from a health care system to individuals. When it comes to the use of this information by individuals, a powerful privacy protection mechanism is needed, one which does not compromise others' right to know in order to achieve its objectives. One such mechanism [6] is outlined in the appendix.

Admittedly, the information system described here would make possible an unprecedented degree of data accessibility and control over the most private areas of social life. The potential for abuse would be enormous. While people would be expected to have misgivings because of this, the distributed nature of the system mitigates these risks. However, it appears that this same type of control is evolving within the highly centralized systems for information management of governments and corporations. Such centralization could well strip the individual of basic rights without any fundamental changes in current legal and regulatory frameworks.

Partner Notification using Distributed Databases

Typically, partner notification occurs after the diagnosis of an index (infected) person. Most often this is after the exposure of the partner to the infectious agent. One of the major features of the system specified here is preemptive notification. Preemptive partner notification occurs prior to a risky contact, thus permitting it to be avoided. A system supporting preemptive partner notification is more likely to be effective as compared to traditional partner notification, because individuals can use timely information for self protection. Traditional partner notification techniques most often limit individuals to protecting others, since they themselves have already been infected by the time notification occurs.

Timely information delivery assumes greater importance with preemptive partner notification than in traditional partner notification, since the operation of the preemptive system depends heavily upon motivation of a large number of individuals. The success of the system can be seen as dependent upon a competition between biological and informational agents. If informational agents, indicating risk, propagate more rapidly (along the infection chain) than biological agents, then an epidemic can be controlled. One way to ensure this is to make the transmission of informational agents at least as easy as the transmission of biological agents, such as HIV.

Classes of Transmissible Agents

An overview of the properties of transmissible agents can be helpful in understanding the potential advantages of the proposed system. A clear understanding of preemptive partner notification requires distinction between biological agents, informational agents making demands upon attention, and informational agents that require only processing by machines. If each person were to inform a potential partner of all transmissible agents carried by that person, then we could say that risk information preceded each transmissible agent. This would give persons the option of avoiding contact with selected biological agents. (While this is hardly realistic, some persons have found it prudent to give this type of information in "personals" columns to avoid the possibility of legal action against them [7].)

Informational agents demanding attention. Aside from the privacy problems and diagnostic uncertainties which would reduce the effectiveness of such a procedure, there are major demands upon attention and memory associated with it. Particularly in the case where there is a high prevalence of an infectious agent in a population, the simple communication of diagnostic information would be inadequate. In most instances of sexually transmitted disease, at least one new person has been infected by the time a given individual has been diagnosed. Thus, a person would have to communicate not only their own diagnostic information, but also the diagnostic information from previous contacts. Some information concerning a partner would only become available long after that contact had taken place, thus placing unrealistic attentional and memory demands upon communicators.

Informational agents processible by machine. A solution to this problem is to structure diagnostic data in standardized machine readable forms, thus permitting information (both from diagnostic tests and from risk alerts transmitted through partner notification) to be exchanged by computers prior to an anticipated contact. This strategy also compensates to some degree for diagnostic uncertainties, since what is most often transmitted by automated partner notification is information about risk, as opposed to direct diagnostic information. A further advantage of an automated partner notification mechanism is that it can be implemented in a manner offering superior protection of personal data [3, 8].

Effective protection of personal data, however, can massively increase the amount of data that must be processed, making machine processing essential. For example, one way to untraceably receive a message (i. e., receive it without others being aware that you are the receiver) is to have it distributed to all persons. Without automatic sorting, this easily leads to information overload. These considerations suggest that informational agents processible by machine offer major advantages in the control of epidemics, but only when cost effective technology is available.

Communicating Diagnostic Information

A major objective of this section is to show how sensitive information about risk can be communicated without compromising persons' information rights. This is required if information about infection is to be available for partner notification. For explanatory purposes, an anonymous partner notification system using distributed databases will be specified first. A secure partner notification system, that is, one which does not require full cooperation and complete honesty of all participants, will then be introduced. Finally, a partner notification system which is both secure and anonymous will be described.

Anonymous partner notification. Anonymous messages have an unknown source, but a known content (Table 1). The implicit content in this case is, "You may have been exposed to HIV." In the simple communication system described in this section on anonymous partner notification, the transmitted information identifies a transaction deemed to be at risk for transmitting the virus. No personal identity information is transmitted. In fact, with this simple anonymous partner notification system, the source can only be known to the participants in the transaction (and then only if just two persons are involved), since any participant would transmit exactly the same message (i. e., "You may have been exposed to HIV").

The anonymous partner notification system outlined in this section is most easily explained if we assume that each user has a personal computer capable of directly exchanging information with those of other persons. These computers can, in the simplest case, generate random numbers that are added together in order to label transactions. After each transaction, therefore, each involved person has a unique label or code for that transaction in his or her database. Other types of labels, such as ones based upon the time of the transaction could be used, but several problems must be avoided. First, the label must not make it easier to identify the persons involved in the transaction. A time based label could be used to separate the user population into those who could not have been involved and those who could have been involved in a transaction. Second, transaction labels must be unique, or the chance of duplication made vanishingly small. Third, the label must not be deterministic or dictated by one person, or it could be made to contain information that reveals identity. Adding two large random numbers together to generate a label is an easy way to avoid these (and other) potential problems. The objective is to capture and transmit only needed information, so as to thwart potential misuse or security breaches.
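The labeling rule just described can be sketched in a few lines of code (Python is used here purely for illustration; the function names are hypothetical). Each party contributes a large random number, and the shared label is simply their sum, so no single participant can dictate the result and the label carries no identity or timing information:

```python
import secrets

LABEL_BITS = 128  # large enough to make duplicate labels vanishingly unlikely

def contribute() -> int:
    """Each party's computer contributes a fresh large random number."""
    return secrets.randbits(LABEL_BITS)

def transaction_label(contributions: list[int]) -> int:
    """The shared label is the sum of all parties' contributions.
    No single party dictates the result, so the label cannot be made
    to encode identifying information by any one participant."""
    return sum(contributions)

# Both parties store the same label; neither controlled it alone.
a, b = contribute(), contribute()
label = transaction_label([a, b])
assert transaction_label([b, a]) == label  # order-independent
```

Using a cryptographically strong source such as `secrets` (rather than an ordinary pseudorandom generator) matters here, since predictable contributions would let an observer link labels to persons.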

In the event that a person becomes ill or is identified as carrying an infectious agent, the transaction codes which label transactions during which that agent could have been transmitted to or from the ill person are then broadcast to all other computers. If a receiver's computer has a matching code, then that person is alerted to the possibility of the agent's presence, and can report to a medical center for testing and treatment. This iterates the process, thus identifying all carriers eventually. The effect is to model the epidemiological process, thereby identifying all (potential) carriers through forward and backward contact tracing.
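This broadcast-and-match cycle can be sketched with a hypothetical model (the class and variable names are invented for illustration). Three persons, A, B, and C, record labels for two transactions; when B's codes are broadcast, only the devices holding a matching label raise an alert, and the broadcast itself reveals nothing about whose codes they are:

```python
class PersonalDevice:
    """Hypothetical sketch of a user's computer in the anonymous scheme."""

    def __init__(self, name: str):
        self.name = name
        self.labels: set[int] = set()  # labels of this person's transactions
        self.alerted = False

    def record_transaction(self, label: int) -> None:
        self.labels.add(label)

    def receive_broadcast(self, codes: set[int]) -> None:
        # An alert is raised only on a local match; non-participants
        # learn nothing from codes that match no stored label.
        if self.labels & codes:
            self.alerted = True

# A and B share label n1; B and C share label n2.
pa, pb, pc = PersonalDevice("A"), PersonalDevice("B"), PersonalDevice("C")
n1, n2 = 1111, 2222  # stand-ins for randomly generated labels
pa.record_transaction(n1); pb.record_transaction(n1)
pb.record_transaction(n2); pc.record_transaction(n2)

# B is diagnosed; the Health Center broadcasts B's codes to everyone.
for device in (pa, pb, pc):
    device.receive_broadcast({n1, n2})
# A (backward trace) and C (forward trace) are both alerted.
```

Iterating the same step from each newly alerted person's codes would model the epidemiological process described above.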

In order to clarify the procedure, consider a scenario in which there are two types of actors, persons (Pi) and doctors (Di). Doctors operate only within a Health Center. There are also two types of agents, biological and informational, that can be transmitted during a transaction. Informational agents are always transmitted (not true of doctor-patient interactions). Each actor has a computer that can exchange information with another actor's computer. A doctor's computer can also broadcast messages to all actors at once by sending them through a more powerful transmitter at the Health Center.

================== Figure 1 here ==================

Contact tracing is illustrated by the sequence in Figure 1. At time T1, person A (Pa) and person B (Pb) engage in a transaction. Their computers label this transaction with a number N1 and store the number. Pb then physically moves into contact with person C (Pc); this transaction is labeled N2 and recorded at time T2. At time T3, Pb becomes ill and reports to a doctor (Da). The doctor verifies the infectious nature of the illness and then reads the transaction codes, N1 and N2, out of Pb's computer. These are broadcast to all other computers at time T4. When Pa's computer receives the broadcast, the transaction code N1 matches the number stored in memory. This alerts Pa to the fact that s/he may be in the chain of transmission of the infection (in this case, Pa was the initial carrier of the infectious agent). When Pc's computer receives the broadcast, the transaction code N2 matches the number stored in memory. This alerts Pc to the fact that s/he may have been infected (at T2). The alerting of Pa is an example of backward tracing from Pb; the alerting of Pc is an example of forward tracing. We assume, in this simplest case, that when an alert is received, the affected person voluntarily reports to a doctor.

In a more secure system, a person's computer would not be capable of generating new transaction codes if a matching code from an earlier transaction had been received. This would indicate to potential new partners that contact with this person was risky. Unfortunately, such a security feature would violate one of the fundamental principles underlying the specified system, that each person control their own database. Failure to receive some transmitted information could also result in a failure to preempt risky contacts. In order to overcome limitations of this simple anonymous partner notification model, security mechanisms must be introduced.

Secure partner notification.
A problem with the simple notification system discussed above is the cooperation and honesty required of participants. Even failures of omission, that is, failure to receive alerts or update one's database, could result in others being placed at risk without notification. A secure system would require the exchange of health certificates that could be used to trace risky contacts in a manner similar to that possible with the randomly generated transaction codes. Each person would receive a new set of health certificates daily. These would be time stamped and signed by the Health Center. Before engaging in a risky contact, persons would exchange health certificates, or, more precisely, their computers would. This exchange would be part of the automated negotiation process.

In the case of illness, a person would be required to report to the Health Center to obtain fresh health certificates. If the person was found to be carrying an infectious agent, s/he would provide the certificates s/he had received during the period of potential disease transmission. The Health Center could then require partners to report for testing before issuing them new certificates. Iteration of this cycle results in comprehensive contact tracing and prevents further spread of the infectious agent.

With this model, a delay of one day could elapse between the notification of each person along the chain of infection. If each person deposited the certificates they had received each day, this delay could be avoided, permitting all persons along the risk chain to be tested simultaneously. Requiring a person to return their unused certificates daily could also ensure against failure to disclose contacts (e. g., system malfunction causing a loss of data) or acceptance of outdated certificates, which would not incorporate the risk resulting from transactions that had occurred after the certificates were issued.
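A minimal sketch of this certificate cycle might look as follows (all names and data structures are hypothetical, and plain records stand in for the signed, unforgeable certificates a real system would require). The Health Center records who holds each certificate it issues, and partners surface only through the certificates a sick person surrenders:

```python
from datetime import date

class HealthCenter:
    """Hypothetical sketch of the certificate-issuing authority."""

    def __init__(self):
        self.issued: dict[int, str] = {}  # certificate id -> holder
        self.must_test: set[str] = set()  # holders flagged for testing
        self.next_id = 0

    def issue(self, holder: str, today: date) -> dict:
        # Fresh, dated certificates are withheld from anyone flagged.
        if holder in self.must_test:
            raise RuntimeError(f"{holder} must be tested before reissue")
        cert = {"id": self.next_id, "date": today.isoformat()}
        self.issued[self.next_id] = holder
        self.next_id += 1
        return cert

    def report_illness(self, received_certs: list[dict]) -> None:
        # Partners are identified only through the certificates the
        # sick person hands over, not through named contact lists.
        for cert in received_certs:
            self.must_test.add(self.issued[cert["id"]])

center = HealthCenter()
today = date(1997, 1, 1)
cert_a = center.issue("Pa", today)  # Pa hands this to Pb during a contact
center.report_illness([cert_a])     # Pb falls ill and surrenders Pa's certificate
# Pa is now required to test before receiving fresh certificates.
```

The sketch also makes the scheme's weakness visible: the Center's `issued` table links certificates to identities, which is exactly the concentration of sensitive data discussed next.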

This secure partner notification scheme would rely upon health care provider confidentiality to protect the identity of partners. Especially in the case where exchanged certificates were returned to the Health Center daily, a substantial amount of sensitive information would be concentrated there and could be released as a result of a breach of confidentiality. Such a risk would be unacceptable to many persons.

Secure and anonymous partner notification. We have described a system for anonymous partner notification which had severe limitations, due to the complete cooperation required of participants and the system's sensitivity to failures in information transmission. These limitations could be overcome to some degree by placing certain information in the individual's computer beyond their control. This was deemed unacceptable because it violated the principle that persons fully control their own database; loss of this control would introduce security risks of its own. We then described a secure partner notification system that depended upon health provider confidentiality for the protection of sensitive data. While this avoids the security problems of the first system, it compromises data protection to such a degree that the system would not be acceptable in most societies. A mechanism that provides the anonymity of the first described partner notification system and the security of the second described system will be outlined below.

Mechanisms satisfying the requirements of both security and anonymity implement what has been termed unobservability [9]. These mechanisms ensure the authenticity of certificates without requiring identification of participants. A mechanism could be based upon the use of a cryptographically-secure pseudonym system for construction of health certificates [10, 11]. Functions required for this type of system include digital signatures, linked digital certificates, and untraceable pseudonyms and electronic mail.

A digital signature system permits the electronic transmission of certificates that can be proven to originate from a claimed source. This can be used to ensure that health certificates cannot be forged. However, if these were exchanged in transactions directly, it would permit recording of the identities of persons in all transactions. We will call these certificates the ID Set, since the person is identified to the Health Center by their signature on this set of certificates. A system of linked digital certificates allows the individual to transform his or her ID Set into a new form, which permits the use of these same unforgeable certificates under a different pseudonym (i. e., random number). We call this the Free Set, since the person can freely exchange these in transactions, without revealing the identity they have registered with the Health Center. This second pseudonym is the one which they use to communicate with the Partner Notification Service. This Notification Service is functionally distinct from the Health Center, but the system's security makes separate organizations unnecessary. The availability of untraceable pseudonyms and electronic mail makes possible secure communication while hiding identities.
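The linkage-breaking step can be illustrated with a textbook RSA blind signature of the kind introduced by Chaum, which is one way (though not the only one) to realize such certificates. The sketch below is illustrative only, with toy key sizes and no padding: the Center signs a blinded value and so never sees, and cannot later recognize, the pseudonym it certifies:

```python
import secrets
from math import gcd

# Toy RSA key pair for the Health Center (illustrative sizes only).
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Center's secret)

def blind(msg: int) -> tuple[int, int]:
    """Person blinds the pseudonym before submitting it for signature."""
    while True:
        r = secrets.randbelow(n - 2) + 2
        if gcd(r, n) == 1:
            break
    return (msg * pow(r, e, n)) % n, r

def sign(blinded: int) -> int:
    """Center signs the blinded value; it never sees `msg` itself."""
    return pow(blinded, d, n)

def unblind(blind_sig: int, r: int) -> int:
    """Person removes the blinding factor, leaving a valid signature."""
    return (blind_sig * pow(r, -1, n)) % n

pseudonym = 1234                    # person's random pseudonym, reduced mod n
blinded, r = blind(pseudonym)
signature = unblind(sign(blinded), r)
assert pow(signature, e, n) == pseudonym  # verifiable, yet unlinkable to `blinded`
```

Because the blinding factor `r` is random, the Center cannot match the signature it later sees in circulation with the blinded value it signed, which is precisely the unlinkability the Free Set requires.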

In order to obtain a fresh Free Set each day, the person must transform their stale (more than one day old) Free Set, received untraceably from the Partner Notification Service, into an ID Set. This ID Set is transmitted to the Health Center, which checks to see if all members of the set (e. g., one hundred certificates) are present (meaning no testing is required), and then issues a fresh ID Set. The person then transforms this fresh ID Set into a fresh Free Set, which can be used in transactions. This system will now be explained in greater detail, and with the inclusion of the procedure by which the Notification Service initially obtains a Free Set from the person.

The system can be explained in terms of an analogy using carbon-lined envelopes. A signature mark on the outside of such an envelope transfers to a paper slip on the inside. A person obtains an initial set of, say, one hundred date validated slips by presenting them in such envelopes to a Health Center. The Center signs the envelopes and then returns them. The person throws away the envelopes and keeps the slips. These certificates indicate the person is free from infection (in the simplest case).

The person selects two different random numbers (pseudonyms), writes one on the left side and the other on the right side of each slip, makes a copy of each slip, and returns them in envelopes with windows on the left to the Health Center, as the ID Set. The copies, in envelopes with windows on the right, the Free Set, are returned in an untraceable batch to the Partner Notification Service at the same time. These Free Set envelopes are retained as certificates that can be exchanged by presumably healthy persons. Copies of these Free Set envelopes, which show the Center's dated health status stamp, are exchanged during transactions on the first day.

If someone becomes sick, they return certificates they have received in suspect transactions to the Notification Service. These are then removed from the Free Set store. Each subsequent day the stored Free Set is broadcast. Each person picks out their own Free Set (i. e., the certificates showing their pseudonym). People transfer the slips contained in their Free Set (right window envelopes) to left window envelopes which they transmit to the Health Center as their ID Set. Only if the ID Set is complete (i. e., all one hundred are present) is their ID Set returned signed with the Center's dated health status stamp. The person can then transfer the slips into right window envelopes generating a new Free Set. These can then be exchanged in transactions where they are presented as evidence of health status.

Failure to receive even one element of the Free Set would require the person to appear at the Health Center for testing. This would also permit the capture of additional certificates involved in possible transfer of infectious agents, thereby implementing an automatic and privacy preserving contact tracing mechanism.
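This completeness rule reduces to a simple set comparison (a hypothetical sketch; `SET_SIZE` and the function name are invented for illustration). A fresh set is issued only if every certificate of the old set comes back; any certificate withheld by the Notification Service forces its holder in for testing:

```python
SET_SIZE = 100  # e.g., one hundred certificates per daily set

def renew(returned_certificates: set[int], issued_ids: set[int]) -> bool:
    """The Health Center reissues only against a complete set.
    A certificate spent in a suspect transaction is withheld by the
    Notification Service, so the set comes back incomplete and the
    holder must report for testing instead."""
    return returned_certificates == issued_ids

issued = set(range(SET_SIZE))
assert renew(set(issued), issued)        # complete set: fresh set issued
assert not renew(issued - {42}, issued)  # one withheld: testing required
```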

This analogy is a simplification of both the security and coordination mechanisms appropriate for any real world system. Proof of the certificate mechanism is given in [12]. Untraceable sending of the Free Set is detailed as the communication mechanism in [10].

In discussion of the secure partner notification system, two options for the storage of transaction data were mentioned. In the first case, information was stored only in personal computers, in the second, information was transferred to a central database. Even in the second case, there would be data that only existed in personal computers for a certain time, that is, prior to the daily transfer of exchanged certificates to the central database. In any practical system, these types of delays, security preferences of individuals, and technical factors related to connectivity between central and distributed databases would influence the effectiveness of the system.

Possible Application Development

The sensitivity of the data protected by the proposed system suggests it be tested in some less demanding applications first. Stodolsky [13] proposes that a version of the anonymous communication system presented above be used to control infectious agents in computer networks. Stodolsky [14] outlines the use of a pseudonym-based security mechanism to support peer review in computer-mediated conferencing. Stodolsky [15] proposes that the same security mechanism be used in connection with blood and tissue donation; in addition to protecting the privacy of donors whose blood is found to be infected, the approach aims to enhance the safety of donated blood. These application development contexts represent a sequence of environments requiring increasingly specialized systems and managing increasingly sensitive data.

While the control of computer viruses is less subject to hardware constraints and less limited by sensitivity of data, applications for such environments are not directly transferable to biological environments. However, some theoretical analyses developed in connection with computer viruses suggest that the type of countermeasures discussed here can be highly effective [16, 17, 18]. Experience with computer networks also verifies the importance of anonymity as a component in an infection management strategy [19]. On the other hand, the transition from the system for donation management to that suggested in this document is primarily one requiring production of special purpose data collection hardware. Thus, donation management can be seen as an appropriate test environment for the proposed system.

     Rationale and Summary

"In using surveillance data to describe the natural history of an unknown disease, it may be possible to initiate control and preventive measures before the actual etiologic agent has been identified if the means of transmission and/or essential co-factor can be identified [20, p. 1105]." Seale [21] has suggested that improved surveillance technology has become essential: "The greatest threat is posed by slow virus diseases, like AIDS and BSE (Bovine spongiform encephalopathy), which can spread silently far and wide before sounding any alarms. It is of interest that [the discoverer of HIV-1] warned of even more virulent infectious agents than HIV-1 which now threaten mankind. 'The greatest danger lies', he said, 'in non-conventional viruses that produce no immune reaction' (Montagnier, L., Interview: Luc Montagnier. Omni, December 1988: 102-134)." Mann [22] has stated, "The sheer volume of movement of people and goods, across all borders, has created a qualitatively new situation which is ideally suited to the global spread of disease.... HIV may be the first virus to take advantage of this uniquely modern opportunity, but it would be a fatal error to assume it is the last.... We should invest now in creative thinking about how a sensory network can be created that will be capable of seeing what has not been recognized before, and sometimes to listen -- like Watson and Holmes on the moors -- for the bark which doesn't occur. A traditional, passive surveillance system would almost certainly miss the mark."

The system described here permits real-time surveillance of disease symptoms, regardless of identification of an etiologic agent. It also incorporates control and preventive measures as an inherent feature, thus eliminating delays in applying the findings from surveillance efforts. The results from computer virus simulations confirm the existence of a low counter-contagion threshold, that is, a level of contact tracing adequate to dampen the spread of an infectious agent. This has also been called an epi-epidemic threshold, because it can be modeled as "a sort of anti-virus epidemic ... riding on the back of the virus epidemic [17, p. 4]." One of the results suggests that if only twenty percent of contacts are traced, there is a dramatic drop in the spread of the agent. Thus, adoption of the Contagion Management System by a small fraction of the at-risk population could stop the AIDS pandemic.
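The threshold effect can be illustrated with a toy deterministic SIR-style model in which contact tracing adds an extra removal term. This is a sketch only; the function name and all parameter values are illustrative assumptions, not estimates taken from the cited simulations.

```python
# Toy SIR-style model: contact tracing acts as an extra removal rate.
# All parameters below are illustrative assumptions.
def epidemic_size(trace_rate, beta=0.4, gamma=0.1, days=1000, i0=0.001):
    s, i, r = 1.0 - i0, i0, 0.0          # susceptible, infected, removed
    for _ in range(days):
        new_inf = beta * s * i           # new infections this step
        removed = (gamma + trace_rate) * i   # recovery plus traced isolation
        s -= new_inf
        i += new_inf - removed
        r += removed
    return r                             # final fraction ever infected

untraced = epidemic_size(0.0)
traced = epidemic_size(0.1)              # modest tracing effort
print(f"final size without tracing: {untraced:.2f}")
print(f"final size with tracing:    {traced:.2f}")
```

Even a small tracing term sharply reduces the final epidemic size in this sketch, consistent with the qualitative threshold behavior reported in the computer virus literature.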

The expansion of human populations into new environments continues to lead to contacts with new and sometimes dangerous viruses. Even without this risk, increasing population density could create an environment in which benign agents become virulent [23, 24]. An examination of worldwide trends and the history of recent viral epidemics supports the view that current disease surveillance is inadequate to suppress a new pandemic. Factors as varied as social disorders, widespread use of antiviral drugs, lack of trained personnel, and the rapid mutation rates of RNA viruses contribute to the likelihood that a new epidemic could not be contained [25]. Thus, we suggest that the surveillance system proposed here is an essential option in the context of global health management.

     References

[1] World Health Organization [WHO] (Carballo M, submitter). Partner notification for preventing HIV transmission. Journal of Sex Research 1989; 26(3): 393-9.

[2] Osborn JE. Public health and the politics of AIDS prevention. Daedalus 1989; 118(3): 123-44.

[3] Stodolsky D. Data security and the control of infectious agents. Abstracts of the cross disciplinary symposium at the University of Linkoeping, Sweden: Dept Communication Studies, 1986.

[4] Stodolsky D. Decision processes in a democracy. Self-management 1979; 6(2): 26-34.

[5] Bayer R. AIDS, Privacy, and Responsibility. Daedalus 1989; 118(3): 79-99.

[6]  Stodolsky, D. (1990, March 11). Toward Personal Risk Management: Information Technology Policy for the AIDS Pandemic. Comp.groupware [Usenet].

[7] Vermont Member. HSV+. The Helper 1989; 11(4): 4.

[8] Stodolsky D. Personal computers for supporting health behaviors. Stanford, CA: Department of Psychology, Stanford University, 1979. (Preliminary proposal)

[9] Burk H, Pfitzmann A. Value transfer systems enabling security and unobservability. University of Karlsruhe, Faculty of Informatics, Institute for Computer Design and Fault Tolerance, 1988. (Working paper 2/87, available from the Informatics Library at the Faculty)

[10] Chaum D. Security without identification: Transaction systems to make big brother obsolete. Communications of the ACM 1985; 28(10): 1030-44.

[11] Chaum D. Showing credentials without identification: Transferring signatures between unconditionally unlinkable pseudonyms. Auscrypt '90. Sydney, Australia: University of New South Wales, 1990.

[12] Chaum D, Evertse J-H. A secure and privacy-protecting protocol for transmitting personal information between organizations. Proceedings of Crypto '86. New York: Springer-Verlag, 1987. [Lecture Notes in Computer Science].

[13] Stodolsky D. Net hormones: Part 1 - Infection control assuming cooperation among computers, (1989). [Machine-readable file]. Van Wyk, KR. Several reports available via anonymous FTP. Virus-L Digest, 1989; 2(77). Abstract republished in van Wyk, KR. Virus papers (finally) available on Lehigh LISTSERV. Virus-L Digest, 1989; 2(98). (Available via anonymous file transfer protocol from LLL-WINKEN.LLNL.GOV: File name "~ftp/virus-l/docs/net.hormones" at Livermore, CA: Lawrence Livermore National Laboratory, Nuclear Chemistry Division and IBM1.CC.LEHIGH.EDU: File name "HORMONES NET" at Bethlehem, PA: Lehigh University. And by electronic mail from LISTSERV@LEHIIBM1.BITNET: File name "HORMONES NET" at Lehigh University)

[14] Stodolsky D. Protecting expression in teleconferencing: Pseudonym-based peer review journals. Canadian Journal of Educational Communication 1990; 19(1): 41-51. (Communication Research and Theory Network [CRTNET] 1989; No. 175. [Semi-final draft available by electronic mail from LISTSERV@PSUVM.BITNET at University Park, PA: The Pennsylvania State University and COMSERVE@Vm.ecs.rpi.edu at Troy, NY: Rensselaer Polytechnic Institute.])

[15] Stodolsky D. Personal integrity and the safety of donated blood (Personlig integritet och blodkvalitet). Proposal submitted to the Committee for Social Research (Delegationen foer social forskning), Social Department, Stockholm, Sweden, 1987.

[16] Kephart JO. How Topology Affects Population Dynamics. Proceedings of Artificial Life 3, Santa Fe, NM, 1992.

[17] Kephart JO, White SR. Measuring and modeling computer virus prevalence. Proceedings of the 1993 IEEE Computer Society Symposium on Research in Security and Privacy, 1993; Oakland, CA.

[18] Kephart JO, White SR, Chess DM. Epidemiology of Computer Viruses. IEEE Spectrum, 1993; 30(5): 20-6.

[19] Rochlis JA, Eichin MW. With microscope and tweezers: The worm from MIT's perspective. Communication of the ACM 1989; 32(6): 689-98.

[20] Evans AS, Brachman PS. Journal of Chronic Disease, 1986; 39(12): 1105-24.

[21] Seale J. Crossing the species barrier - viruses and the origins of AIDS in perspective. J R Soc Med 1989; 82: 519-23.

[22] Mann J. The global lesson of AIDS. New Scientist June 30, 1990: 30.

[23] Yoon CK. What might cause parasites to become more virulent. Science 1993; 259: 1402.

[24] Herre EA. Population structure and the evolution of virulence in nematode parasites of fig wasps. Science 1993; 259: 1442-5.

[25] Garrett L. The next epidemic. In: Mann JM, Tarantola DJM, Netter TW, eds, AIDS in the world. Cambridge, MA: Harvard University Press, 1992: 825-39.

=====================================================

     Acknowledgments:

The author is grateful to David Chaum and Joergen Hilden for assistance in preparing this article.

===================================================

     Appendix: Privacy Preserving Negotiation

In the body of this document, a computer-network based system for anonymous partner notification (contact tracing) is described. This system provides for secure and protected distribution of infectious risk status information from a health care system to personal computers. However, the sensitive nature of such information may make it difficult to use without risking social discrimination. On the other hand, failure to use such information may result in exposure to infection. A method for blocking the spread of infection without a release of sensitive information is presented here. We call this method Privacy Preserving Negotiation. People who have seen drafts of this appendix often ask, "How can privacy exist in an exchange of information between two people? Surely you mean that the exchange of information is private from others!" However, there can be a hiding of message content in a two-person exchange, and this is the key point of the technical sections that follow.

Conditional Privacy

Conditional privacy, as defined here, permits information to be revealed only if the other party reveals corresponding information. Ideally, this information sharing is complete if the condition is satisfied, but no information is transferred otherwise. The interaction model ensures that sensitive information is released only in a mutual exchange, that is, for example, when both persons are HIV positive. Otherwise, and this is the surprising feature, no sensitive information is revealed. Since social discrimination is not a factor in an interaction when persons reveal identical information, such information is no longer considered sensitive. Also, the "surprise value" of the information is reduced under mutual exchange.

First, an ideal physical model for conditional privacy with one bit of information exchange is described. Next, a model and protocol for automatically performing this information sharing are given. Then, a multistage model which enhances information hiding is presented. Finally, the risk of compromise in an actual application environment is considered.

Single Stage Models

The concept of privacy, as suggested above, is defined in terms of message contents (see table in body). We will limit this analysis to a two-person interaction in which anonymity is not possible. That is, when only two persons are exchanging information, content hiding is the only type of information protection available, since there can be no doubt about the message source.

In the explanation that follows, questions are always structured so that a Yes reply reflects sensitive information. One bit of information is exchanged when a Yes is transmitted by both sides, if we assume no prior knowledge. A No reply is never considered sensitive and in most cases is not very informative. A reply communicates information to one's computer, but this information is not transmitted between computers, and thereby revealed, except under mutual exchange.

The prototype for these procedures can be called One-Bit Matchmaking. If both he and she have a Yes for each other, then the match takes place and the information is revealed, but if either one has a No, the person with a Yes does not wish to reveal it (Table 1). This is a logical And function: the output is a Yes if and only if both inputs are Yes. The One-Bit Matchmaking protocol can be repeated to exchange more information.

     Table 1

Privacy Preserving Negotiation (One-bit Matchmaking)

 Input by person:    Information revealed      Match
                     by person:                succeeds?
   A      B             A      B
 ----------------------------------------------------------
  Yes    Yes           Yes    Yes                Yes

  No     Yes           No      -                 No

  Yes    No             -     No                 No

  No     No            No      -  (or  -  No)    No

"-" indicates that no information is revealed.

An ideal physical model. A physical model of this process can be based upon light passing through a translucent material. Each person places an opaque card in a translucent envelope. A Yes is represented by a card with a hole in its middle. The covered cards are placed together in another translucent envelope. Next, they are viewed against a powerful light source which can penetrate the translucent material. If light comes through the holes in the opaque cards, then the match succeeds: Both persons have transmitted a Yes. If either has transmitted a No, a completely opaque card has been used, and no light is visible.

Asymptotically secure models. A practical procedure can be illustrated by a model that uses playing cards in envelopes. Each person places, say, a hundred cards in envelopes. This number of cards limits the privacy failure rate of the protocol to about one time in two hundred, as we shall see. (A privacy failure does not result in an incorrect decision, but in the unwanted release of information.) A person represents a Yes by placing even-numbered cards in all envelopes. A No is represented by placing an odd-numbered card in one of the envelopes and even-numbered cards in the rest. Each person puts their envelopes on the table. They take turns in randomly selecting one of the other person's envelopes. That envelope is opened: If the card is odd-numbered, the procedure terminates without a match. In that case, a No has been transmitted by the person whose odd-numbered card was displayed, whereas the other person has not definitively revealed anything (Table 1). If the card is even-numbered, they continue opening envelopes. If all cards are displayed and they are all even-numbered, the match succeeds, indicating a mutual Yes.
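The card-and-envelope procedure above can be sketched as a small simulation. This is a simplified illustration only: the card values, the default of one hundred envelopes, and the function names are assumptions made for the sketch.

```python
import random

def make_envelopes(answer_yes, n=100):
    """A Yes is n even-numbered cards; a No hides one odd-numbered
    card among n - 1 even-numbered cards (the envelope model)."""
    cards = [2 * random.randrange(1, 1000) for _ in range(n)]  # all even
    if not answer_yes:
        cards[random.randrange(n)] = 2 * random.randrange(1, 1000) + 1  # one odd
    return cards

def negotiate(a_yes, b_yes, n=100):
    """Alternately open randomly chosen envelopes; stop at the first
    odd card. Returns True only on a mutual Yes."""
    a, b = make_envelopes(a_yes, n), make_envelopes(b_yes, n)
    order_a = random.sample(range(n), n)   # B's random opening order for A
    order_b = random.sample(range(n), n)   # A's random opening order for B
    for i in range(n):
        if b[order_b[i]] % 2:              # A opens one of B's envelopes
            return False                   # B said No: match fails
        if a[order_a[i]] % 2:              # B opens one of A's envelopes
            return False                   # A said No: match fails
    return True                            # all cards even: mutual Yes

print(negotiate(True, True))    # mutual Yes: match succeeds (True)
print(negotiate(True, False))   # a hidden No ends the match (False)
```

Note that the simulation reproduces the decision outcome of the protocol; the privacy property concerns how many of the Yes-sayer's cards have been exposed when an odd card finally appears.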

Amount of information released. If the odd-numbered card appears in the last envelope opened, when all of the other person's envelopes have been opened and found to contain even-numbered cards, this other person has disclosed his Yes to a No-sayer. This privacy failure clearly happens in one of two hundred Yes-No encounters. This risk can be reduced to an arbitrarily low level by increasing the number of cards.

Suppose a small fraction f of the population is infected (i. e., answers Yes to, "Have you been tested HIV positive?"). If two persons, Negative and Positive, adhere to this card-revealing protocol, what will Negative know about Positive after Negative has revealed an odd-numbered card (thereby indicating that Negative is not infected)? Suppose that there remain m hidden cards on Positive's side. By Bayes' theorem:

Prob (P is saying Yes | all revealed cards even) = f / [f + (1 - f)(m / 100)].

This probability is 1 when m is zero, as already hinted at, but is otherwise always small, as long as f is much less than 1/100. If a larger fraction of the population is infected, then significant amounts of information could be released even without a privacy protocol failure. However, the informativeness or surprise value of such information would be low in a population where infection rates are high.
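The posterior above is easy to compute directly. In this sketch the function name and the default of one hundred cards are assumptions for illustration.

```python
def prob_yes_given_even_so_far(f, m, n=100):
    """Posterior probability that the other party is saying Yes, by
    Bayes' theorem, after all opened envelopes contained even cards
    and m of the n envelopes remain hidden."""
    return f / (f + (1 - f) * m / n)

# With f = 0.001 (one in a thousand infected):
print(prob_yes_given_even_so_far(0.001, 100))  # nothing opened: near the prior
print(prob_yes_given_even_so_far(0.001, 10))   # most envelopes opened
print(prob_yes_given_even_so_far(0.001, 0))    # all even: certainty, 1.0
```

The posterior climbs toward 1 only as the last envelopes are opened, which is why increasing the number of cards drives the privacy failure rate arbitrarily low.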

The release of information that could be used to infer positive status is not likely to be a practical problem, unless the same potential partners repeat the negotiation process at a rate much greater than the likely rate of change in the sensitive information. Thus, an appropriate delay period between negotiations with the same potential partner would reduce this risk. Maximum protection of sensitive information would be provided in the limiting case, where negotiation would occur only once with each potential partner. (This assumes no collaboration between potential partners.)

Protocol implementation. This model can be implemented by using a one-way function. This is a function that is easy to compute in one direction, but infeasible to compute in the other. A card is represented by a number that may be even or odd: odd means stop the matching attempt, and even means continue it. Cards are placed in envelopes, as it were, by encoding them with the one-way function. Each party then shows a list of, say, one hundred encoded numbers (envelopes). When one of these is selected, the number (card) which produced it must be supplied. To verify that the number (card), odd or even, actually produced the encoded number (envelope), the one-way function is applied again.

Thus, in purely mathematical terms, the entire negotiation proceeds by agreeing (a) on a function, (b) on how many encoded numbers to exchange, and (c) on the meaning of odd and even numbers. Each side then produces and sends the encoded numbers. First, one side selects an encoded number at random and the other side shows the unencoded number used to produce it. The unencoded number is tested by encoding it with the function and verifying that the result equals the previously selected encoded number. Then the other side does the same. This continues until an odd number is produced or the numbers are exhausted.
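One side of this exchange can be sketched with a cryptographic hash standing in for the one-way function (an assumption; the article does not name a specific function). A random salt is added to each card before hashing, a detail beyond the text's sketch, because low-entropy cards could otherwise be recovered by trial encoding.

```python
import hashlib
import secrets

def commit(card):
    """Seal a card in an 'envelope' with a one-way function (SHA-256).
    The random salt prevents guessing the card by trial encoding."""
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + str(card).encode()).hexdigest()
    return digest, (salt, card)

def verify(digest, salt, card):
    """Re-apply the one-way function to check an opened envelope."""
    return hashlib.sha256(salt + str(card).encode()).hexdigest() == digest

# One party commits to one hundred even-numbered cards (a Yes).
envelopes = [commit(2 * secrets.randbelow(1000) + 2) for _ in range(100)]
digests = [d for d, _ in envelopes]        # only the digests are sent

# The other party selects an envelope at random; the opening is verified.
choice = secrets.randbelow(100)
salt, card = envelopes[choice][1]          # the committer supplies these
assert verify(digests[choice], salt, card)
print("card", card, "is", "even" if card % 2 == 0 else "odd")
```

A full negotiation would run this commit-select-verify step alternately on both sides until an odd card is opened or all cards are exhausted, as described above.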

A Multistage Model

Superior information hiding can be achieved with a multistage model (i. e., one Yes-No question is negotiated after another). Assume that all interacting persons are equipped with personal computers that they always use when contacts are organized. The computers negotiate to determine whether a risk would result from contact. If risk would result, the persons wish to avoid the contact; however, they do not want to reveal sensitive information. We limit the current example to sensitive information about infection with a single virus.

We assume repeated use of the One-Bit Matchmaking protocol. In the initial stage of the algorithm, both parties' computers respond to the question, "Have you been tested HIV positive?" In stage two, their computers respond to the question, "Do you want contact (with the potential partner)?" The negotiation process starts automatically when two computers are close to each other. Thus, the fact of information exchange is not revealing by itself. It is assumed that one can transmit information to one's computer privately at any time. Specifically, one can operate the "Want contact?" switch for the algorithm below unobtrusively.

The rules for exchange of information that have been programmed are (Figure 1):

  : if you satisfy the condition (i. e., have been
tested positive), then
     Stage 1: Send a Yes.
     Stage 2a: If the potential partner also sent a
Yes (in stage 1), then Send a Yes, if you want
contact.
     Stage 2b: If the potential partner sent a No (in
stage 1), then always send a No.

  : if you do not satisfy the condition, then
     Stage 1: Send a No.
     Stage 2: Send a Yes, if you want contact,
otherwise send a No.

Figure 1 (PDF format)
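The rules above can be condensed into a short sketch. The function and parameter names are hypothetical, and the One-Bit Matchmaking exchanges themselves are abstracted away; only their outcomes are modeled.

```python
def two_stage_negotiate(a_positive, a_wants, b_positive, b_wants):
    """Two-stage Privacy Preserving Negotiation (the Figure 1 rules).
    Returns True if contact is permitted."""
    # Stage 1: match on "Have you been tested HIV positive?"
    stage1_match = a_positive and b_positive
    # Stage 2: a positive person whose partner sent a No in stage 1
    # always sends a No; everyone else sends their true desire.
    a_stage2 = a_wants if (not a_positive or stage1_match) else False
    b_stage2 = b_wants if (not b_positive or stage1_match) else False
    # Contact requires a mutual Yes in stage 2.
    return a_stage2 and b_stage2

# Discordant status: contact blocked despite mutual attraction.
print(two_stage_negotiate(True, True, False, True))    # False
# Concordant status: mutual desire suffices.
print(two_stage_negotiate(False, True, False, True))   # True
print(two_stage_negotiate(True, True, True, True))     # True
```

Note that in the discordant case the blocked party cannot tell, from the outcome alone, whether the other side was positive or simply unwilling, which is the information-hiding property the figures illustrate.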

When an HIV positive person and an HIV negative person are attracted to each other, the contact is blocked. To simplify the explanations, we assume potential partners are attracted to each other in all examples (i. e., they always respond Yes to the question "Want contact?"). The paths of activation in the flow charts are indicated by shadows from objects in the figures, beginning with the "start" action. The computer of the HIV negative person (on the left in Figure 2) automatically sends a response to the question on HIV infection status to the computer of the HIV positive person. Simultaneously, the computer of the HIV positive person sends a response. However, this information is blocked because the match fails (on the right in Figure 2). Next, the HIV positive person's computer automatically sends a No in response to the question about wanting the contact; thus, the HIV positive person's desire for contact is not used at all. The HIV negative person's computer conditionally transmits a Yes in response to the question about wanting the contact. This is blocked, because the match fails (on the left), and contact is avoided. Attempts to initiate prohibited contacts always result in a finding that the other party is unavailable. Whether this results from restrictions on the transmission of an infection or from a lack of desire by the other party cannot be determined within the confines of the protocol. Thus, an infected person could freely search for a partner in a responsible manner, without risking the exposure of sensitive information. (We assume, in this text, that an HIV positive person runs no risk in contacts with other HIV positive persons. Medically, this may not be completely correct, because one person may be carrying a more virulent variant of the virus. This problem can be avoided by treating each variant separately.)

Figures 2 - 4 (PDF format)

When two HIV negative people are attracted to each other (Figure 3), the protocol transmits the fact of HIV negativity, just as in the previous case. In this case, however, there is an equal chance that either person transmits this information successfully (Table 1, last combination). But since the first "Match?" decision node is bypassed, the information is ignored anyway. The protocol results in arbitrary mutually-exclusive information transmission. That is, the direction of information transmission is determined by chance, and that transmission blocks any further information transmission in the stage-one matching procedure. In stage two, the mutual desire for contact is exchanged and contact is permitted.


When two HIV positive people are attracted to each other (Figure 4), the protocol results in the exchange of the HIV positive information in stage one. Recall that persons of equal health status are allowed to learn of each others' condition. In stage two, the mutual desire for contact is exchanged and contact is permitted.


Each of these cases could terminate with "avoid," because of a lack of mutual desire for contact. In the case of discordant HIV status, however, the reason for avoidance is never revealed. This is crucial for the protection of sensitive information. The HIV negative person knows only that negotiation failed (either the other person was HIV positive or did not desire contact). The HIV positive person knows that the other person is HIV negative, but does not know whether that person desired contact. Further uncertainty can be introduced by running multiple matches in stage one. A wide range of infectious agents could be probed for, before proceeding to stage two. Stage one could be further elaborated to include probes for genetic diseases, habits, preferences, affiliations, and so on. Such elaboration would improve information hiding.
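A multistage elaboration with several stage-one probes might be sketched as follows. The probe list and the dictionary layout are illustrative assumptions; the point is that a discordant answer on any probe blocks contact without revealing which probe failed.

```python
def multistage_negotiate(a, b):
    """Multistage sketch: a and b are dicts of Yes/No (True/False)
    answers. All stage-one conditions must match pairwise before the
    desire for contact is even considered."""
    probes = ["HIV", "HSV", "HBV"]        # illustrative stage-one probes
    for condition in probes:
        if a[condition] != b[condition]:  # discordant: one side's Yes
            return False                  # stays hidden; contact blocked
    # Stage two: mutual desire for contact.
    return a["wants"] and b["wants"]

alice = {"HIV": False, "HSV": True, "HBV": False, "wants": True}
bob   = {"HIV": False, "HSV": True, "HBV": False, "wants": True}
carol = {"HIV": False, "HSV": False, "HBV": False, "wants": True}
print(multistage_negotiate(alice, bob))    # True: concordant on all probes
print(multistage_negotiate(alice, carol))  # False: cause not revealed
```

With many probes, an "avoid" outcome admits many explanations, so even leakage of the stage-two desire for contact no longer pinpoints any single sensitive fact.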

Risk of Compromise

While information hiding is adequate within the framework of the simple two-stage protocol, there may be transmission of information outside of the protocol framework that results in poor protection of sensitive information. For instance, in a face-to-face meeting there may be clues concerning the desire for contact transmitted through eye contact, which could undermine the information hiding effect of the stage-two match. That is, these clues could reveal the true desire for contact directly, making an avoid decision attributable to HIV positive status. By elaborating the protocol to multiple stage-one matches, the hiding of the cause of the avoid decision is improved, so that even a failure to block information transmission in stage two does not result in unambiguous transmission of sensitive health-status information. In any practical situation, there will be multiple channels for information transmission. There may also be situations in which sensitive information is deducible, because of a small number of involved persons or a limited number of explanations for an avoid outcome. These practical limitations argue for elaborating the protocol to such a degree that there is always more than the formally required degree of information hiding. Such elaboration may in fact increase the practicality of the system in other ways, by, for example, permitting the management of a large number of epidemiological processes for only a minor additional investment.

While it is beyond the scope of this article to elaborate implementation details, various safeguards inherent in the technologies of digital data transmission and digital encryption can be mentioned. Digital transmission protocols yield effectively errorless transmission, even when channels are noisy and signals are weak. Error-correcting codes and retransmission upon the detection of errors eliminate the possibility of errors in the accepted messages. Typically, data is rejected outright unless it is completely error-free. In addition, the digital signature mechanisms mentioned ensure that message contents are free from error and tampering, as well as certifying the identity of the sender. These techniques have been used widely in high-risk applications, such as electronic funds transfer, for many years without demonstrable weaknesses. Therefore, the application discussed here, which has both lower security requirements and lower asset value (reducing the incentive to undermine system security), can be considered highly secure, within the limits discussed.

     Summary

Privacy Preserving Negotiation protocols illustrate that the total distribution of risk information is an option for contagion management. Highly sensitive information can be used for local decision making without reliance on confidentiality, that is, without the need for a trusted third party or central authority. This presumes secure and protected distribution of risk information to local databases from the health care system.

Privacy Preserving Negotiation limits the exchange of health information to those of equal health status. Thus, while sensitive information is effectively kept secret, the behavior of risk-conscious persons is indistinguishable from that occurring if such information were public. Since information security is very high with a personal-computer-based system, the chance of sensitive data being compromised is reduced. This encourages increased entry of information into the database, and therefore more information becomes publicly actionable.



