Research Article
24 July 2019

Nothing to hide, but something to lose

Publication: University of Toronto Law Journal
Volume 70, Number 1

Abstract

‘I have nothing to hide’ is among the most common and controversial arguments against privacy. This article shows why the argument is mistaken on its own terms. To do so, it constructs a model combining the standard economic argument – that only people with ‘something to hide’ will value privacy – with a concept of intrinsic privacy preferences and shows that the inclusion of this dimension causes the standard argument to fail. It then applies these insights to two legal contexts in which there are active policy debates: the protection of genetic information in the context of employer-provided health insurance and tax privacy.

References

1.
Daniel Solove, Nothing to Hide: The False Tradeoff between Privacy and Security (New Haven, CT: Yale University Press, 2011) at 1 [Solove, Nothing to Hide]: ‘If you’ve got nothing to hide, you shouldn’t worry about government surveillance.’
2.
Solove, Nothing to Hide, supra note 1 at 747.
3.
See Part iii.a.
4.
Ignacio Cofone & Adriana Robertson, ‘Privacy Harms’ (2018) 69 Hastings LJ 1039 [Cofone & Robertson, ‘Privacy Harms’].
5.
Richard A Posner, ‘The Economics of Privacy’ (1981) 71 American Economic Rev 405 [Posner, ‘Economics’]; George Stigler, ‘An Introduction to Privacy in Economics and Politics’ (1980) 9 J Leg Stud 623.
7.
Bruce Schneier, ‘The Eternal Value of Privacy’ (18 May 2006), online: Schneier on Security <https://www.schneier.com/essays/archives/2006/05/the_eternal_value_of.html>; see also Elizabeth Stoycheff, ‘Mass Surveillance Chills Online Speech Even When People Have “Nothing to Hide,”’ Slate (3 May 2016), online: <https://slate.com/technology/2016/05/mass-surveillance-chills-online-speech-even-when-people-have-nothing-to-hide.html>; Geoffrey Stone, ‘Commentary, Freedom and Public Responsibility,’ Chicago Tribune (21 May 2006) at 11, online: <https://www.chicagotribune.com/news/ct-xpm-2006-05-21-0605210386-story.html>.
8.
Ramzi Kassem & Diala Shamas, ‘Rebellious Lawyering in the Security State’ (2017) 23 Clinical L Rev 671 at 688.
9.
Solove, Nothing to Hide, supra note 1 at 747; see also Daniel Solove, ‘Why Privacy Matters Even if You Have “Nothing to Hide”’ The Chronicle of Higher Education (15 May 2011) [Solove, ‘Why Privacy’].
10.
Moxie Marlinspike, ‘Why “I Have Nothing to Hide” Is the Wrong Way to Think about Surveillance,’ Wired (13 June 2013), online: <https://www.wired.com/2013/06/why-i-have-nothing-to-hide-is-the-wrong-way-to-think-about-surveillance/>; see also Colin J Bennett, The Privacy Advocates: Resisting the Spread of Surveillance (Cambridge, MA: MIT Press, 2010) at 97–8.
11.
Anthony Ha, ‘Edward Snowden’s Privacy Tips: “Get Rid Of Dropbox,” Avoid Facebook and Google’ (11 October 2014), online: TechCrunch <https://techcrunch.com/2014/10/11/edward-snowden-new-yorker-festival/>.
12.
Daniel Solove, ‘The Nothing-to-Hide Argument: My Essay’s 10th Anniversary’ (23 February 2017), online (blog): Privacy+Security <https://teachprivacy.com/the-nothing-to-hide-argument-my-essays-10th-anniversary/>.
13.
Jeffrey Rosen, The Naked Crowd (New York: Random House, 2004) at 35–6 [Rosen, Naked Crowd].
14.
See e.g. Daniel Culpan, ‘“UK Surveillance Is Worse Than 1984” Says UN Privacy Chief,’ Wired (25 August 2015), online: <https://www.wired.co.uk/article/uk-digital-surveillance-joseph-cannataci>; Lorenzo Franceschi-Bicchierai, ‘Surveillance Cameras Turning UK into Big Brother State,’ Mashable (5 October 2012), online: <https://mashable.com/2012/10/05/uk-is-turning-into-big-brother-working-title/>; Alastair Jamieson, ‘Britain’s Surveillance Society “beyond Orwell’s Worst Fears”, Warns Michael Mansfield,’ The Telegraph (2 September 2009), online: <https://www.telegraph.co.uk/news/uknews/law-and-order/6125068/Britains-surveillance-society-beyond-Orwells-worst-fears-warns-Michael-Mansfield.html>. According to one estimate, as of the early 2000s, there were 4.2 million surveillance cameras in Britain. Rosen, Naked Crowd, supra note 13 at 36.
15.
See also Peck v United Kingdom (2003) ECHR 44, 36 EHRR 41.
16.
‘BellSouth Denies Giving Records to NSA’ (15 May 2006), online: CNN.com <http://www.cnn.com/2006/POLITICS/05/15/bellsouth.nsa/>.
17.
American Civil Liberties Union, Seeking Truth from Justice: Patriot Propaganda: The Justice Department’s Campaign to Mislead the Public about the USA PATRIOT Act (2003) at 7, online (pdf): ACLU <https://www.aclu.org/report/seeking-truth-justice-patriot-propaganda-justice-departments-campaign-mislead-public-about>; USA PATRIOT Act of 2001, Pub L No 107-56, 115 Stat 272.
18.
Warrantless Surveillance and the Foreign Intelligence Surveillance Act: The Role of Checks and Balances in Protecting Americans’ Privacy Rights (Part I): Hearing before the H Comm on the Judiciary, 110th Cong, 1st Sess (2007) at 59–60. Foreign Intelligence Surveillance Act, 50 USC § 1801–1885c (1978).
19.
Kenneth Einar Himma, ‘Why Privacy and Accountability Trump Security’ in Adam D Moore, ed, Privacy, Security and Accountability: Ethics, Law and Policy (London: Rowman & Littlefield International, 2015) 171 at 174–5.
20.
US, Investigation of Communist Activities in the State of Michigan, Part 6: Hearing before the H Comm on Un-American Activities, 83rd Cong, 2nd Sess (1954) at 5357.
21.
US, Disclosure of IRS Information to Assist with the Enforcement of Criminal Law: Hearing before the Subcomm on Oversight of the Internal Revenue Service of the Sen Comm on Finance, 97th Cong, 1st Sess (1981) at 119 (statement of Senator Nunn); see also US, Computer Matching: Taxpayer Records: Hearing before the Subcomm on Oversight of Government Management of the Sen Comm on Governmental Affairs, 98th Cong, 2nd Sess (1984) at 16–17, 94 (statement of Roscoe L Egger, Jr, Commissioner, Internal Revenue Service); US, Caller-ID Technology: Hearing before the Subcomm on Technology and the Law of the Sen Comm on the Judiciary, 101st Cong (1990); see also US, US Customs Service Passenger Inspection Operations: Hearing before the Subcomm on Oversight of the H Comm on Ways and Means, 106th Cong, 1st Sess (1999) at 33–5.
22.
Karen EC Levy, ‘The Contexts of Control: Information, Power, and Truck-Driving Work’ (2015) 31 Information Society 160; Karen EC Levy & Michael Franklin, ‘Driving Regulation: Using Topic Models to Examine Political Contention in the U.S. Trucking Industry’ (2014) 32 Social Science Computer Rev 182.
23.
Jeremias Prassl, Humans as a Service: The Promise and Perils of Work in the Gig Economy (Oxford: Oxford University Press, 2018) at 54–6, 99, 113, 178.
24.
Ifeoma Ajunwa, Kate Crawford & Jason Schultz, ‘Limitless Worker Surveillance’ (2017) 105 Cal L Rev 735; Alex Rosenblat, Tamara Kneese & danah boyd, ‘Workplace Surveillance’ (2014) Open Society Foundation’s Future of Work Commissioned Research Papers, online: Data & Society <https://datasociety.net/pubs/fow/WorkplaceSurveillance.pdf>; Kirstie Ball, ‘Workplace Surveillance: An Overview’ (2010) 51:1 Labor History 87, online: <https://www.tandfonline.com/doi/abs/10.1080/00236561003654776>; Michel Anteby & Curtis K Chan, ‘A Self-Fulfilling Cycle of Coercive Surveillance: Workers’ Invisibility Practices and Managerial Justification’ (2018) 29 Organization Science 247.
25.
Sam Levin, ‘Walmart Patents Tech That Would Allow It to Eavesdrop on Cashiers,’ The Guardian (12 July 2018), online: <https://www.theguardian.com/business/2018/jul/12/walmart-surveillance-sound-sensors-employees>; Jena McGregor, ‘What Walmart’s Patent for Audio Surveillance Could Mean for Its Workers,’ Washington Post (12 July 2018), online: <https://www.washingtonpost.com/business/2018/07/12/what-walmarts-patent-audio-surveillance-could-mean-its-workers/>.
26.
Olivia Solon, ‘Big Brother Isn’t Just Watching: Workplace Surveillance Can Track Your Every Move,’ The Guardian (6 November 2017), online: <https://www.theguardian.com/world/2017/nov/06/workplace-surveillance-big-brother-technology>. This rule is also present abroad. See e.g. Barbulescu v Romania [GC], No 61496/08 (5 September 2017).
27.
These exceptions mainly relate not only to monitoring but also to the recording of conversations, which is sometimes protected by provincial law.
28.
See Part iv.
29.
See e.g. Natasha Singer, ‘Sharing Data, but Not Happily,’ New York Times (4 June 2015), online: <https://www.nytimes.com/2015/06/05/technology/consumers-conflicted-over-data-mining-policies-report-finds.html>; Julia Angwin & Surya Mattu, ‘Facebook Doesn’t Tell Users Everything It Really Knows …’ ProPublica (27 December 2016), online: <https://www.propublica.org/article/facebook-doesnt-tell-users-everything-it-really-knows-about-them>; Siva Vaidhyanathan, ‘Don’t Delete Facebook. Do Something About It,’ New York Times (24 March 2018); Dylan Curran, ‘Are You Ready? This Is All the Data Facebook and Google Have on You,’ The Guardian (30 March 2018), online: <https://www.theguardian.com/commentisfree/2018/mar/28/all-the-data-facebook-google-has-on-you-privacy>.
30.
‘Google’s Schmidt Roasted for Privacy Comments,’ PCWorld (11 December 2009) online: <https://www.pcworld.com/article/184446/googles_schmidt_roasted_for_privacy_comments.html>; ‘Google CEO on Privacy (VIDEO): “If You Have Something You Don’t Want Anyone To Know, Maybe You Shouldn’t Be Doing It,”’ Huffington Post (18 March 2010); Richard Esguerra, ‘Google CEO Eric Schmidt Dismisses the Importance of Privacy’ (10 December 2009), online (blog): EFF Deeplinks <https://www.eff.org/deeplinks/2009/12/google-ceo-eric-schmidt-dismisses-privacy>; Ryan Tate, ‘Google CEO: Secrets Are for Filthy People’ (9 April 2012), online (blog): Gawker <https://gawker.com/5419271/google-ceo-secrets-are-for-filthy-people>.
31.
Consultation générale sur la protection de la vie privée eu égard aux renseignements personnels détenus dans le secteur privé: Meeting of the Nat’l Assembly Comm Inst, 34th Leg, 1st Sess (7 Nov 1991) at 61 (declarations of Jean-Claude Chartrand and Michel Globensky).
32.
Solove, Nothing to Hide, supra note 1 at 751–3.
33.
Richard A Posner, The Economics of Justice (Cambridge, MA: Harvard University Press, 1983) at 271.
34.
Richard A Posner, Economic Analysis of Law, 5th ed (New York: Aspen Publishers, 1998) at 46.
35.
Solove, ‘Why Privacy,’ supra note 9 at 27; Solove, Nothing to Hide, supra note 1.
36.
Richard A Posner, ‘The Right of Privacy’ (1978) 12 Ga L Rev 393; Richard A Posner, ‘An Economic Theory of Privacy’ (1978) 2 Regulation: AEI J Government & Society 19; Posner, ‘Economics,’ supra note 5; Richard Posner, ‘Privacy,’ in Peter Newman, ed, The New Palgrave Dictionary of Economics and the Law (London: Palgrave Macmillan UK, 1998) 103.
37.
See e.g. Kenneth Einar Himma, ‘Why Security Trumps Privacy’ in Adam D Moore, ed, Privacy, Security and Accountability: Ethics, Law and Policy (London: Rowman & Littlefield International, 2015) 145 at 152–5.
38.
Formally, θ ~ U[0,1].
39.
Wages are restricted to being non-negative.
40.
Unlike Spence’s canonical signalling game, here the cost of signalling is incurred after the wage contract is offered to the potential employees. This is more realistic for the case at hand. See generally Michael Spence, ‘Job Market Signaling’ (1973) 87 QJ Economics 355.
41.
The results are unchanged if an indifferent individual instead randomizes between the two firms.
42.
A Nash equilibrium is a situation in which the strategy of each player is a best response to the strategies of the other players and, therefore, no player has an incentive to deviate.
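The definition can be illustrated with a minimal sketch (my own, not the article’s): checking that mutual defection in a standard prisoner’s dilemma is a Nash equilibrium, because neither player can gain by deviating unilaterally. The payoff numbers are the conventional textbook ones.

```python
# Payoff matrices for a standard prisoner's dilemma.
# Action 0 = cooperate, 1 = defect; payoffs[player][a0][a1].
payoffs = [
    [[3, 0], [5, 1]],  # row player's payoffs
    [[3, 5], [0, 1]],  # column player's payoffs
]

def is_nash(a0, a1):
    """True if (a0, a1) is a Nash equilibrium: no unilateral deviation pays."""
    row_ok = all(payoffs[0][a0][a1] >= payoffs[0][d][a1] for d in (0, 1))
    col_ok = all(payoffs[1][a0][a1] >= payoffs[1][a0][d] for d in (0, 1))
    return row_ok and col_ok

# (defect, defect) is the unique Nash equilibrium; mutual cooperation is not
# an equilibrium because each player would gain by deviating to defect.
```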
43.
Charles M Kahn, James McAndrews & William Roberds, ‘Money Is Privacy’ (2005) 46 International Economic Rev 377 [Kahn, McAndrews & Roberds, ‘Money’].
44.
Gabriele Camera, ‘Dirty Money’ (2001) 47 J Monetary Economics 377. Others have viewed this preference from a more benign perspective. For example, Kahn, McAndrews, and Roberds consider a model in which there is a risk of theft, and consumers prefer to be the victim of theft using cash transfers rather than credit transfers. Kahn, McAndrews & Roberds, ‘Money,’ supra note 43.
45.
Jin Kim & Liad Wagman, ‘Screening Incentives and Privacy Protection in Financial Markets: A Theoretical and Empirical Analysis’ (2015) 46 RAND J Economics 1.
46.
Cofone & Robertson, ‘Privacy Harms,’ supra note 4.
47.
See generally Solove, Nothing to Hide, supra note 1 at 764; Solove, ‘Why Privacy,’ supra note 9.
48.
Cofone & Robertson, ‘Privacy Harms,’ supra note 4 at 1096–8.
49.
Ibid at 1049–53.
55.
Time may be a useful analogy here. In general, people value their time and, other things being equal, would prefer to have more free time rather than less. That being said, however, most individuals agree to sell or trade their time for valuable goods and services. The most obvious example of this is through employment, but other examples of this type of trade include spending time looking for coupons or discounts, lining up for free samples or even taking surveys for the chance to win a prize. But see Katherine J Strandburg, ‘Free Fall: The Online Market’s Consumer Preference Disconnect’ 2013 U Chicago Legal F 95.
56.
Cofone & Robertson, ‘Privacy Harms,’ supra note 4 at 1053–5; see also Ryan Calo, ‘The Boundaries of Privacy Harm’ (2011) 86 Ind LJ 1131. Note that, because privacy loss is modelled as a continuum, we are able to capture Abby’s privacy preferences also along a continuum, thereby capturing the idea that losing a small amount of privacy is different than facing a large privacy loss.
57.
See Part ii.
58.
In such cases, individuals might actively share personal information, with the consequent privacy loss, simply because they suffer no harm from the loss of privacy and, in addition, the act of sharing brings them gains in intimacy (Nicole and her partner) or even material benefits (Nicole with Edison regarding information she does not mind disclosing, such as her university). In other words, they are able to selectively reveal personal information. For the purposes of this article, these should not be viewed as privacy gains (that is, as a utility gain based on more privacy) but, rather, as extrinsic (intimacy or material) gains that accrue as a consequence of a reduction in privacy.
59.
Cofone & Robertson, ‘Privacy Harms,’ supra note 4 at 1055, 1093.
60.
For example, this signal could be the result of a background check.
61.
By assumption, x contains new information about individual i – it is not completely duplicative of the signals already aggregated; otherwise, the employer would not need it.
62.
Cofone & Robertson, ‘Privacy Harms,’ supra note 4 at 1093–4.
63.
This is because the standard deviation of her employer’s posterior distribution will likely decline.
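Why the posterior standard deviation declines can be seen in a normal-normal Bayesian updating sketch (my own illustration, with hypothetical numbers): the posterior precision is the sum of the prior and signal precisions, so the posterior variance, and hence the standard deviation, is always smaller than the prior’s.

```python
# Normal-normal Bayesian update: the employer holds a normal prior over a
# worker's attribute and observes a noisy normal signal x. Posterior
# precision = prior precision + signal precision, so variance always shrinks.

def update(prior_mean, prior_var, x, noise_var):
    post_var = 1.0 / (1.0 / prior_var + 1.0 / noise_var)
    post_mean = post_var * (prior_mean / prior_var + x / noise_var)
    return post_mean, post_var

mean, var = 0.5, 0.04                      # hypothetical prior belief
mean, var = update(mean, var, x=0.7, noise_var=0.09)
assert var < 0.04                          # posterior variance declined
```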
64.
This assumes that the employer’s prior belief, the distribution of the signals, and the utility function of the individuals come together in a specific way.
65.
Formally, L ~ U[0,1].
66.
Formally, θ ⊥ L (θ and L are drawn independently).
67.
Under Bertrand competition, the two firms compete by setting prices simultaneously. For a discussion of Bertrand competition, see Andreu Mas-Colell et al, Microeconomic Theory (New York: Oxford University Press, 1995) at 388–90.
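The undercutting logic of Bertrand competition can be made concrete with a minimal sketch (my own illustration, not from the article or Mas-Colell et al): two firms with identical marginal cost sell a homogeneous good, the lower price captures the whole market, and iterated best responses drive price down to marginal cost, the unique equilibrium. All numbers are hypothetical.

```python
# Bertrand sketch: each firm's best response is to undercut its rival by one
# price tick as long as that remains profitable, and never to price below cost.

def best_response(rival_price, cost, tick=0.01):
    # Undercut the rival if profitable; otherwise price at marginal cost.
    return max(cost, round(rival_price - tick, 2))

c = 1.00                 # common marginal cost
p1, p2 = 2.00, 2.00      # arbitrary starting prices
for _ in range(300):     # iterated best responses
    p1 = best_response(p2, c)
    p2 = best_response(p1, c)

# Both prices converge to marginal cost, the unique Bertrand equilibrium.
```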
68.
Two other solutions (one of which is negative and the other of which is larger than one) are left aside because they are outside the domain of θ.
69.
The conclusion that the equilibrium with privacy weakly Pareto dominates the no-privacy equilibrium relies on the assumption that firms face no additional screening costs in the equilibrium with privacy. If those costs did exist, privacy would be unlikely to remain weakly Pareto improving; the precise effect would depend on how the costs were incorporated into the model, although with modest screening costs the equilibrium with privacy would likely remain welfare enhancing relative to the no-privacy equilibrium. Also note that, because this model is intended as a formalization of the standard argument against privacy, misallocation costs (for example, costs that could arise if the ‘wrong’ person is selected for a particular job) are beyond the scope of the analysis.
70.
More formally, the equilibrium with privacy weakly Pareto dominates the no-privacy equilibrium. First, observe that firms are indifferent between the two equilibria. This follows from the assumption of Bertrand competition. In the no-privacy equilibrium, firms pay all individuals their marginal product, θi. In the equilibrium with privacy, firms continue to pay individuals who reveal their marginal product and pay individuals who conceal the average marginal product of those who conceal. In both cases, profits are zero. Second, observe that individuals who choose to reveal in the equilibrium with privacy are also indifferent between the two equilibria: in both cases, they are paid θi and bear the privacy loss Li. Finally, observe that the individuals who choose to conceal are better off in the equilibrium with privacy. This follows from the definition of a Nash equilibrium; since revealing is always in an individual’s choice set, the fact that individuals who choose to conceal do so in equilibrium implies that these individuals are no worse off concealing than they are revealing.
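The equilibrium logic can also be checked numerically. The following is a simulation sketch (my own illustration, not the article’s formal proof), using the model’s stated distributional assumptions (θ, L ~ U[0,1], independent): firms pay revealers their marginal product θ, pay concealers the average θ among concealers, and an individual conceals iff θ − L falls below that pooled wage, solved here by fixed-point iteration.

```python
import random

# Simulate the reveal/conceal equilibrium: w is the pooled wage paid to
# concealers, equal to the average theta among those who conceal.
random.seed(0)
n = 100_000
theta = [random.random() for _ in range(n)]   # productivity, U[0,1]
loss = [random.random() for _ in range(n)]    # privacy loss L, U[0,1]

w = 0.5  # initial guess for the concealers' pooled wage
for _ in range(500):
    pooled = [t for t, l in zip(theta, loss) if t - l < w]
    w_new = sum(pooled) / len(pooled)
    if abs(w_new - w) < 1e-6:
        w = w_new
        break
    w = w_new

# In equilibrium, a concealer earns w and bears no privacy loss; deviating to
# reveal would pay theta - L < w, so no concealer gains by deviating. Some
# individuals (high theta, low L) still choose to reveal.
reveal_share = sum(1 for t, l in zip(theta, loss) if t - l >= w) / n
assert all(w >= t - l for t, l in zip(theta, loss) if t - l < w)
assert 0 < reveal_share < 1
```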
71.
At the same time, given the choices before them, some individuals in the model choose to reveal everything; only some end up concealing something.
72.
Jessica L Roberts, ‘Protecting Privacy to Prevent Discrimination’ (2014) 56 Wm & Mary L Rev 2097; Jessica L Roberts, ‘GINA’s Limits or Something More? The Need for Greater Protection of Employee Health-Related Information’ (2014) 14 American J Bioethics 45; Jessica L Roberts, ‘Preempting Discrimination: Lessons from the Genetic Information Nondiscrimination Act’ (2010) 63 Vand L Rev 437; Elizabeth A Brown, ‘The Fitbit Fault Line: Two Proposals to Protect Health and Fitness Data at Work’ (2016) 16 Yale J Health Policy, L & Ethics 1.
73.
Genetic Non-Discrimination Act, SC 2017, c 3 [GNA].
74.
Genetic Information Nondiscrimination Act, 2008, Pub L 110–233, 122 Stat 881 [GINA].
75.
GINA, supra note 74, Title I.
76.
GNA, supra note 73, ss 3–5; GINA, supra note 74, Title II.
77.
GINA, supra note 74 at paras 101(a)(2), 102(a)(2), 103(a)(2), 104(b), 201(4)(A)(i)–(iii).
78.
Ibid at para 202. GINA defines employer and employee according to Title VII. Ibid at paras 202(2)(A)(i), (2)(B)(i).
79.
A genetic test under GINA is ‘an analysis of human DNA, RNA, chromosomes, proteins, or metabolites that detects genotypes, mutations, or chromosomal changes.’ Ibid at paras 101(a)(2), 102(a)(2), 103(a)(2), 104(b), 201(7).
80.
Ibid at paras 101–6.
81.
The model assumed that individuals knew their θ. Even without having taken a genetic test, it is reasonable to assume that, on average, individuals know more about their genetic predisposition to certain diseases than their employers would (in the absence of genetic testing).
82.
In the context of health insurance, in particular, in addition to having a privacy preference over such information, an individual might be worried about its potential misuses. This second concern represents a distinct reason to avoid disclosing the information.
83.
In Re Government of Quebec Concerning the Constitutionality of the Genetic Non-Discrimination Act Enacted by Sections 1 to 7 of the Act to Prohibit and Prevent Genetic Discrimination, SC 2017, c 3, s 17. The court also emphasized the lack of criminal law object (ss 18–24).
84.
Ibid, ss 3, 4. The court based this conclusion on the fact that the identification of risk assessment factors and of the information that insurers can request falls under provincial, rather than federal, jurisdiction. In particular, the court held that the pith and substance of the statute was not criminal. Despite the title of the statute, the court found that its true purpose was not to prohibit genetic discrimination but, rather, to encourage the use of genetic testing by assuaging fears that such information could be used for discriminatory purposes. Ibid, ss 10–11.
85.
Ibid, s 5.
86.
Ibid, s 25.
87.
US, Bill HR 1313, Preserving Employee Wellness Programs Act, 115th Cong, 2017, HR 115-459, online: Congress.gov <https://www.congress.gov/115/crpt/hrpt459/CRPT-115hrpt459.pdf> [HR1313]. The bill was introduced in March 2017 in the House of Representatives by Virginia Foxx (R, 5th District, North Carolina). The bill was then discharged by the Ways and Means Committee and the Energy and Commerce Committee in December 2017. See Congress.gov <https://www.congress.gov/bill/115th-congress/house-bill/1313/all-actions>. The committees released it to the House for its consideration, and it was placed on the Union Calendar. Union Calendar No 341, online: Congress.gov <https://www.congress.gov/bill/115th-congress/house-bill/1313/all-actions>.
88.
Patient Protection and Affordable Care Act, Pub L No 111–148 (2010).
89.
Americans with Disabilities Act of 1990, Pub L 101–336 (1990).
90.
Cole Holderman, ‘H.R. 1313: New Bill to Threaten Genetic Nondiscrimination’ (13 March 2017), online (blog): Huntington’s Outreach Project for Education, at Stanford <https://hopes.stanford.edu/h-r-1313-new-bill-to-threaten-genetic-nondiscrimination/> [Holderman, ‘H.R. 1313’]. From the text of the proposed bill: ‘This bill exempts workplace wellness programs from: (1) limitations under the Americans with Disabilities Act of 1990 on medical examinations and inquiries of employees, (2) the prohibition on collecting genetic information in connection with issuing health insurance, and (3) limitations under the Genetic Information Nondiscrimination Act of 2008 on collecting the genetic information of employees or family members of employees. This exemption applies to workplace wellness programs that comply with limits on rewards for employees participating in the program.’ HR1313, supra note 87, s 3.
91.
In addition to the welfare considerations introduced by the model, one may have non-consequentialist normative reasons to believe that people should not be charged for health insurance differently based on their genetics, even if this were to raise health insurance costs at a societal level.
92.
Anna Bernasek, ‘The Debate Is Back: Should Tax Returns Be Public?’ New York Times (13 February 2010), online: <https://www.nytimes.com/2010/02/14/business/yourtaxes/14disclose.html>.
93.
For example, since 1980, every major party nominee but one has released at least one year’s worth of tax returns. Tom Kertscher, ‘Is Donald Trump the Only Major-Party Nominee in 40 Years Not to Release His Tax Returns?’ (2016), online: PolitiFact Wisconsin <https://www.politifact.com/wisconsin/statements/2016/sep/28/tammy-baldwin/donald-trump-only-major-party-nominee-40-years-not/>.
94.
Justin Wolfers, ‘What Can We Learn from Donald Trump’s Unreleased Tax Returns,’ New York Times (11 May 2016), online: <https://www.nytimes.com/2016/05/12/upshot/what-we-can-learn-from-donald-trumps-unreleased-tax-returns.html>.
96.
In the model, the psychic cost L and the characteristic of interest θ were independent. However, if there is a positive correlation between L and θ – so that people with desirable attributes also tend to have higher than average psychic costs from revealing – the results become even more extreme. At the limit, if θ and L move together perfectly, the two features completely offset each other, and no one will ever reveal. Certain types of financial information that may not be relevant in the context of the average person could thus become relevant in the context of a political candidate.
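This comparative static can be checked with a simulation sketch (my own illustration): correlate the privacy cost with the attribute via L = ρθ + (1 − ρ)u, with u ~ U[0,1] (the functional form is my assumption), and recompute the reveal/conceal equilibrium in which firms pay revealers θ and pay concealers the average θ among concealers. As ρ rises, the share of revealers falls; at ρ = 1, nobody reveals.

```python
import random

# Equilibrium reveal share as a function of the correlation parameter rho.
random.seed(1)
n = 50_000
theta = [random.random() for _ in range(n)]  # productivity, U[0,1]
u = [random.random() for _ in range(n)]      # independent noise, U[0,1]

def reveal_share(rho):
    loss = [rho * t + (1 - rho) * x for t, x in zip(theta, u)]
    w = 0.5  # concealers' pooled wage, solved by fixed-point iteration
    for _ in range(500):
        pooled = [t for t, l in zip(theta, loss) if t - l < w]
        w_new = sum(pooled) / len(pooled)
        if abs(w_new - w) < 1e-6:
            w = w_new
            break
        w = w_new
    return sum(1 for t, l in zip(theta, loss) if t - l >= w) / n

# Higher correlation between theta and L shrinks the set of revealers;
# at rho = 1 the two offset exactly and no one reveals.
shares = [reveal_share(r) for r in (0.0, 0.25, 1.0)]
assert shares[0] > shares[1] > shares[2] == 0.0
```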

Published In

University of Toronto Law Journal
Volume 70, Number 1, Winter 2020
Pages: 64–90

History

Published online: 24 July 2019
Published in print: Winter 2020

Keywords:

  1. game theory
  2. information privacy
  3. law and economics
  4. privacy
  5. signalling
  6. employment law
  7. surveillance

Authors

Affiliations

Ignacio N Cofone
Assistant Professor, Faculty of Law, McGill University, Montreal, Canada

Notes

I would like to thank Ian Ayres, Jack Balkin, Thomas Bonczek, Christine Jolls, Claudia Landeo, M Henry Linder, Katherine Strandburg, and an anonymous reviewer for their comments. I especially thank Adriana Robertson, who contributed to this piece enormously. This article also benefited from numerous comments and discussions with participants at the American Law and Economics Association Conference, the Canadian Law and Economics Association Conference, and an internal presentation at the Yale Law School Information Society Project. I also thank Fabian Bargout for his research assistance.
