2016-08-12

IP & Privacy

Exploring Privacy as Commons

Katherine Strandburg & Brett Frischmann

Knowledge production/privacy as highly related, not orthogonal/opposed.  Knowledge production framework as a way of doing descriptive empirical case studies of how privacy works in context, which can aid policy design. Appropriate info flows take place in complex and variable forms, and understanding the variations is important.  Knowledge commons framework also lines up w/Helen Nissenbaum’s work on contextual integrity in the privacy realm. Norms and info transmission principles can be supplemented w/a broader conception of governance.

Privacy is community management that applies to resources and involves a group/community but doesn’t denote the resources, community, place, or thing: privacy is the institutional arrangement of these elements.

Meeting under Chatham House rules: identity or affiliation of speakers/participants can’t be revealed, but participants are free to use the info received. Is this privacy or knowledge commons? It is both: encourages candor, openness, sharing of ideas. Once adopted, the rule governs the resources/knowledge produced and behavior. Reflects and shapes norms for participants; reinforces boundary b/t community members and nonmembers. It’s a good example of privacy/commons governance.  Norms of behavior at IPSC can also be described in the same way.  [E.g., I blog about talks but not about hallway conversations, I think he means.]

Studies of different research consortia for rare diseases, which are all about knowledge production: in both, there are IP issues on the fringes, but one really important issue that drives production is privacy w/r/t patient data. How do you get patients to participate?  What will happen w/clinical trials?

The basic characteristic distinguishing privacy from nonprivacy is institutionalized sharing of resources among members of a community: both jarring and useful.  We are accustomed to think of privacy as nonsharing, but privacy is often social; it always connotes boundaries b/t sharing and nonsharing.  Doesn’t work at n=1, maybe not w/physical resources; sidelines normative debate and values; takes a long time and needs a dedicated research community. Benefits: learn more about variance, nuance, obstacles/dilemmas, institutions; explore intersections w/knowledge commons, as w/big data; learn what people really care about and why; improve institutional design.

Q: seems like a lot of work is done at different levels of generality. Drug cos. are willing to claim protection for privacy as their justification for not sharing information.

A: the studies do provide the necessary details. Boundary crossing: sharing research w/community at large v. within the pharma co. You can get at boundary management by studying a variety of pharma patient communities: rare disease community is different than big pharma. In one case, pharma reps were part of the disease research community. We unpack what privacy means only if we study them systematically, asking the same set of questions to a bunch of different communities.

Q: sharing among corporations involves very different environments, cultures, etc. than sharing among friends—privacy as trust.  How do you translate an idea about privacy that’s inherently about individuals to a larger corporate environment?

A: look at the ends they set for themselves and how their practices interact w/ that.  Maybe withholding data benefits the internal community; our proposal doesn’t judge that or assume that it has social benefit.

Q: can anything be excluded from the definition of an institutional arrangement you offer? E.g., family, freedom.

A: not sure!

Silbey: Privacy is generally considered an individual right against the gov’t in constitutional law; we don’t study institutional mechanisms enough in law to figure out how individual rights are translated into a system.

A: he thinks of privacy as a means; ends are for society to determine.

Trickle Down Privacy

Ari Waldman

How we operationalize privacy in institutions.  Individual expectations of trust form contexts of privacy.  Bamberger/Mulligan’s work in 2010, 2015 about operationalizing privacy on the ground.  We can write all the laws we want, but what happens in corporations as they write policies or create products that suck in data or manipulate us into sharing information?  B/M showed: corporations began to take privacy more seriously, even though the law didn’t change much in 20 years; it’s still Swiss cheese-like. What changed: development of a robust privacy professional sphere, who understood that privacy was about trust.  Role of FTC in developing a common law of privacy and data breach notification statutes also mattered, as well as tech changes where new products primarily implicated privacy. If it’s true that over 20 years companies have developed a more robust conception of privacy, why do we still have all these problems? Why are privacy notices still so terrible, unread, unhelpful?  Why are some companies more nimble w/privacy issues than others? Why do platforms get built specifically to manipulate people into sharing data they might otherwise not share?  Do practices start at the top? What about in-house lawyers, and the people creating the tech/designing the products? Do they share the robust conception of privacy at the CPO level? And what’s the role of the user?  This matters to help companies that do care to structure their operations to take care of privacy, and for purposes of legal reform. FTC settlements just say “create comprehensive privacy program,” which generally means hiring a CPO, but if that doesn’t matter we should know.

Research design: interviews w/lawyers, programmers, engineers, members of privacy teams, project managers/tech leads.  Observation of product design process for an app that involves lots of user data.  Qualitative w/quantitative aspects.

Hypothesis: robust privacy won’t trickle down from CPO w/o active tech person lower down who shares that vision.  Tech people aren’t trained like lawyers or ethicists, but in efficiency/gathering data. May think about privacy in terms of notice, or user’s response.

Privacy leads, even at middle management, tend to think about privacy as more than notice—also as user trust—even if they don’t have a complete concept of what privacy is. Robust practices and guidelines exist in all but the newest startups. Lawyers think of privacy as notice pure and simple. They write privacy policies as legal documents; they don’t care about the impact on users’ decisions to share.  Their goal is to cover everything—cautious.  Technologists use the same words as robust privacy pros, but they fundamentally think about privacy as notice. Privacy becomes creating a product that’s fun and takes in data.  Privacy norms trickling down: the only time he’s seen it happen is when the technologist designing the product isn’t just given a mandate to “take privacy seriously” but also shares the robust vision of privacy/trust.  May have something to do with education/training.  An engineering manager/product designer who feels the same way may also be able to produce the privacy trickle-down.

Cyberlaw & Intermediary Liability

DMCA+ Enforcement in the New gTLDs

Annemarie Bridy

Rise of DMCA plus enforcement.  Two categories: Type 1 DMCA intermediaries are covered by the DMCA but have privately agreed to do more: graduated response; link demotion for search engines; proactive content blocking (Content ID etc.). Type 2 intermediaries are beyond the reach of secondary liability but have privately agreed to do more—payment networks, ad networks—via notice and termination or blocking regimes.  Domain name registrars: pressure on ICANN and related entities to engage more actively.

Characteristics of DMCA plus: nominally voluntary but implemented under gov’t pressure: members of Congress, IPEC, USTR.  Privately negotiated w/o input from public or public interest groups.  Terms generally disclosed only partially, w/resistance.  Enforcement lacks transparency re: nature/volume of sanctions. Lack of procedural safeguards for accused infringers. Notable exception: Copyright Alert system, which was more transparent in substance and operation than other agreements.

Enforceable against users via provisions in intermediaries’ TOS that prohibit illegal activity/abuse and reserve right to terminate service at their sole discretion.

For TM, the ACPA and UDRP have existed since before 2000.  The Domain Name System is a logical target b/c domain names often incorporate word marks.  Rarely requires assessment of the underlying content of the website—a critical difference from © enforcement.

Enforcing © through the DNS is more recent; © owners like it b/c it enables cross-border enforcement. First major development: the PRO-IP Act of 2008, which became the basis for seizures of hundreds of domain names, for everything from © infringement to counterfeit pharmaceuticals. SOPA would have provided for court-ordered site-blocking. Courts have been asked to grant, and have been granting, site-blocking injunctions against US-based nonparty registrars and registry operators.  Private ordering: MPAA and Donuts, which operates hundreds of new gTLDs.

Rightsholders saw in the new gTLD process the opportunity to inject ©-related obligations into the agreements between ICANN and registries/registrars.  In 2014, USTR included a new issue focus on domain name registrars in its annual Special 301 review of notorious counterfeit markets. Called for © owners to get new procedures/policies.  Music/movie industries were most active in lobbying for new © enforcement, demanding increased commitments for © enforcement, especially for new gTLDs targeting music or digital content.  The 2013 version of the Registrar Accreditation Agreement contained new obligations for accepting notices of infringement.

ICANN Registry agreement now requires registries to include in contracts w/registrars a provision requiring registrars to include in contracts w/registrants an obligation to refrain from © infringement and a promise of suspension.  Registrar Accreditation Agreement requires registrars to have abuse contacts to receive reports, w/duty to investigate and respond appropriately to claims.  Thus Registrar is contractually bound both to registry and ICANN. Complainants can seek redress through ICANN’s contractual compliance process by completing a simple online form.

Donuts is registry operator for .movie, .wine, .computer, .education, .clothing and others. MPAA has announced another partnership and created a template for agreements w/registry operators.  Donuts thus requires adherence to ICANN policies and to its own acceptable use policies.  Permits the registry to delete, suspend, revoke, transfer or cancel the offending domain name.  Donuts agrees to treat MPAA notices expeditiously and w/a presumption of credibility, like Google’s trusted removal program for search.  Standard for a complaint: there has to be clear and pervasive © infringement before approaching the registry; the complaint first must go to the registrar of record and the hosting provider; it must state a DMCA-like good faith belief; it must be the result of human review.  Intended to limit the volume of notices under the program.

Normative concerns: presumption of guilt; target/sanctions affect entire domain, not URLs; no requirement of attempt to contact the registrant despite the requirement to look up WHOIS information. Lack of clarity about what’s clear and pervasive infringement; what’s careful human review. Lack of procedures for registrants to contest complaints/appeal sanctions; lack of transparency.

Goldman: great to do all this digging; glad it was you and not me.  [I’ve joined ICANN’s TM review group and I share this sentiment.]  Is this an unstoppable train? Is there a way to combat that, similar to §512(f) for wrongful takedowns? Is there any cause of action possible?  W/o §512(f), fox is in henhouse; what can the chickens do?  We need a better §512(f).

A: that’s a hole in the law, and it’s not clear what public law can do b/c users have consented to the terms of use. More productive way to go about this: try to get these agreements to look more like the Copyright Alert system. That had a right to appeal to a neutral third party, and these don’t.  Can’t get details of the Donuts agreement or the Radix agreement, though she did get the template for the trusted notifier agreement.

Justin Hughes discussion: Someone registers Harrypotter.education, and MPAA detects a bunch of streaming going on. They’re under no obligation to contact the registrant?  Yes. They’re under an obligation to contact the registrar, then Donuts.  Then the registry is under an obligation to assess clear & pervasive © infringement identified through human review—it’s a bit of a black box.  If Donuts finds such infringement, it is obligated to cut off the registrant no matter what the registrar has found.

A: Donuts has said that there have been 6 complaints filed under trusted notifier system; 3 domain names blocked.  This was their evidence that it’s working, but no info is available, for example about what a user sees when a site has been blocked or locked.

RT: ICANN could require disclosure/transparency in its agreements. There is something we as a community can do: join ICANN’s working groups on these issues. I’ve done it for TM and it is not fun, but it is necessary work and right now they are not hearing from the policy/academic community, only from people with stark economic interests.  Show up!  Voice matters at ICANN.

IP, the Constitution and the Courts

A Free Speech Right to Trademark Protection?

Lisa Ramsey

International issues: the US and other countries are members of the Paris Convention, w/obligations to allow certain registrations.  The Convention says that nations may deny registration/invalidate registrations for marks contrary to morality or public order. WTO members agreed in TRIPS to keep that the same.  International conventions on human rights allow restrictions on freedom of expression if necessary to protect public order and morals or the rights and reputations of others, or to prevent incitement to violence.

Consider, not just in the US but as a template for evaluating free expression issues: 1. Gov’t action. Who is regulating the expression?  If FB deletes your post, there’s no state action.  If it’s a misleading ad taken down by the FTC, that’s gov’t action, though a permissible one.  In Tam, the gov’t action is a law barring registration of disparaging TMs. [But arguably that’s gov’t inaction—see my question below.]

2. Suppression, punishment, or other harm to expression. Consider how the regulation actually harms expression. Unconstitutional conditions doctrine: big debate.  Ramsey’s position is that unconstitutional conditions shouldn’t apply where the benefit being denied is the right to suppress the free speech of others.

3. What’s being regulated? TMs are expression, even though you sometimes see people deny it.

4. Whose expression is being regulated?  Tam is not about gov’t speech—TM registration is individual speech.  Could also consider whether corporations have free speech rights, though they do in the US.

5. Are there categorical exclusions for this type of expression?  Misleading commercial speech, incitement to violence. But scandalous/disparaging marks aren’t categorically excluded.

6. Whether the regulation of expression fails constitutional scrutiny—level of constitutional scrutiny depends on local doctrine.  Is it content- or viewpoint-based?  Does it cover commercial or noncommercial speech? Requires evaluation of law’s purpose, fit between law and purpose, amount of harm to expression.

Upholding options: SCt might use the unconstitutional conditions doctrine to say that §2(a) is constitutional.  Could say it satisfies constitutional scrutiny, though unlikely to say it satisfies strict scrutiny.  Or it could go the © route, treating © differently from other kinds of speech regulation as long as Congress doesn’t alter the “traditional contours.”  Offensive TM laws seem pretty traditional; but this approach might be a problem for dilution.

Linford: Ginsburg isn’t going to want to go near “traditional contours”—Golan signals that traditional contours means only 2 things.  What about doing what Harper & Row did: claim that there’s no conflict, and so hands off?

Q: What satisfies the state action requirement?  A: Enforcement of TM, including injunctive relief.

RT: my question was similar—Linford says “hands off” but what does that mean?   “In Tam, the gov’t action is a law barring registration of disparaging TMs” but that’s gov’t inaction.

A: When the examiner denies your application that’s gov’t action.

RT: but in that case if I go to court and say “FB suspended my account for using a non-real name and you should bar that part of the TOU b/c it violates my free speech rights” then you also get state action in enforcing FB’s contractual terms.

A: true.  Reminder that TM registration allows lots of suppression of speech—can interfere w/T-shirts, merchandising, claim dilution, etc.

Charles Duan: disparaging marks have particularly strong expression values—people use them to express feelings.  Preventing others from using those terms may thus be worse than ordinary suppression through TM.

A: yes, one of the dissents does a really good job—this differs from ordinary unconstitutional conditions cases, where the benefit sought was not the right to suppress others’ speech. Still, it’s troublesome to have individual examiners deciding what’s disparaging.  Internationally, nations can decide (Afghanistan bars marks that are harmful to chastity).

Pam Samuelson: Different nations have different ideas of scandalousness, public order, disparagement. Are you thinking we need harmonization?

A: the opposite. We need to allow nations to make their own decisions.  I worry that after Tam, people will go to other countries and demand registration of these marks. Some people will only register marks that they can register in multiple countries, so there's a chilling effect no matter what.
