How the technical community fails at multi-stakeholderism

IGFWatch news

User: terminus
Date: 6/10/2013 8:11 pm
Views: 7091
Rating: 3

One of the standard arguments that the United States and other developed countries make in opposing changes to Internet governance is that the Internet is already well governed through a multi-stakeholder model by a network of grassroots Internet technical community organisations. These are said to include the IETF (Internet Engineering Taskforce), ICANN (the Internet Corporation for Assigned Names and Numbers), the RIRs (Regional Internet Registries) and the W3C (World Wide Web Consortium).

Yet when you look a little closer, none of these organisations actually represents grassroots Internet users, or is even multi-stakeholder by any usual definition of the term. Nor are they capable of legitimately dealing with broader public policy issues that go beyond purely technical standards development and the allocation of Internet resources. As a result, the process by which they reach such decisions is undemocratic, and some of the policy choices embodied in those decisions are unsupportable.

Unfortunately those organisations often don't seem to realise this, and will quite happily go about making policy heedless of their own limitations. An example is the failed process by which the W3C's tracking preference working group sought to develop a specification for a standard called "Do Not Track" or DNT. The concept behind this standard (which I've written about in detail elsewhere) was to specify how a website or advertiser should respond to a notification expressed by a user (typically through a browser setting) that they do not wish to be tracked online.

The W3C is not the only example of this sort of dysfunction. The IETF has (to its credit) acknowledged its own limited inclusiveness (its parent body the IAB has 11 white males on its board of 13), ICANN has recently received blistering criticism over its failure to pay attention to the community's wishes (while drawing in millions from the new global top-level domain goldrush), and soon-to-be-released research will unveil how decisions of the RIRs such as APNIC are similarly driven by shallow discussion from a narrow segment of stakeholders (even though this takes place on notionally open mailing lists).

The underlying problem is that the Internet community bodies have been captured by industry, and by a narrow segment of civil society that is beholden to industry (exemplified by the global Internet Society, ISOC). As a result Internet technical standards are biased in favour of a US-led, free market-directed model of competition, which fails to incorporate broader public interest objectives (this has even been formalised in the OpenStand Declaration). Standards development that involves issues such as consumer privacy and access to knowledge is a political process, and as such, capture by powerful interests becomes inevitable unless safeguards are set in place.

The industry-led specifications that have resulted from this paradigm speak for themselves. In July this year, industry released a standard for mobile apps to notify users of data collection using short-form notices, rather than lengthy privacy policies. This voluntary standard, although based on a supposedly multi-stakeholder process set up by the US National Telecommunications and Information Administration (NTIA), has been criticised by American consumer groups both for its substance and for the process by which it was developed, which allowed an industry-dominated panel to push through a code that served their commercial interests.

Another example is the United States' Copyright Alert System (CAS), by which Internet users' privacy is sacrificed to facilitate the delivery of copyright infringement notices to those who share content online – the system does not take account of "fair use" or other copyright user rights. This follows on from the 2007 Principles for User Generated Content Services, also written by industry, that were adopted by most major content platforms, and from codes agreed by major credit card companies and payment processors in June 2011, and by advertisers in May 2012, to withdraw payment services from websites allegedly selling counterfeit and pirated goods. No consumer representatives (or even elected governments) had any say in the development of these codes. How is this a "multi-stakeholder" model?

True multi-stakeholder processes (as defined at the 2002 Earth Summit, long before the Internet technical organisations appropriated the term) are:

processes which aim to bring together all major stakeholders in a new form of communication, decision-finding (and possibly decision-making) on a particular issue. They are also based on recognition of the importance of achieving equity and accountability in communication between stakeholders, involving equitable representation of three or more stakeholder groups and their views. They are based on democratic principles of transparency and participation, and aim to develop partnerships and strengthened networks between stakeholders.

Although often described (for example by the United States government, and bodies like ISOC that follow US foreign policy) as "the" multi-stakeholder model of Internet governance, the Internet technical community organisations actually don't tend to embody these principles very well. Although they are typically open to participants from stakeholder groups, no attempt is made to balance their participation so that the voices of weaker stakeholders (such as consumers) are not drowned out by those with the most resources or privilege. Having open mailing lists is not enough, and indeed can mask abuses of the process – after all, it has been revealed that the NSA used IETF processes with the aim of weakening encryption standards.

Continue reading in "Web Consortium's failures shows the limits of self-regulation" at Digital News Asia.

Re: How the technical community fails at multi-stakeholderism
User: jcurranarin
Date: 8/10/2013 11:02 am
Views: 4970
Rating: 3
It is true that industry self-regulation does not automatically result in inclusiveness, and even co-regulation processes do not necessarily ensure adequate consideration of public policy issues, particularly with respect to issues that disproportionately affect weaker stakeholders.

However, this is not due to any lack of ability to participate (to the contrary, the referenced organizations have done a solid job of providing mechanisms which are open, participatory, and transparent), but is far more likely the normal result of the technical focus of these organizations, as opposed to any failure to consider the input received.

I can state that in the case of ARIN (the Regional Internet Registry [RIR] serving Canada, the US, and a portion of the Caribbean), we work very hard to make sure that openness is maintained and that all views can be heard during policy development. The problem, as I see it, is that even the most well-articulated public policy concern may not hold much sway without an existing public policy basis (e.g. high-level guidelines or principles that have been through the traditional public policy development processes found in law and regulation). With respect to ARIN's address policy development, applicable public policy guidance (for example, Canada's Personal Information Protection and Electronic Documents Act) has always been given due consideration in the deliberations.

The challenge (for all of us) is in finding models which provide for the full and open consideration of public policy issues, with all stakeholders able to be heard, and yet also respect the interconnected and global nature of the Internet itself, which requires corresponding global technical standards and practices. Governments have been understandably cautious in their exploration of public policy issues with respect to the Internet, and yet it will take their engagement, in addition to that of civil society and the technical community, to establish balanced and globally applicable public policy norms.

The present open and transparent multi-stakeholder model used in Internet standards and policy development does function extremely well within the limited scope of technical matters, but indeed, it may need to be complemented or evolved with new mechanisms in consideration of the increasingly large role of the Internet in economic and social development globally. This is likely to be one of the dominant discussion themes at the IGF in Bali, and it is a challenge worthy of everyone's consideration.

Thank you for raising this interesting topic,

John Curran
President and CEO
© 2017 Jeremy Malcolm and contributors. Licensed under a Creative Commons Attribution-ShareAlike Licence.