
Election integrity and platform governance updates

I haven’t been doing a good job of updating this site, and a full refresh is in development, but I wanted to pin some recent work at the top. Below is an overview of recent work on the Canadian election monitoring project and on platform governance, along with a range of writing, talks and media on both.

The Digital Democracy Project

This is a joint initiative led by the Max Bell School of Public Policy at McGill University and the Public Policy Forum. We are currently conducting a large-scale monitoring project of the Canadian election.

The project has two components: an online monitoring effort, led by Derek Ruths at McGill, that is collecting and analyzing social and traditional media data; and a survey effort, led by Peter Loewen at the University of Toronto, that is running weekly national surveys and a metered online consumption study. Together we are seeking to track the online media ecosystem during the election and to assess behaviour change based on exposure to political and media narratives as well as disinformation. We published weekly reports throughout the election and will be producing an extensive post-election report. We have released seven reports to date, which can be found here:

I am also the Co-PI of the Digital Election Research Challenge, with Elizabeth Dubois, which is funding 18 research teams to study the online ecosystem during the Canadian election. Further details of this collaboration can be found here.

Finally, I am involved in the production of an election podcast called Attention Control. The show is hosted by Kevin Newman and produced by Antica. I do weekly segments on the show, our research is integrated into the podcast, and a detailed interview about the project can be found in the second half of this episode (starts at 21:28). Some reflections on the election can be found in our post-election episode here.

Recent Media

Recent Writing

  • Who will answer the Christchurch Call? Nobody, if tech platforms continue ungoverned, The Globe and Mail
  • Big Tech’s net loss: How governments can turn anger into action, The Globe and Mail
  • The era of Big Tech self-governance has come to an end, The Globe and Mail
  • We can save democracy from destructive digital threats, The Globe and Mail
  • The new rules for the internet – and why deleting Facebook isn’t enough, The Globe and Mail
  • Democracy Divided: Countering Disinformation and Hate in the Digital Public Sphere, PPF.
  • Ungoverned Space: How Surveillance Capitalism and AI Undermine Democracy, CIGI 

Recent Presentations


Statement to the International Grand Committee on Big Data, Privacy and Democracy

Last week I had the privilege of appearing before the International Grand Committee (representatives from 12 countries) alongside Maria Ressa, Shoshana Zuboff, Jim Balsillie, Heidi Tworek, Jason Kint, Ben Scott and Roger McNamee. After years of working on this wicked set of problems, it was a major milestone to see the attention of smart and focused lawmakers zero in on the structural problems at the core of this issue. They agree on the problem; now it’s time to act.

Here is the video and text of my statement.

Video: https://www.pscp.tv/HoCCommittees/1LyGBAbRLjzKN?t=39m36s

Co-Chairs Zimmer and Collins, Committee Members;

Thank you for having me, it is an honor to be here. I am particularly heartened because even three years ago a meeting like this would have been seen as unnecessary by many in the public, the media, the technology sector, and by governments themselves.

But we are now in a very different public policy moment, about which I will make five observations.

First, self-regulation (and most forms of co-regulation) has proven, and will continue to prove, insufficient to this problem. As in the lead-up to the 2008 financial crisis, the financial incentives are powerfully aligned against meaningful reform. These are publicly traded, largely unregulated companies, whose shareholders and directors expect growth by maximizing a revenue model that is itself part of the problem. This growth may or may not be aligned with the public interest.

Second, the problem is not one of bad actors but one of structure. Disinformation, hate speech, election interference, privacy breaches, mental health issues, and anti-competitive behavior must be treated as the symptoms of the problem, not its cause. Public policy should therefore focus on the design and the incentives of the platforms themselves.

It is the design of the attention economy which incentivizes virality and engagement over reliable information. It is the design of the financial model of surveillance capitalism which incentivizes data accumulation and its use to influence our behavior. It is the design of group messaging, which allows for harmful speech, even the incitement of violence, to spread without scrutiny. It is the design for global scale that has incentivized imperfect automated solutions to content filtering, moderation and fact checking. And it is the design of our unregulated digital economy that has allowed our public sphere to become monopolized.

If democratic governments determine that this structure is leading to negative social and economic outcomes, then it is their responsibility to govern.

Third, governments that are taking this issue seriously are converging on a similar platform-governance agenda. This agenda recognizes that there are no silver bullets, and that instead policies must be domestically implemented and internationally coordinated across three domains.

Content policies which seek to address a wide range of both supply and demand issues about the nature, amplification, and legality of content in our digital public sphere.

Data policies which ensure that public data is used for public good and that citizens have far greater rights over the use, mobility and monetization of their data.

And Competition policies which promote free and competitive markets in the digital economy.

Fourth, the propensity in the platform governance conversation to overcomplicate solutions serves the interests of the status quo. There are actually many sensible policies that could and should be implemented immediately.

The online ad microtargeting market must be made radically more transparent, and in some cases suspended entirely.

Data privacy regimes should be updated to provide far greater rights to individuals and greater oversight and regulatory power to punish abuses.

Tax policy can be modernized to better reflect the consumption of digital goods and to crack down on tax base erosion and profit shifting.

Modernized competition policy can be used to restrict and roll back acquisitions and to separate platform ownership from application or product development.

Civic media can be supported as a public good.

And large-scale and long-term civic literacy and critical thinking efforts can be funded at scale by national governments.

That few of these have been implemented is a problem of political will, not policy or technical complexity.

Finally, there are three policy questions for which there are neither easy solutions, meaningful consensus nor appropriate existing institutions, and where there may be irreconcilable tensions between the design of the platforms and the objectives of public policy.

The first is how we regulate harmful speech in the digital public sphere. At the moment, we have largely outsourced the application of national laws, as well as the interpretation of difficult tradeoffs between free speech and personal and public harms, to the platforms themselves – companies that seek solutions that can be implemented at scale globally. In this case, what is possible technically and financially might be insufficient for the public good.

The second is who is liable for content online? We have clearly moved beyond the notion of platform neutrality and absolute safe harbor, but what legal mechanisms are best suited to holding platforms, their design, and those that run them accountable?

Third, as artificial intelligence increasingly shapes the character and economy of  our digital public sphere, how will we bring these opaque systems into our laws, norms and regulations?

These difficult conversations should not be outsourced to the private sector; they need to be led by democratically accountable governments and their citizens. But this is going to require political will and policy leadership – precisely what this committee represents.

Thank you again for this opportunity.


Oped on Christchurch Call, IGC and Digital Charter

Here is a recent oped in the Globe and Mail in advance of my appearance before the International Grand Committee on Big Data, Privacy and Democracy:

Who will answer the Christchurch Call? Nobody, if tech platforms continue ungoverned

Speaking to a technology conference in Paris last week, Prime Minister Justin Trudeau – a leader who has long championed the political and economic benefits of digital technology – channelled our cultural moment of tech backlash.

“What we’re seeing now is a digital sphere that’s turned into the Wild West,” he argued. “And it’s because we – as governments and as industry leaders – haven’t made it a real priority.”

This change in tone came the day after he signed the Christchurch Call – an effort led by New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron to curb the problem of viral hate speech and violent content online in the wake of a massacre that was livestreamed and distributed on platforms such as Facebook and YouTube.

But while it was a helpful rallying call, the Christchurch compact was also ultimately a missed opportunity. It has no enforceable mandates, it focuses overwhelmingly on technical fixes to what are also political, social and economic problems, and its framing around terrorism and hate speech is far too narrow, treating the symptom of the problem while ignoring the underlying disease. We don’t need to militarize the problem or play Whac-A-Mole with extremists: We need to govern platforms. The Christchurch Call won’t accomplish that.

In its wake, this week, the International Grand Committee on Disinformation and Fake News, a group of parliamentarians from 14 countries, continues its work with a second set of hearings in Ottawa. The work of the committee (before which I will appear as a witness) has become a catalyst for a community of scholars, policy-makers and technologists who believe that a broader conversation about tech governance – one that squarely addresses problems embedded in the design of digital platforms themselves – is long overdue.

These problems include the financial model of what Harvard professor Shoshana Zuboff calls “surveillance capitalism,” by which vast stores of data about our lives are used to target content designed to change our behaviour. The problems of surveillance capitalism include the way platforms manage their vast scale using opaque, commercially driven and poorly understood algorithmic systems and the market dominance of a small number of global companies with rapidly growing purchase on our social, political and economic lives.

While governments have been slow to take on the challenge of governing big tech, those that have turned their attention to this policy space in a serious way are coming to markedly similar conclusions. In short, there are no silver bullets to the social and economic costs caused by the platform economy.

Instead, governments in France, Germany, Britain, the European Union, New Zealand, Australia, Canada and even a growing number of political leaders in the United States are articulating a need for a broad and comprehensive platform-governance agenda that is both nuanced to account for domestic differences in areas such as free-speech laws, and internationally coordinated to create sufficient market pressure.

The contours of this agenda are taking shape through three policy frameworks. The first: content policies. Democratic governments need to decide whether their speech laws require updating for the digital world and how they will be enforced. At the moment, we have delegated this regulatory role to the platforms, who hire thousands of moderators to enforce their terms of service agreements. Democratizing this system will involve difficult decisions around liability (who should be liable for speech online, the individual who spoke, or the company that amplified, and profited off, the speech?), moderation (who is responsible for implementation, the platforms who host and filter content, or the governments that are ultimately democratically accountable?) and transparency (how can we bring daylight to the secretive art of microtargeting, by which advertisers target and effectively influence narrow bands of people using extremely precise data?). Early experiments in content policy by Germany and France are yielding evidence of what works and what doesn’t, examples upon which other countries can iterate.

Second: data policies. If we believe in the premise that society should be able to leverage public data for the public good, citizens should have far greater rights over the use, mobility and monetization of their data, and regulation must be matched with meaningful enforcement. Even the reported FTC fine of $5-billion to Facebook was seen as inconsequential by the markets. The EU General Data Protection Regulation provided an example for such a package that is being tinkered with in other jurisdictions, including in California, where the California Consumer Privacy Act is poised to push Silicon Valley directly from its home state.

Third: competition policies. The EU and Britain have begun to explore new ways to curb the power of digital giants, and several of the U.S. Democratic presidential candidates have come out in favour of pursuing antitrust regulation. Such efforts could also include restrictions or rollbacks on how services and platforms are acquired and developed, as well as antitrust oversight that judges a company’s market power not only by price increases but also by how much data it controls, whether it constrains innovation and whether it threatens consumer welfare.

On Monday, Innovation Minister Navdeep Bains built on Mr. Trudeau’s speech, laying out the 10 principles of a proposed Digital Charter. It’s a signal that Ottawa might finally be ready to take a broader view of its responsibilities. But whether this charter can be more than a collection of digital initiatives and instead become a co-ordinated policy agenda, implemented with the urgency that the problem demands, remains to be seen.


End of year Globe and Mail oped

Some year end reflections and thoughts on the year to come for the Globe and Mail:

Big Tech’s net loss: How governments can turn anger into action

It has been a game-shifting 2018 for Big Tech. It was the year that long-simmering concerns about its potential negative effects on our economy, on our personal lives and even on our democracy broke into public debate.

It was the year that much of the media got serious about tech journalism, when the balance of tech journalism tipped from gadget reviews and chief-executive profiles to treating Silicon Valley as a node of power in society to be held accountable.

It was the year that tech company employees began holding their employers to account. At Google, walkouts were staged over gender policy, and petitions demanded an end to Chinese expansion plans and to the development of “warfare technology.” Protests were held over Microsoft contracts with the U.S. Immigration and Customs Enforcement agency. There was backlash against the use of facial recognition to assist law enforcement at Amazon. And at Facebook, employees started speaking far more openly to the media as the company careened from scandal to scandal.

It was the year that tech executives awoke to their new operating environment. The U.S. Congress and parliaments around the world ordered CEOs, accustomed to being adored as the leaders of venerated companies, to testify and answer tough questions. It was also the year that these same CEOs, whether motivated by sincere interest in fixing structural problems with their products or concern for protecting public image and shareholder value, began making meaningful reforms to their companies, to varying degrees of success.

Finally, and perhaps most consequentially, 2018 was the year that tech companies lost the benefit of the doubt from governments. This was a result of a growing body of investigations, academic research and enterprise reporting detailing the ways in which social platforms have been used to undermine democracy. It also stems from a concern that the economic benefits of the digital economy are flowing predominantly to a small handful of U.S.-based global companies. But the final straw for many legislators was a November article in The New York Times revealing a disconnect between Facebook’s public statements about abuses on their platform and the aggressive tactics being used by executives to fight the story. To many in government, this confirmed that the tech-sector giants should be treated like any other large multinational corporation, and that it’s time to get serious about governing Big Tech.

Luckily, there are some relatively easy places for governments to start. They can bring sunlight to the world of micro-targeted advertising through new transparency laws. They can overhaul data-privacy regimes that are limited in scope, weak in capacity and unco-ordinated globally. They can mandate the identification of automated accounts so that citizens know when they are engaging with a machine or a human. They can modernize tax and competition policy for the digital economy. And they can fund large-scale digital literacy initiatives for citizens of all ages.

But beyond these short-term Band-Aids, 2019 must also be the year we start grappling with a set of thornier questions at the intersection of technology and democracy.

Democratic governments will need to wrestle with how their speech laws apply to the digital world. This is going to require bringing together the private sector and civil society in a hard discussion about the nature and limits of free speech, about who is censored online and how, about responsibilities for moderating speech at scale, and about universal versus national speech norms.

And while the idea that platform companies are simply intermediaries – and therefore not liable for how their services are used – has been foundational to the innovation, growth and empowerment created by the open internet, the sheer breadth of the economic and social services now provided by platforms might demand a more nuanced approach to how they are governed. If this comes at the cost of that innovation, democracies must be allowed to decide about the trade-off.

Such democracies will need to start co-ordinating their public-policy efforts around emerging technologies, too. There is currently a disconnect between the global scale, operation and social impact of technology companies and the national jurisdiction of most countries’ tech laws and regulations. As former BlackBerry co-CEO Jim Balsillie has argued, the digital economy may need its Bretton Woods moment.

How we handle these challenges will set the tone for how we’ll grapple with the even knottier ones that are to come. As de facto public places increasingly involve private interests – such as Alphabet’s planned smart city in Toronto, or Amazon’s competition over which city would earn the right to be home to its HQ2 headquarters – governments will need to lead a conversation about what this collision looks like. What, for instance, would it mean to treat the data created by the citizens of cities as a public good?

And while governments devote substantial resources to growing the business of artificial intelligence, which promises to reshape broad aspects of our lives, we must work ahead to ensure these nodes of decision-making power are brought into the norms of accountability and transparency that we demand in democracies.

This year was defined by outrage against tech – but 2019 will be the year that the long and messy process of governing it begins.


Globe and Mail Oped: We can save democracy from destructive digital threats

I had the privilege of speaking to the Federal Cabinet retreat this week. Details from the event can be found here. I was there to address the challenges of misinformation and disinformation in relation to the upcoming election. This oped, published in advance of the retreat, provides some context to this issue, and is based on Ed Greenspon’s and my recent Democracy Divided report.

 

A decade ago, governments and regulators allowed Wall Street to run amok in the name of innovation and freedom until millions of jobs were lost, families were forced from their homes and trust in the financial system was decimated.

Today, the same kinds of systemic risks – so-called because the damage ripples way beyond its point of origin – are convulsing the information markets that feed our democracy.

The growth of the internet has resulted in tremendous opportunities for previously marginalized groups to gain voice, but an absence of a public-interest governance regime or even a civic-minded business ethos has resulted in a flood of disinformation and hate propagated by geopolitical, ideological, partisan and commercial operatives.

The result is that the giant digital platforms that now constitute a new public sphere are far too often being used to weaponize information, with a goal of deepening social divisions, fostering unrest and ultimately undermining democratic institutions and social cohesion. As we’ve seen in other countries, the integrity of elections themselves is at risk.

What can be done?

Some people say we need to invest in digital literacy. This is true, as is the broader need to increase civic knowledge and sharpen critical thinking skills. Yet this isn’t sufficient in itself. When Lake Erie was badly polluted a generation ago, signs were erected along the beaches warning swimmers to stay out of the water. But governments also passed laws and enforced regulations to get at the source of the pollution.

Others say these issues are not present in Canada. That would be a welcome kind of exceptionalism if remotely true. But misogynists, racists and other hate groups foment resentment online against female politicians and just about anyone else. Both the Quebec City mosque shooter and the suspect in the Toronto van attack were at least partially radicalized via the internet. That said, research into digital threats to our democracy is so thin in this country that we know almost nothing about who is purchasing our attention or exploiting our media ecosystem. There’s certainly no basis for complacency about protecting Canada’s 2019 federal election against attacks that would never be tolerated if they manifested themselves physically rather than digitally.

Here are some measures that merit serious consideration. The Elections Act needs to be reformed to bring complete transparency to digital advertising. Publishers and broadcasters are legally obligated to inform their audiences about who purchases political ads in election campaigns. Canadians have the same right to know who is paying for digital ads and at whom they are being targeted.

Secondly, we need to do more to make sure that individuals exercise greater sovereignty over the data collected on them and then resold to advertisers or to the Cambridge Analyticas of the world. This means data profiles must be exportable by users, algorithms and AI must be explained, and consent must be freely, clearly and repeatedly given – not coerced through denial of services.

Thirdly, platforms such as YouTube, Facebook and Twitter need to be made subject to the same legal obligations as newspapers and broadcasters for defamation, hate and the like. Some people say this would amount to governments getting into the censorship business. That’s simply wrong; newspaper publishers and editors abide by these laws – or face the consequences – without consulting government minders. These digital platforms use algorithms to perform the same functions as editors: deciding which readers will see what content and with what prominence.

A fake news law would be a trickier proposition, but it is not impossible to think anew about a statute that existed in Canada’s Criminal Code from 1892 to 1992, until it was deemed unconstitutional in a split decision. It said that anyone who “wilfully publishes a statement, tale or news that he knows is false and that causes or is likely to cause injury or mischief to a public interest is guilty of an indictable offence.” The key words here are “wilfully” and causing “injury” to the public interest. We’re not sure such a measure is warranted, but as with the 1960s commission that recommended hate laws in Canada, we think it’s worth public discussion.

In the new digital public sphere, hate runs rampant, falsehood often outperforms truth, emotion trumps reason, extremism muscles out moderation. These aren’t accidents. They are products of particular structures and incentives. Let’s get with the program before democracy has its own Great Recession.


Democracy Divided: Countering Disinformation and Hate in the Digital Public Sphere

Ed Greenspon and I have just published a report, a collaboration between the UBC School of Public Policy and Global Affairs and the Public Policy Forum, called Democracy Divided: Countering Disinformation and Hate in the Digital Public Sphere. The report outlines what we see as a structural problem in our information ecosystem that has led to the current problem of mis- and disinformation, and details a range of policy ideas being discussed and tested around the world.

The report can be downloaded here.

And the Introduction is below.

Introduction:
For more than a quarter-century, the internet developed as an open web—a system to retrieve and exchange information and ideas, a way of connecting individuals and building communities and a digital step forward for democratization. It largely remains all these things. Indeed, the internet is supplanting the old concept of a public square, in which public debate occurs and political views are informed and formed, with a more dynamic and, in many ways, inclusive public sphere. But along the way, particularly in the last half-dozen years, the “open internet” has been consolidated by a handful of global companies and its integrity and trustworthiness attacked by malevolent actors with agendas antithetical to open societies and democratic institutions. These two phenomena are closely interrelated in that the structures, ethos and the economic incentives of the consolidators—Google (and YouTube), Facebook and Twitter in particular—produce an incentive system that aligns well with the disseminators of false and inflammatory information.

The digital revolution is famous for having disrupted broad segments of our economy and society. Now this disruption has come to our democracy. The Brexit referendum and the 2016 American election awakened the world to a dark side of digital communications technologies. Citizens and their governments are learning that a range of actors—foreign and domestic, political and economic, behaving in licit and illicit ways—can use disinformation, hate, bullying and extremist recruitment to erode democratic discourse and social cohesion both within and outside of election periods. And the problem is getting worse.

By and large, the internet has developed within a libertarian frame as compared, for instance, to broadcasting and cable. There has been until recently an almost autokinetic response that public authorities had little or no role to play. To some extent, the logic flows from a view that the internet is not dependent on government for access to spectrum, so therefore no justification exists for a government role. So long as it evolved in ways consistent with the public interest and democratic development, this logic—although flawed—was rarely challenged. And so governments around the world—and tech companies, too—were caught flat-footed when they discovered the internet had gone in directions unanticipated and largely unnoticed.

Today, the question is how to recapture and build on the values of the open internet so that it continues to promote the public good without also facilitating the run-off of social effluents and contaminants that pollute public discourse and the very security of open societies. “Keeping the web open isn’t enough,” said World Wide Web founder Tim Berners-Lee in 2017. “We need to make sure that it’s used in a way that’s constructive and promotes truth and supports democracy.”

It is not surprising that more than 50 years after its creation and a quarter century following the development of the World Wide Web, a sweeping review is required. With this paper, we seek to explore the fundamental challenges that have arisen. We will offer a range of policy options for consideration because there is no single fix. We do so understanding that the combination of the urgency and novelty of these threats creates a tension of needing to execute corporate and public policy in quick order yet with high precision given the possibility of unintended consequences to innovation and free expression. Nobody wants to suppress individual rights on the way to rebuilding trust or discourage the pioneering spirits that have made the internet so central to our lives. Yet doing nothing is not an option either; the current track is unacceptable for both civic life and fair and open marketplaces.

In some cases, this report will suggest actions; in others, the need for more study and more public engagement. In all instances, we believe that certain behaviours need to be remedied; that digital attacks on democracy can no more be tolerated than physical ones; that one raises the likelihood of the other in any case; and that a lowering of standards simply serves to grant permission to those intent on doing harm.

On April 5-6, 2018, PPF and the University of British Columbia’s School of Public Policy and Global Affairs convened a mix of subject matter experts, public officials and other interested parties from academia, philanthropy and civil society. This workshop flowed out of PPF’s 2017 report, The Shattered Mirror: News, Democracy and Truth in the Digital Age, which provided a diagnostic of the deteriorating economics of journalistic organizations, an analysis of negative impacts on Canadian democracy and recommendations for improving the situation. Named in recognition of a 1970 Senate of Canada study of mass media called The Uncertain Mirror, the PPF report noted that in the intervening decades this mirror has cracked and shattered under the pressure of content fragmentation, revenue consolidation and indifference to truth. Now we are speaking of the need for the internet to become a more faithful mirror of the positive attributes of greater human connectivity. This latest piece of work is part of continuing efforts by PPF to work with a wide range of partners in addressing two distinct but intertwined strands (think of a double-helix in biology): how to sustain journalism and how to clean up a now-polluted—arguably structurally so—internet. The April workshop succeeded in sharing and transferring knowledge about recent developments and what might be done about them among experts and policy-makers. It was capped by a public event featuring some of the leading thinkers in the world on the state of the digital public sphere. This report advances the process by canvassing a range of possible policy responses to a rapidly evolving environment replete with major societal consequences still in the process of formation.

PPF hosted a follow-up workshop on May 14-15, 2018, which brought international and Canadian experts together to discuss policy and industry responses to disinformation and threatening speech online, a report from which will be published in the fall.

The report is divided into three parts:

  • Discussion on the forces at play;
  • Assumptions and principles underlying any actions; and
  • A catalogue of potential policy options.

We submit Democracy Divided: Countering Disinformation and Hate in the Digital Public Sphere in the hopes of promoting discussion and debate and helping policy-makers steer the public sphere back toward the public good.

 


Globe and Mail oped: The era of Big Tech self-governance has come to an end

A piece in The Globe and Mail on the Zuckerberg hearings:

 

The era of Big Tech self-governance has come to an end

Twenty years ago, another young Silicon Valley tycoon was grilled in front of the U.S. Congress. Then, as this week, Congressional leaders grandstanded, asked long-winded questions, and showed at times shocking ignorance about how technology worked. And then, as this week, a tech CEO was contrite and well-rehearsed, and obfuscated on key aspects of his business practices.

But the hearings had consequences. They led to an anti-trust lawsuit brought against Microsoft by the U.S. Department of Justice and the Attorneys General of 20 U.S. states. Instead of trusting Bill Gates and Microsoft to behave better or act differently, the government punished them for perceived wrongdoings.

This is how democratic governance is supposed to work. We don’t have to simply trust citizens and corporations to act in the benefit of society; we impose rules, regulations and appropriate punishments to incentivize them to do so.

In the years since Mr. Gates’s testimony, a new generation of digital technology monopolies has emerged, reshaping online life and concentrating activity on a series of giant, global platforms. And they have done so in a policy context virtually void of regulation.

But in 2018, it’s hard to ignore the many troubling cases of abuse regularly perpetrated on and by platforms, from the manner in which the Russian government used the tools provided by companies such as Facebook and Google to interfere in the 2016 U.S. election, to the way in which hate groups in countries such as Myanmar have organized mass violence against minority populations.

Both the government and Mark Zuckerberg know that citizens are finally paying attention to the political impact of Facebook and its effect on our elections, that citizens are understandably concerned about the way Facebook has repeatedly and consistently flouted and neglected user privacy, and that they are concerned about the hateful and divisive character of the civic discourse that is a result of Facebook’s business model.

And so this week the era of Silicon Valley self-regulation came to an end. It’s now time for a difficult debate about how the new internet – an internet of multinational corporations, and of platforms – will be governed.

While Congressmen and Mr. Zuckerberg appeared to agree that they could work together to develop the “right” regulations, this week’s hearing revealed clear tensions on several key policy issues.

First, while Mr. Zuckerberg says that Facebook now supports digital advertising transparency laws that they had previously lobbied against, it is unclear whether the proposed Honest Ads Act will go far enough or whether it will even pass.

Second, on privacy: The world is watching the response to Europe’s General Data Protection Regulation (GDPR), and while Mr. Zuckerberg argued that the privacy tools that Facebook will roll out in response to GDPR will be available in other markets, the U.S. (and Canada) still seem unwilling to enshrine the punitive mechanisms that will be needed to ensure these new data rights. While he claims that he supports the principles of the GDPR, the details will be litigated in European courts for years to come.

Third, when pressed on whether they have any competitors, Mr. Zuckerberg strained to name any. Having aggressively acquired many potential competitors, Facebook – as well as Google and Amazon – will all surely fight aggressively against a new generation of competition policy.

Fourth, Mr. Zuckerberg surprised many by agreeing that Facebook is responsible for the content on their platforms. While this seems anodyne, the debate over whether Facebook is a neutral platform or a media company is rife with legal and regulatory implications.

Finally, Mr. Zuckerberg suggested that lawmakers should focus attention on governing artificial intelligence. They repeatedly changed the subject. Since Facebook operates at a mind-boggling global scale, they use AI to implement and even determine their policies, regulations and norms. How states will in turn govern these algorithms is certain to be a central challenge for democracy. Mr. Zuckerberg knows it; Congress was uninterested.

Over the past 20 years, the internet has shown flashes of its empowering potential. But the recent Facebook revelations also demonstrate what can happen if we fail to hold it accountable.

Mr. Zuckerberg’s testimony is only the beginning of a long-overdue conversation about whether we will govern platforms or be governed by them.


Globe and Mail oped: The new rules for the internet

There has been lots of discussion lately about regulating social media but much less on what this might look like. Ben Scott (former tech policy adviser to Obama & Clinton) and I suggest some options in The Globe and Mail. In short, it will take a broad new approach to how we think about governing the internet. The piece is here, and below.

 

The new rules for the internet – and why deleting Facebook isn’t enough

While being pessimistic about the depressing tableau of Silicon Valley malfeasance is easy, let us not forget that the internet has brought tremendous value to our society. Therefore, the answer is not to lock down the open internet or even to delete Facebook (however satisfying that might feel, with 2.2-billion users it is embedded in our society). Instead, we urgently need new democratic rules for the internet that enhance the rights of citizens, protect the integrity of our public sphere and tackle the structural problems of our current digital economy.

Here are seven ideas:

Data rights. Much of the internet economy is built on trading personal data for free services with limited consumer protection. This model has metastasized into a vast complex of data brokers and A.I.-driven micro-targeting with monopolists such as Google and Facebook at the centre. With the curtain pulled back, there may at last be political will to build a rights-based framework for privacy that adapts as technologies change. For starters, we need major new restrictions on the political exploitation of personal data (including by political parties themselves, who remain exempt from our privacy law) and much greater user control over how data is collected and used. Europe’s new General Data Protection Regulation sets a high standard, though since it took 10 years to legislate, it was out of date before it was implemented. We must evolve it to the next level.

Modernize and enforce election law. Few dispute that citizens deserve to know who is trying to sway them during elections, but our laws were designed for TV and radio. We need to update them for the internet era, where ads can be purchased from anywhere, disguised as normal social media posts, micro-targeted to polarize voters, and loaded up with sensational and divisive messages. All online ads should carry a clearly visible cache of information that states who bought them, the source of the funds, how much they spent, who saw them, and the specific targeting parameters they selected.

Audit artificial intelligence. Facebook and Google monetize billions of data points a day using powerful A.I. to target and influence specific audiences. The social and ethical implications of A.I. are a blinking red light as this technology advances, and we need to lay some ground-rules for accountability. Just as we require drug manufacturers and car makers to submit to rigorous public safety checks, we need to develop a parallel system for algorithms.

Tax Silicon Valley fairly. The titans of technology dominate the list of the most valuable companies on the planet. And yet, they are still coddled by tax law as if they were an emerging industry. It is time for Silicon Valley to pay unto Caesar — not least so that we plebeians can use the tax revenue to fix the things they keep breaking, like journalism, for example.

Aggressive competition policy. Before we start a decade-long trust-busting crusade, let’s begin with a competition policy agenda that delivers immediate, tangible value. This might include restrictions on acquisition of up-and-coming competitors, structural separation of behavior-tracking and ad-targeting businesses, and consumer data portability from one service provider to another.

Improve digital security. What the Russians did in 2016 to exploit digital media should be a wake-up call. Without unleashing a surveillance dragnet, we need effective capabilities to counter foreign disinformation operations using measures such as “know your customer” rules for ad buyers and closing down the armies of fake accounts.

Transform civic literacy, and scale civic journalism. As social-media users, we all own part of this problem. It is our appetite for sensationalism, outrage and conspiracy that creates the audience for disinformation. Instead of relying on tech-funded literacy campaigns, the government needs to rebuild our civic literacy from the ground up, and couple these efforts with serious investments and policy changes to reinvigorate public service and accountability journalism.

Ironically, Facebook’s own conduct has awoken its vast user base to the need for a new generation of internet regulation. And with the United States mired in the politics of Donald Trump and the European Union slowed by a complex bureaucracy, there is an opportunity for Canada to present this new vision. But we will only be effective if the rigor and scale of our response are commensurate with the threat posed to our democracy.

Disruptive Power

The Crisis of the State in the Digital Age


 

Anonymous. WikiLeaks. The Syrian Electronic Army. Edward Snowden. Bitcoin. The Arab Spring.

Digital communication technologies have thrust the calculus of global political power into a period of unprecedented complexity. In every aspect of international affairs, digitally enabled actors are changing the way the world works and disrupting the institutions that once held a monopoly on power. No area is immune: humanitarianism, war, diplomacy, finance, activism, or journalism. In each, the government departments, international organizations and corporations who for a century were in charge, are being challenged by a new breed of international actor. Online, networked and decentralized, these new actors are innovating, for both good and ill, in the austere world of foreign policy. They are representative of a wide range of 21st century global actors and a new form of 21st century power: disruptive power.

In Disruptive Power, Taylor Owen provides a sweeping look at the way that digital technologies are shaking up the workings of the institutions that have traditionally controlled international affairs. The nation state system and the subsequent multinational system were founded on and have long functioned through a concentration of power in the state. Owen looks at the tools that a wide range of new actors are using to increasingly control international affairs, and how their rise changes the way we understand and act in the world. He considers the bar for success in international digital action and the negative consequences of a radically decentralized international system. What new institutions will be needed to moderate the new power structures and ensure accountability? And how can governments and corporations act to promote positive behavior in a world of disruptive innovation? Owen takes on these questions and more in this probing and sober look at the frontier of international affairs, in a world enabled by information technology and increasingly led by disruptive innovators.

With cutting edge analysis of the fast-changing relationship between the declining state and increasingly powerful non-state actors, Disruptive Power is the essential road map for navigating a networked world.

 

Endorsements

“The 21st century state is using new technologies both to serve and protect citizens and also to control them. Citizens are using the same technologies to fight back. Taylor Owen’s analysis is the one you want to read on this battle and the way it will shape the 21st century.”

–Michael Ignatieff, Edward R. Murrow Professor of Practice, Harvard Kennedy School

“Cyber technology has led to disruptive power in the form of hackers like Anonymous and crypto-currencies like Bitcoin. How should states respond? Taylor Owen offers a provocative analysis and recommendations.”

–Joseph S. Nye, Jr., Harvard University, author of The Future of Power

“In Disruptive Power, Owen gives us a tour of the digital challenges to the nation-state, from newly flexible protest groups like Occupy and Anonymous to the rise of algorithms as weapons, often in the hands of non-state actors and often targeting civilian life. He weaves these observations into a forcefully argued thesis: the model of a world governed by stable nation-states is in crisis, forcing most state-led institutions into a choice between adaptation and collapse.”

–Clay Shirky, author of Here Comes Everybody: The Power of Organizing Without Organizations

“Taylor Owen gives us an incisive set of reflections on the ways in which the decentralized, collaborative, and resilient power of digital networks is undermining the state’s ability to govern. Even more disturbing is the resulting existential dilemma for democratic states: the best way to fight back is to become a surveillance state. Disruptive Power does not provide answers, but it poses important and unsettling questions.”

–Anne-Marie Slaughter, Professor Emerita of Politics and International Affairs, Princeton University, and Director of Policy Planning, U.S. State Department, 2009-2011

 

Media and Book Talks

 

Articles:

The Violence of Algorithms, Foreign Affairs

Why the U.S. should but won’t partner with hacktivists Anonymous, San Francisco Chronicle

Why governments must embrace the new global digital reality, The Globe and Mail

The promise and peril of digital diplomacy, The Globe and Mail

 

Reviews:

More Data, More Problems: Surveillance and the Information Economy,  Review in Foreign Affairs

Rescuing Democracy in the Age of the Internet, Review in Ethics and International Affairs

 

Videos:

CIGI Signature Lecture, Disruptive Power: The Crisis of the State in the Digital Age

World Affairs Council, San Francisco: From Bitcoin to WikiLeaks: Shaping the World in the Digital Age

Deutsche Welle Global Media Forum, Plenary Session: Foreign policy in 140 Characters: How technology is redefining diplomacy

International Conference of Crisis Mappers: Historical Mapping and the US Bombardment of Cambodia

Highlights from a talk at USC Annenberg: Disruptive Power 

 

Chapter Summaries

 

Losing Control

Losing Control outlines how, in a wide range of international areas of influence, the state is being challenged by new, digitally enabled actors. Grounded in the theory of disruption, this chapter explores the rise and power of the activist collective Anonymous, the paradox of dual-use surveillance technologies, and the recent revelations about the extent of NSA surveillance. The chapter serves as an introduction to the book.

Disruptive Power

Disruptive Power traces the development of the modern state and, drawing on disruption theory, explores how the introduction of digital technology presents a crisis to state power. The state began as a mechanism for centralizing and exercising power and over time became hierarchical, bureaucratic, and, in democratic states, accountable to the rule of law. In a networked world, however, groups like Anonymous wield power by being decentralized, collaborative, and resilient. These two models of power are fundamentally at odds, and the resulting disruptive power threatens the institutions that have preserved the balance of power since the end of World War II.

Spaces of Dissent


Spaces of Dissent explores the rapidly evolving space of digital activism, or hacktivism, through the example of a group of hackers called Telecomix, who served as a form of tech support for the Arab Spring. Such cyber activists have taken on a role of social and cultural provocateurs; they are dissenting actors in a culture that is increasingly hostile to protest. What’s more, they see, observe, and quickly react in ways that boggle the state and corporations – all of this instrumentalized by digital technology. This argument is grounded in an exploration of hacktivism as a form of civil disobedience, though one that looks markedly different, and is potentially more powerful, than the placards and megaphones of old. The chapter details how the state has responded to the perceived threat of online civil disobedience through its prosecutions against Chelsea Manning and Anonymous, and argues that their excessiveness stems from a paranoia over losing control. Finally, it explores the costs to society when we eliminate social deviancy.

New Money


New Money details how the rise of crypto-currencies such as Bitcoin represents a threat to the power the state derives from the control of currency. This chapter first outlines the history of the close connection between the control of currency and state power. It then details the rise of crypto-currencies, explains how they work, and describes their potential real-world benefits. Finally, it explores the potential challenge to state power posed by this decentralized and technologically enabled currency. I argue that if the use of Bitcoin were to proliferate, as it likely will, then the state’s inability to either collect revenue from, or regulate, commercial activity poses a threat to the control it currently holds over the international financial system.

Being There


Being There considers the evolution of international news reporting by juxtaposing the death of seasoned war correspondent Marie Colvin during the bombing of Homs, Syria, with the new digital tools Syrian citizens used to document and stream the war to the world in real time. In an age of live-streaming, citizen journalism, drone journalism and coming advances in virtual reality, do we even need foreign correspondents? What’s more, do these technological advances result in new forms of knowing and understanding international events, do they shift how we understand the traditional power of the media and their capability to control information, and are they ultimately affecting how we see, and act in, the world?

Saving the Saviors


Saving the Saviors looks at the impact of collaborative mapping and advances in satellite technology on humanitarian and development agencies. The world of aid, humanitarianism and development has long been dominated by state-based agencies and large international organizations. For nearly a century, organizations like the World Food Program, the Red Cross, USAID and Oxfam have attempted to lead a transfer of expertise and resources from the developed world to the developing world. But new models are emerging. In the first week following the 2010 Haiti earthquake, 14,000 citizens used their cell phones to upload emergency information to a live online crisis map. How do we know if the information uploaded to a crisis map is real? How do we hold these projects to account, without the oversight that states and institutions once provided? Using examples of disruptive humanitarian actors and recent academic work assessing their impact, this chapter explores how aid and humanitarianism are being transformed from the ground up.

Diplomacy Unbound


Diplomacy Unbound explores the emerging practice of digital diplomacy. First, it outlines how we valued the efficacy and power of diplomacy before Twitter and Facebook and mesh networks by tracing the notion of diplomatic power. It then argues that we need to view digital diplomacy initiatives in two categories, those that simply expand the practice of public diplomacy into a new medium, and those that seek to fundamentally engage in the digital space, using the tools and capabilities outlined throughout this book. I argue that when the bounds of diplomacy are extended into influencing not just states, but also digital actors, then they overlap fundamentally with other foreign policy programs and objectives. And this invariably leads to conflicting methods and outcomes.  The undue negative costs associated with coercive digital diplomacy demonstrate the weakness of the state in a major realm of its foreign policy. And if the state can’t be effectively diplomatic in the digital space, then what does this tell us about the contemporary relevance of diplomacy itself? 

The Violence of Algorithms


The Violence of Algorithms looks at how advances in computational power and automation have produced military weapons and surveillance tools that blur the boundaries of the battlefield and the lines between domestic and international. While much of this book focuses on diminishing state power in the face of empowered actors, here I look at how the state is fighting back. What does it mean when the state extends the use of military technologies and tactics far beyond the battlefield? How should we view advances in automated warfare, and the power that these new technologies embed in complex and secretive algorithms? And for how long can we expect the state to have a monopoly on these new forms of pervasive violence? Put another way, where is the line between war and peacetime behaviour with the deployment of computation and surveillance based weaponry?

The Crisis of the State

The Crisis of the State outlines four challenges that together threaten the state’s traditional mechanisms of power and control, but that also might provide models for 20th century international institutions seeking to adapt – if they are structurally capable of transformation or meaningful reform. This crisis of the state has at least four key components: democratic legitimacy, reversing the surveillance state, algorithmic accountability, and internet governance. Solving any one of them will not prove a panacea to this crisis, nor is this list exhaustive; there are many more innovations being developed and important questions being addressed. But luckily, in each there are individuals and groups experimenting with new models and proposing potential solutions. This is the new landscape in which the state must constructively engage.



About

By way of an intro, currently:

My PhD was on the concept of human security, exploring how mapping and spatially analyzing local vulnerability data can help us better understand the nature of extreme insecurity. My current research focuses on the intersection of digital media, technology and public policy.

I use this site as a contact point and as an aggregator of my academic work and broader writing.

A bit more officially:

Taylor Owen is the Beaverbrook Chair in Media, Ethics and Communications and Associate Professor in the Max Bell School of Public Policy at McGill University. He was previously Assistant Professor of Digital Media and Global Affairs at the University of British Columbia, and the Research Director of the Tow Center for Digital Journalism at Columbia University where he led a program studying the impact of digital technology on the practice of journalism, and has held research positions at Yale University, The London School of Economics and The International Peace Research Institute. His Doctorate is from the University of Oxford and he has been a Trudeau and Banting scholar, an Action Canada and Public Policy Forum Fellow, the 2016 Public Policy Forum Emerging Leader, and sits on the Board of Directors of the Center for International Governance Innovation (CIGI) and on the Governing Council of the Social Sciences and Humanities Research Council (SSHRC). He is the founder of the international affairs media platform OpenCanada.org, and he is the author, most recently, of Disruptive Power: The Crisis of the State in the Digital Age (Oxford University Press, 2015) and the co-editor of The World Won’t Wait: Why Canada Needs to Rethink its Foreign Policies (University of Toronto Press, 2015, with Roland Paris), Journalism After Snowden: The Future of the Free Press in the Surveillance State (Columbia University Press, 2017, with Emily Bell) and The Platform Press: How Silicon Valley Re-engineered Journalism (Tow Center 2017, with Emily Bell). His forthcoming book on Silicon Valley, journalism and democracy will be published by Yale University Press in early 2019. His work can be found at www.taylorowen.com and @taylor_owen.

Contact

Email: taylor (dot) owen (at) gmail (dot) com

Twitter: @taylor_owen

Warning: I have been largely defeated by email flow, so please feel free to send reminders and nudges when needed.

Publications

 

Selected writing and media (more formal list below)

On technology and global affairs:

On media and democracy:

On Canadian politics and foreign policy:

On the bombing of Cambodia:

On Human Security:

On the future of think tanks:

 

Full(ish) List

Books and Manuscripts

  • Disruptive Power: The Crisis of the State in the Digital Age. March 2015, Oxford University Press, New York (About, Amazon)
  • The World Won’t Wait: Why Canada Needs to Rethink its Foreign Policies, December 2015 (ed., with Roland Paris), University of Toronto Press, Toronto (Amazon)
  • Journalism After Snowden: The Future of the Free Press in the Surveillance State, Columbia University Press (ed., with Emily Bell and Jennifer Henrichsen), February 2017. (CUP)
  • The New Global Journalism: Foreign Correspondence in Transition. Tow Center for Digital Journalism, Columbia University, 2014 (ed., with Ann Cooper) pdf
  • Human Security. Sage Major Work, Four-Volume Set. London, UK. 2013. Link
  • The Handbook of Human Security, Routledge Press, 2013 (ed., with Mary Martin) Link
  • Operationalizing Human Security: From Local Vulnerability to International Policy, DPhil Thesis, The University of Oxford, July 2010.

Peer Reviewed Academic

  • Belair-Gagnon, Valerie, Taylor Owen and Avery E. Holton. “Unmanned Aerial Vehicles and Journalistic Disruption: Perspectives of Early Professional Adopters.” Digital Journalism, vol. 5, no. 10, 2017, pp. 1-14, https://doi.org/10.1080/21670811.2017.1279019.
  • Owen, Taylor. “The Networked State and the End of 20th Century Diplomacy.” Global Affairs, vol. 2, no. 3, 2016, pp. 301-307, https://doi.org/10.1080/23340460.2016.1239375.
  • Burgess, J. Peter, Taylor Owen and Uttam Kumar Sinha. “Human Securitization of Water? A Case Study of the Indus Water Basin.” Cambridge Review of International Affairs, vol. 29, no. 2, 2013, pp. 382-407, https://doi.org/10.1080/09557571.2013.799739.
  • Martin, Mary, and Taylor Owen. “The Second Generation of Human Security: Lessons from the UN and EU Experience.” International Affairs, vol. 86, no. 1, 2010, pp. 211-224, https://doi.org/10.1111/j.1468-2346.2010.00876.x. 
  • Travers, Patrick, and Taylor Owen. “Between Metaphor and Strategy: Canada’s Integrated Approach to Peacebuilding in Afghanistan.” International Journal, vol. 63, no. 3, 2008, pp. 685-702, https://doi.org/10.1177/002070200806300316. 
  • Owen, Taylor. “The Critique that Doesn’t Bite: A Response to David Chandler’s ‘Human Security: The Dog That Didn’t Bark’.” Security Dialogue, vol. 39, no. 4, 2008, pp. 445-453, https://doi.org/10.1177/0967010608094038.
  • Benini, Aldo, Taylor Owen and Håvard Rue. “A Semi-Parametric Spatial Regression Approach to Post-War Human Security: Cambodia 2002-2004.” Asian Journal of Criminology, vol. 3, no. 2, 2008, pp. 139-158, https://doi.org/10.1007/s11417-008-9056-1.
  • Liotta, P.H., and Taylor Owen. “Why Human Security?” Whitehead Journal of Diplomacy and International Relations, vol. 7, no. 1, 2006, pp. 37-54, http://taylorowen.com/Articles/Owen%20and%20Liotta%20-%20Why%20Human%20Security.pdf.
  • Liotta, P.H., and Taylor Owen. “Sense and Symbolism: Europe Takes On Human Security.” Parameters, vol. 36, no. 3, 2006, pp. 85-102, http://ssi.armywarcollege.edu/pubs/parameters/articles/06autumn/liotta.pdf. 
  • Gleditsch, Nils Petter, et al. “Conflicts over Shared Rivers: Resource Wars or Fuzzy Boundaries?” Political Geography, vol. 25, no. 4, 2006, pp. 361-382, https://doi.org/10.1016/j.polgeo.2006.02.004.
  • Owen, Taylor. “A Response to Edward Newman: Conspicuously Absent? Why the Secretary-General Used Human Security in All but Name.” St Antony’s International Review, vol. 1, no. 2, 2005, pp. 37–42, http://www.jstor.org/stable/26227009.
  • Owen, Taylor, and Olav Slaymaker. “Toward modeling regionally specific human security using GIS: case study Cambodia.” AMBIO: A Journal of the Human Environment, vol. 34, no.6, 2005, pp. 445-449, https://doi.org/10.1579/0044-7447-34.6.445. 
  • Owen, Taylor. “Human Security – Conflict, Critique and Consensus: Colloquium Remarks and a Proposal for a Threshold-Based Definition.” Security Dialogue, vol. 35, no. 3, 2004, pp. 373-387, https://doi.org/10.1177/0967010604047555.
  • Owen, Taylor. “Human Security: A New View of Cambodian Vulnerability.” Cambodia Development Review, vol. 7, no. 2, 2003, pp. 9-16, https://www.cdri.org.kh/publication-page-old/pub/cdr/2003/cdr03-2.pdf.

Book Chapters

  • Kiernan, Ben and Taylor Owen. “Iraq, Another Vietnam? Consider Cambodia.” The United States, Southeast Asia, and Historical Memory, edited by Mark Pavlick. Common Courage Press, Forthcoming, July 2018.
  • Owen, Taylor. “Global Media Power.” The Sage Handbook of Digital Journalism, edited by Tamara Witschge, C.W. Anderson, David Domingo and Alfred Hermida, London, Sage Publications, 2016, pp. 25-35.
  • Bell, Emily, Taylor Owen and Smitha Khorana. “Introduction.” Journalism After Snowden: The Future of the Free Press in the Surveillance State, edited by Emily Bell and Taylor Owen, with Smitha Khorana and Jennifer Henrichsen, Columbia University Press, 2016, pp. 1-18.
  • Paris, Roland, and Taylor Owen. “Introduction: A Transforming World.” The World Won’t Wait: Why Canada Needs to Rethink Its International Policies, edited by Roland Paris and Taylor Owen, University of Toronto Press, 2016, pp. 3–19.
  • Paris, Roland, and Taylor Owen, “Conclusion: Imagining a More Ambitious Canada.” The World Won’t Wait: Why Canada Needs to Rethink Its International Policies, edited by Roland Paris and Taylor Owen, University of Toronto Press, 2016, pp. 175–188.
  • Martin, Mary, and Taylor Owen. “Introduction.” Routledge Handbook of Human Security, edited by Mary Martin and Taylor Owen, London, Routledge, 2014, pp. 1-15.
  • Owen, Taylor. “Human Security Thresholds.” Routledge Handbook of Human Security, edited by Mary Martin and Taylor Owen, London; New York, Routledge, 2014, pp. 58-65.
  • Owen, Taylor. “Human Security Mapping.” Routledge Handbook of Human Security, edited by Mary Martin and Taylor Owen, London; New York, Routledge, 2014, pp. 308-319.
  • Martin, Mary, and Taylor Owen. “Conclusion.” Routledge Handbook of Human Security, edited by Mary Martin and Taylor Owen, London; New York, Routledge, 2014, pp. 331-335.
  • Owen, Taylor. “Editor’s Introduction: Human Security.” Human Security, edited by Taylor Owen, London, Sage Publications, 2013, vol 1, pp. xxiii-xlix.
  • Owen, Taylor, and Emily Paddon. “Whither Humanitarian Space? The Costs of Integrated Peacebuilding in Afghanistan.” Modern Warfare: Armed Groups, Private Militaries, Humanitarian Organizations, and the Law, edited by Benjamin Perrin, Vancouver, UBC Press, 2013, pp. 267-287.
  • Eaves, David, and Taylor Owen. “Missing the Link: How the Internet is Saving Journalism.” The New Journalism: Roles, Skills, and Critical Thinking, edited by Paul Benedetti, Timothy Currie, and Kim Kierans, Toronto, Emond Montgomery Publications, 2010.
  • Owen, Taylor. “In All but Name: The Uncertain Future of Human Security in the UN.” Rethinking Human Security, edited by Moufida Goucha and John Crowley, Oxford, Wiley-Blackwell Press, 2008, pp. 113-127.
  • Owen, Taylor. “Critical Human Security: A Contested Concept.” The Routledge Handbook of New Security Studies, edited by J. Peter Burgess, Oxford, Routledge, 2010, pp. 39-50.
  • Owen, Taylor. “Measuring Human Security: Methodological Challenges and the Importance of Geographically-Referenced Determinants.” Environmental Change and Human Security: Recognizing and Acting on Hazard Impacts, edited by Peter Liotta, Springer NATO Science Series, 2008, pp. 35-64.

Non-Peer Reviewed Journals

  • Kiernan, Ben, and Taylor Owen. “Making More Enemies than We Kill? Calculating U.S. Bomb Tonnages Dropped on Laos and Cambodia, and Weighing Their Implications.” The Asia Pacific Journal, vol. 13, issue 16, no. 3, 2015, pp. 1-9.
  • Owen, Taylor, and Ben Kiernan. “Roots of U.S. Troubles in Afghanistan: Civilian Bombing Casualties and the Cambodian Precedent.” The Asia Pacific Journal, vol. 8, issue 26, no. 4, 2010, https://apjjf.org/-Taylor-Owen/3380/article.html.
  • Owen, Taylor, and Ben Kiernan. “Bombs over Cambodia: New Light on US Air War.” The Asia Pacific Journal, vol. 5, issue 5, 2007, https://apjjf.org/-Ben-Kiernan/2420/article.pdf.
  • Burgess, J. Peter, and Taylor Owen. “Editors’ Note.” Introduction to “Special Section: What is ‘Human Security’?” edited by J. Peter Burgess and Taylor Owen, Security Dialogue, vol. 35, no. 3, 2004, pp. 345-346, http://journals.sagepub.com/doi/pdf/10.1177/0967010604047569.
  • Owen, Taylor. “Challenges and Opportunities for Defining and Measuring Human Security.” Disarmament Forum, no. 3, 2004, pp. 15-24, https://www.peacepalacelibrary.nl/ebooks/files/UNIDIR_pdf-art2138.pdf.
  • Owen, Taylor. “Measuring Human Security: Overcoming the Paradox,” Human Security Bulletin, vol. 2, no. 3, 2003, http://www.taylorowen.com/Articles/2003_Paradox.pdf.
  • Owen, Taylor. “Body Count: Rationale and Methodologies for Measuring Human Security,” Human Security Bulletin, vol. 1, no. 3, 2002, http://www.taylorowen.com/Articles/2002_%20Body%20Count.pdf.

Magazine Articles

  • Owen, Taylor, and Robert Gorwa. “Quantum Leap: China’s Satellite and the New Arms Race.” Foreign Affairs, 7 Sept. 2016, https://www.foreignaffairs.com/articles/2016-09-07/quantum-leap.
  • Owen, Taylor. “Can Journalism Be Virtual?” Columbia Journalism Review, Fall/Winter 2016, https://www.cjr.org/the_feature/virtual_reality_facebook_second_life.php.
  • Owen, Taylor. “Towards a Whole of Government Digital Strategy.” Policy Magazine, July/August 2016, pp. 6-8, http://www.policymagazine.ca/pdf/20/PolicyMagazineJulyAugust-2016-Owen.pdf.
  • Owen, Taylor. “Coin Toss: Will Blockchain undermine or buttress state power?” The Literary Review of Canada, July 2016, http://reviewcanada.ca/magazine/2016/07/coin-toss/.
  • Owen, Taylor. “The Violence of Algorithms: Why Big Data Is Only as Smart as Those Who Generate It.” Foreign Affairs, 25 May 2015, https://www.foreignaffairs.com/articles/2015-05-25/violence-algorithms.
  • Eaves, David, and Taylor Owen. “Liberal Baggage: The national party’s greatest burden may be its past success.” The Literary Review of Canada, May 2012, https://reviewcanada.ca/magazine/2012/05/liberal-baggage/.
  • Owen, Taylor. “A World Turned Upside Down: To face an age of climate change, Twitter and counterinsurgency, Canada’s foreign policy establishment needs fresh ideas.” The Literary Review of Canada, December 2010, http://reviewcanada.ca/magazine/2010/12/a-world-turned-upside-down/.
  • Eaves, David and Taylor Owen. “Progressivism’s End: In Obama, both Americans and Canadians can see the promise of something new.” The Literary Review of Canada, September 2008, http://reviewcanada.ca/magazine/2008/09/progressivisms-end/.
  • Owen, Taylor, and Emily Paddon. “Rattle and Hum: Hello, Baghdad! A Kurdish singer rocks Iraq.” The Walrus Magazine, 21 Jan. 2009, https://thewalrus.ca/2009-01-music-2/.
  • Owen, Taylor, and Patrick Travers. “3D Vision: Can Canada reconcile its defense, diplomacy and development objectives in Afghanistan?” The Walrus Magazine, 12 Jul. 2007, https://thewalrus.ca/2007-07-foreign-affairs/.
  • Owen, Taylor. “One Step Closer to an Obama-Ignatieff Continent.” The Prospect Magazine, 10 Dec. 2008, https://www.prospectmagazine.co.uk/world/one-step-closer-to-an-obama-ignatieff-continent.
  • Owen, Taylor, and Ben Kiernan. “Bombs Over Cambodia: New information reveals that Cambodia was bombed far more heavily than previously believed.” The Walrus Magazine, 12 Oct. 2006, https://thewalrus.ca/2006-10-history/.

Policy Reports

  • Bell, Emily and Taylor Owen, with Peter Brown, Codi Hauka and Nushin Rashidian. The Platform Press: How Silicon Valley Reengineered Journalism. The Tow Center for Digital Journalism, Columbia University, 2017, http://towcenter.org/wp-content/uploads/2017/04/The_Platform_Press_Tow_Report_2017.pdf.
  • The Shattered Mirror: News, Democracy and Trust in the Digital Age. The Public Policy Forum, 2016, https://shatteredmirror.ca/wp-content/uploads/theShatteredMirror.pdf.
  • Aronson-Rath, Raney, Milward, James, Owen, Taylor and Fergus Pitt. Virtual Reality Journalism. The Tow Center for Digital Journalism, Columbia University, 2015, https://towcenter.gitbooks.io/virtual-reality-journalism/content/.
  • Cooper, Ann and Taylor Owen, editors. The New Global Journalism: Foreign Correspondence in Transition, The Tow Center for Digital Journalism, Columbia University, 2014, http://towcenter.org/wp-content/uploads/2014/09/The-New-Global-Journalism-1.pdf.
  • Owen, Taylor. Media, Technology and Intelligence. The Canadian Security Intelligence Service (CSIS), 2013.
  • Owen, Taylor. Disruption: Foreign Policy in a Networked World. Pierre Elliott Trudeau Foundation Position Paper, 2012, http://www.trudeaufoundation.ca/sites/default/files/canada_in_the_world–en.pdf.
  • Owen, Taylor, and Alexandre Grigsby. In Transit: Gangs and Criminal Networks in Guyana. A Working Paper of the Small Arms Survey, Geneva, 2012, http://www.smallarmssurvey.org/fileadmin/docs/F-Working-papers/SAS-WP11-Guyana.pdf.
  • Owen, Taylor, and Rudyard Griffiths. The People’s Debates: A Report on Canada’s Televised Election Debates. Aurea Foundation, 2011.
  • Owen, Taylor, and Emily Paddon. The Challenges of Integrated Peacebuilding in Afghanistan. Report for the Canada Department of Foreign Affairs and International Trade, 2009.
  • Owen, Taylor. The Uncertain Future of Human Security in the UN. UNESCO Working Paper, Oxford, Blackwell Publishing, 2008, https://doi.org/10.1111/j.1468-2451.2008.00629.
  • Travers, Patrick, and Taylor Owen. Peacebuilding While Peacemaking: The Merits of a 3D Approach in Afghanistan. UBC Centre for International Relations Security and Defense Forum Working Paper, no. 3, 2007, https://www.academia.edu/148897/Peacebuilding_While_Peacemaking_The_Merits_of_a_3D_Approach_in_Afghanistan.
  • Jackson, Thomas, Marsh, Nicholas, Owen, Taylor and Anne Thurin. Who Takes the Bullet? The Impact of Small Arms Violence. Norwegian Church Aid, 2005, https://www.kirkensnodhjelp.no/contentassets/c1403acd5da84d39a120090004899173/2005/who-takes-the-bullet.pdf.
  • Owen, Taylor, and Aldo Benini. Human Security in Cambodia: A Statistical Analysis of Large-Sample Sub-National Vulnerability Data. The Centre for the Study of Civil War at the International Peace Research Institute, Oslo, 2004, https://www.gichd.org/fileadmin/GICHD-resources/rec-documents/CambodiaOwenBeniniSummaryWithMap040419.pdf.
  • Owen, Taylor, Kathryn Furlong, and Nils Petter Gleditsch. Codebook for the shared river basin GIS and database. The Centre for the Study of Civil War at the International Peace Research Institute, Oslo, 2004, https://files.prio.org/files/projects/Codebook%20for%20The%20Shared%20River%20Basin%20GIS%20and%20Database.pdf.

Selected Op-eds

  • Owen, Taylor. “Data governance in the digital age: How Facebook disrupted democracy.” The Financial Post, 25 May 2018, http://business.financialpost.com/opinion/data-governance-in-the-digital-age-how-facebook-disrupted-democracy.
  • Owen, Taylor. “The era of big tech self-governance has come to an end.” The Globe and Mail, 11 Apr. 2018, https://www.theglobeandmail.com/opinion/article-the-era-of-big-tech-self-governance-has-come-to-an-end/.
  • Owen, Taylor, and Ben Scott. “The new rules for the internet – And why deleting Facebook isn’t enough.” The Globe and Mail, 2 Apr. 2018, https://www.theglobeandmail.com/opinion/article-the-new-rules-for-the-internet-and-why-you-shouldnt-delete-facebook/.
  • Muggah, Robert, and Taylor Owen. “So, the liberal order is in freefall? Not so fast.” The Globe and Mail, 10 Jan. 2018, https://www.theglobeandmail.com/opinion/so-the-liberal-order-is-in-free-fall-not-so-fast/article37566760/.
  • Owen, Taylor. “Is Facebook a threat to democracy?” The Globe and Mail, 19 Oct. 2017, https://www.theglobeandmail.com/opinion/is-facebook-a-threat-to-democracy/article36661905/.
  • Greenspon, Edward, and Taylor Owen. “‘Fake news 2.0’: A threat to Canada’s democracy.” The Globe and Mail, 28 May 2017, https://www.theglobeandmail.com/opinion/fake-news-20-a-threat-to-canadas-democracy/article35138104/.
  • Ananny, Mike, and Taylor Owen. “Ethics and governance are getting lost in the AI frenzy.” The Globe and Mail, 30 Mar. 2017, https://www.theglobeandmail.com/opinion/ethics-and-governance-are-getting-lost-in-the-ai-frenzy/article34504510/.
  • Owen, Taylor, and Elizabeth Dubois. “It’s time to reform the CBC for the digital age.” The Toronto Star, 1 Feb. 2017, https://www.thestar.com/opinion/commentary/2017/02/01/its-time-to-reform-the-cbc-for-the-digital-age.html.
  • Owen, Taylor. “What can governments learn from digital disruptors.” World Economic Forum, 6 Apr. 2016, https://www.weforum.org/agenda/2016/04/what-can-governments-learn-from-digital-disruptors/.
  • Owen, Taylor. “Why governments must embrace the new global digital reality.” The Globe and Mail, 10 Apr. 2015, https://www.theglobeandmail.com/opinion/columnists/why-governments-must-embrace-the-new-global-digital-reality/article23876924/.
  • Owen, Taylor. “Why the U.S. should but won’t partner with hacktivists Anonymous.” San Francisco Chronicle, 1 May 2015, https://www.sfgate.com/news/article/Why-the-U-S-should-but-won-t-partner-with-6235020.php.
  • Owen, Taylor. “The promise and peril of digital diplomacy.” The Globe and Mail, 9 Jan. 2015, https://www.theglobeandmail.com/opinion/the-promise-and-peril-of-digital-diplomacy/article22375462/.
  • Owen, Taylor. “Bitcoin is dead— Long live bitcoin.” Vice News, 23 Mar. 2014, https://news.vice.com/article/bitcoin-is-dead-long-live-bitcoin.
  • Muggah, Robert, and Taylor Owen. “Decline in Canadian think tanks couldn’t come at a worse time.” The Toronto Star, 9 Oct. 2013, https://www.thestar.com/opinion/commentary/2013/10/09/decline_in_canadian_think_tanks_couldnt_come_at_worse_time.html.
  • Owen, Taylor. “Drones don’t just kill. Their psychological effects are creating enemies.” The Globe and Mail, 13 Mar. 2013, https://www.theglobeandmail.com/opinion/drones-dont-just-kill-their-psychological-effects-are-creating-enemies/article9707992/.
  • Muggah, Robert, and Taylor Owen. “With think tanks on the ropes, Canada is losing its bark and bite.” The Globe and Mail, 10 Oct. 2013, https://www.theglobeandmail.com/opinion/with-think-tanks-on-the-ropes-canada-is-losing-its-bark-and-bite/article14795496/.
  • Griffiths, Rudyard, and Taylor Owen. “Let a commission, not broadcasters, call the shots.” The Globe and Mail, 1 Apr. 2011, https://www.theglobeandmail.com/opinion/let-a-commission-not-broadcasters-call-the-shots/article574867/.
  • Owen, Taylor. “Afghan army: If you build it, who will come?” The Globe and Mail, 6 Sept. 2011, https://www.theglobeandmail.com/opinion/afghan-army-if-you-build-it-who-will-come/article627066/.
  • Owen, Taylor. “Why Wikileaks will lead to more secrecy, not less.” Maclean’s Magazine, 29 Nov. 2010, https://www.macleans.ca/general/why-wikileaks-will-lead-to-more-secrecy-not-less/.
  • Owen, Taylor. “Review: The Canadian Century: Moving out of America’s shadow, by Brian Lee Crowley.” The Globe and Mail, 10 Aug. 2010, https://www.theglobeandmail.com/arts/books-and-media/review-the-canadian-century-moving-out-of-americas-shadow-by-brian-lee-crowley/article4324559/.
  • Owen, Taylor. “Five reasons British coalition is not a harbinger for Canada.” The Globe and Mail, 14 May 2010, https://www.theglobeandmail.com/news/politics/five-reasons-british-coalition-is-not-a-harbinger-for-canada/article4319053/.
  • Griffiths, Rudyard, and Taylor Owen. “Learning from Britain’s three great debates.” The National Post, 1 May 2010, http://nationalpost.com/opinion/rudyard-griffiths-and-taylor-owen-learning-from-britains-three-great-debates.
  • Eaves, David, and Taylor Owen. “How about real Liberal renewal?” The Toronto Star, 20 Nov. 2008, https://www.thestar.com/opinion/2008/11/20/how_about_real_liberal_renewal.html.
  • Travers, Patrick, and Taylor Owen. “2011 is a date, not a goal.” The Toronto Star, 5 Apr. 2008, https://www.thestar.com/opinion/2008/04/05/2011_is_a_date_not_a_goal.html.
  • Eaves, David, and Taylor Owen. “Failed strategy connects Afghan fields, city streets.” The Toronto Star, 7 Dec. 2007, https://www.thestar.com/opinion/2007/12/07/failed_strategy_connects_afghan_fields_city_streets.html.
  • Eaves, David, and Taylor Owen. “Kandahar deal breakers: The Afghan poll is not a blank cheque.” The Globe and Mail, 2 Nov. 2007, https://eaves.ca/2007/11/02/kandahar-deal-breakers-op-ed-in-globe-and-mail/.
  • Eaves, David, and Taylor Owen. “Africa is not a Liberal idea.” Embassy Magazine, 3 Oct. 2007.
  • Eaves, David, and Taylor Owen. “Iraq suddenly appears on Canada’s radar screen.” Toronto Star, 29 Aug. 2007, https://www.thestar.com/opinion/editorialopinion/2007/08/29/iraq_suddenly_appears_on_canadas_radar_screen.html.
  • Eaves, David, and Taylor Owen. “Blogosphere at age 10 is improving journalism.” The Toronto Star, 30 Jul. 2007, https://www.thestar.com/opinion/2007/07/30/blogosphere_at_age_10_is_improving_journalism.html.
  • Eaves, David, and Taylor Owen. “Getting back on track in Afghanistan.” The Toronto Star, 23 Feb. 2007, https://www.thestar.com/opinion/2007/02/23/getting_back_on_track.html.
  • Eaves, David, and Taylor Owen. “Beyond Vimy Ridge: Canada’s other foreign-policy pillar.” The Globe and Mail, 18 Apr. 2007, https://www.theglobeandmail.com/news/national/beyond-vimy-ridge-canadas-other-foreign-policy-pillar/article1073930/.