
Editorial: Balancing curbs on porn with a right to privacy

How are users to verify their age in a manner that is reliable, yet does not contemplate a major invasion of privacy?
The Instagram logo is seen on a cellphone. Michael Dwyer, The Associated Press

British Columbia and Ottawa are both taking steps to prevent children from accessing sexually explicit material online.

Premier David Eby has announced that a bill will be introduced this spring to prevent children from viewing inappropriate content.

The legislation will authorize the province to sue social media companies that fail to stop minors from gaining access to this material. Eby notes that the damaging effects of allowing children to view such content are well established.

In Ottawa, Parliament is passing legislation with a similar intent. Bill S-210, known as the Protecting Young Persons from Exposure to Pornography Act, makes it an offence to permit children access to sexually explicit material.

We can all agree with the intent behind these measures. But how they are to be enforced is far from simple.

Eby has offered little about how his legislation will be shaped, while Bill S-210 merely says that to avoid conviction, social media companies must adopt “a prescribed age-verification” method.

But what is this method? The bill is silent on that matter, and with good reason.

Some U.S. states have passed legislation requiring anyone who wants access to pornography to provide a copy of their passport or driver’s licence.

Of course, that ducks the question: what about adults who have neither?

As well, constitutional objections have been raised, centring on freedom of speech and privacy rights. These have been backed by mainstream organizations like the American Civil Liberties Union and The New York Times.

For the basic issue is this: How are users to verify their age in a manner that is reliable, yet does not contemplate a major invasion of privacy?

Attempts have been made to design viable verification schemes. In France and Britain, efforts are underway to develop a token of sorts that could be issued to users by an independent third party.

Yet such tokens could only be acquired by giving access to personal documents like bank records, health files or credit cards.

One of Eby’s concerns has to do with cyberbullying, or so-called “sextortion.” He mentioned the case of a 12-year-old boy in Prince George, Carson Cleland, who killed himself after being victimized in this manner.

Yet here another obstacle arises. Many of the sites that have been criticized for failing to stop cyberbullying have enormous user bases.

The media giant Snapchat, an app that allows users to share photographs and video, had 347 million daily users in 2022, with billions of images being shared.

No doubt some of these images might be used for cyberbullying. But what practical means exist to prevent that? The sheer volume of material is daunting.

There is also the question of who gets to say what counts as pornographic. What about legitimate artwork? Or novels with explicit writing?

Bill S-210 says an exception will be made for such material. But again, on whose authority?

These two issues — how to verify the age of users, and how to delineate pornography — present daunting problems.

This is one of the dilemmas of our time. We must protect our children, yet how in practice can this be done?

In 1964 the U.S. Supreme Court attempted a definition of pornography, and failed. Justice Potter Stewart was reduced to saying: “I shall not today attempt further to define the kinds of material I understand to be embraced [as pornography], and perhaps I could never succeed in doing so. But I know it when I see it.”

Unable to be specific, our politicians appear to have reached the same point. We are being asked to take it on faith that they will know it when they see it.

With the right to privacy at stake, that is asking a lot.
