



July/Aug 2025 issue

WHAT'S INSIDE: Screen-fast, sports betting, & environmental stewardship

Our 10-day screen-fast challenge that we presented in the last issue is getting traction. Marty VanDriel has a story that shares how the fast went for him and others who gave it a try.

But that was just the start. Some generous supporters have recognized how important this issue is, so they are offering up a little extra motivation for us all. They have pledged to donate $100 to two fantastic kingdom causes – Word & Deed and Reformed Perspective – for every person who commits to and completes a 10-day fast from their screens from July 21 to 30 (to a maximum of $20,000 split between both causes).

Screens aren’t evil, but as the cover illustrates so well, screens can keep us from seeing reality – from seeing God’s loving hand upholding creation, this world, and our lives. Here now is your opportunity to join with some family and friends and maybe your whole church community to put screens aside and see the rest of the world unfiltered. Check out page 19 for more details or click on the QR code above to sign up.

Since sports betting was legalized in 2021, it has taken Canada by storm. If you watch any hockey, you've noticed a lot of betting ads, and they bring with them a growing temptation for Christians to make some money while enjoying their favourite teams. But as Jeff Dykstra explains, we have good reason to steer clear of sports gambling.

In this issue we also do a deep dive into the topic of environmental stewardship by sitting down with two Christian women who work for an environmental group in the middle of a logging community in northern BC.

If you are an adult who tends to skip over the Come & Explore kids’ section, we encourage you to give this one a read. It will be sure to make you smile.

Click the cover to view in your browser
or click here to download the PDF (8 MB)

INDEX: Are you still able: A nation-wide challenge to experience life without screens / Creation stewards in a logging town / Who do you want to be? RP's 10-day screen-fast challenge / We took the no screens challenge... and now we're changing our habits / What can I do anyways? 35 screen-alternative ideas / Is TikTok the ultimate contraception? / How to stay sane in an overstimulated age / Defeated by distraction / How to use AI like a Christian boss / Whose speeches were they? On AI, and others, writing for us / The Way / Who is Mark Carney? / What if we said what we mean? - the political party edition / Am I lazy or just relaxing? What does Proverbs say? / Get out of the game: Christians need to steer clear of sports gambling / Man up: ARPA leaderboards and the call to courageous action / Christians don't pray / Our forever home / Calvin as a comic / The best comics for kids / Fun is something you make: 11 times for family road trips / Come and Explore: Mr. Morose goes to the doctor / Rachel VanEgmond is exploring God's General Revelation / 642 Canadian babies were born alive and left to die / 90 pro-life MPs elected to parliament / Ontario shows why euthanasia "safeguards" can't work / RP's coming to a church near you



News

Canada’s population almost shrinking

The latest population estimate from Statistics Canada reveals a startling change: Ontario, Quebec, and BC all saw population declines in the first quarter of 2025.

The country as a whole grew by only 20,107 people, which, as a percentage, amounted to a 0.0% increase – the second-slowest growth rate in Canada since records began in 1946. The only slower quarter was the third quarter of 2020, when border restrictions during the Covid-19 pandemic prevented immigration. The slowdown has been attributed to the federal government's 2024 announcements that it would decrease temporary and permanent immigration levels, with a target of 436,000 for this year – still well above the 250,000 level from before the Liberal government took office in 2015.

So, in the first quarter of 2025 we lost 17,410 people via emigration to other countries, and there was also a drop of 61,111 in non-permanent residents – people on temporary work or student visas, along with their families. The data also shows that there were 5,628 more deaths than births in the first quarter, largely due to Canada's quickly declining fertility rate. That's a collective loss of 84,149 people.

Then, going in the other direction, we had 104,256 people immigrate to Canada, for that small net increase of 20,107.
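Putting the figures together: 17,410 emigrants + 61,111 fewer non-permanent residents + 5,628 more deaths than births = 84,149 people lost, offset by 104,256 new immigrants, for that net gain of 20,107.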

While it is a blessing that people from other countries are still willing and able to move to Canada, it is sobering to note that two-thirds of the world's population now lives in countries with fertility below the replacement rate, and the world's population is projected to start declining later this century.

God’s first command to humanity was to “be fruitful and multiply, and fill the earth and subdue it” (Genesis 1:28). Imagine what the world could look like in a few generations if Christians fulfilled this cultural mandate with enthusiasm while the rest of the world continued on its course.


Today's Devotional

July 5 - Fellowship in action

“He who says he abides in Him ought himself also to walk just as He walked.” - 1 John 2:6 

Scripture reading: 1 John 2:3-6; John 14:19-24

When the Bible speaks about knowledge, there is a difference between knowing something in theory and actually knowing something intimately. The key difference between the two is that something known intimately will actually change the way a…

Today's Manna Podcast


The Law of God

Serving #894 of Manna, prepared by B. Tiggelaar, is called "The Law of God" and is based on Matthew 5:17-26; 38-48.

News

Ben Shapiro, Daily Wire, launch a new kids’ TV network

What kind of TV shows or videos do you want your children or grandchildren to watch? What choices are off limits because of objectionable content, or worldviews antithetical to the Christian life? Thankfully, we now have one more option to choose from.

In 2022, The Daily Wire announced that it would compete with Disney and other studios by launching a kids' entertainment division. If you aren't acquainted with The Daily Wire, you may still have seen YouTube clips from some of their commentators, including Jordan Peterson, Roman Catholics Michael Knowles and Matt Walsh (who made the What is a Woman? documentary), and Jews Ben Shapiro and Dennis Prager. Their entertainment service has now gone live under the brand name "Bentkey," with a handful of original series aimed at kids, and around a dozen series produced by others but vetted by the brand as "safe" viewing for children.

Parents and professional reviewers have given thumbs up to the new service, with even left-leaning commenters seeming to appreciate the lack of an agenda in the streaming shows. Christian parents will still need discernment to judge if Bentkey is acceptable for their family's viewing, but may appreciate this additional choice. Bentkey is available by subscription only, either as part of a membership at DailyWire.com or as a standalone product for $99 U.S. annually at Bentkey.com. (Currently, Canadians can access the kids' network only through a web browser, while those in the U.S. are able to download an app to their Smart TV or tablet.)

And if you're looking for more entertainment options, last year RP published a whole issue on "Movies that King David might watch," full of 200+ recommendations of worthwhile films for the Christian family. Find the issue here, and the article at ReformedPerspective.ca/200.

Picture by Wirestock Creators/Shutterstock...





News

St. Catharines drops censorship bylaw

Just three weeks before having to appear in court to defend their bylaw that censored pictures of pre-born children, the city of St. Catharines blinked and backed down. ARPA Canada took the city to court in response to a bylaw that forbade delivering any image of a fetus to a private residence unless the material was placed in a sealed envelope with a warning label attached to it. ARPA argued that this bylaw infringed the Charter-protected freedoms of conscience, religion, and expression and was crafted to suppress pro-life content. As the court date drew close, the St. Catharines Standard reported that "councillors repealed the bylaw Monday night after an in-camera meeting with the city's solicitor."

The turn-about is a good example of the importance of legal action, and the judicial branch of government more generally, as a check against the overreach of government agents using their power to suppress justice and truth. Although the federal and provincial legislatures tend to get the most attention, it is the cities and towns (the municipal level) that most commonly violate the fundamental freedoms protected in the Charter.

The newspaper quoted extensively from ARPA's lawyer John Sikkema, who led the challenge. "The real aim of the bylaw was to suppress opposition to abortion," he explained to the paper. "Suppressing pro-life speech because some people find it offensive is not a pressing or substantial objective, as the Charter of Rights and Freedoms requires. Rather, in a free and democratic society, [it] is an odious objective." In a separate note to supporters, ARPA explained that a further outcome of this is that "other cities considering similar bylaws will be much less eager to pursue them."...

News

One step forward, two steps back in Online Harms bill

What do pornography and hate speech have in common? Well, the federal government says they are both harmful. That's why they've wrapped these issues up together in their recently announced Online Harms Act, otherwise known as Bill C-63. As the government's news release stated, "Online harms have real world impact with tragic, even fatal, consequences." As such, the government is of the mind that the responsibility for regulating all sorts of online harm falls to them. But the approach of the government in Bill C-63, though it contains some good content, is inadequate.

BACKGROUND

In June 2021, the federal government introduced hate speech legislation focused on hate propaganda, hate crime, and hate speech. The bill was widely criticized, including in ARPA Canada's analysis, and failed to advance prior to the fall 2021 election. Nonetheless, the Liberal party campaigned in part on a promise to bring forward similar legislation within 100 days of re-election. Over two years have passed since the last federal election. In the meantime, the government pursued a consultation and an expert panel on the topic of online harms. Based on these and feedback from stakeholders, the government has now tabled legislation combatting online harm more broadly.

Bill C-63 defines seven types of "harmful content":

a) intimate content communicated without consent;
b) content that sexually victimizes a child or revictimizes a survivor;
c) content that induces a child to harm themselves;
d) content used to bully a child;
e) content that foments hatred;
f) content that incites violence; and
g) content that incites violent extremism or terrorism.

The hate speech elements of Bill C-63 are problematic for Canadians' freedom of expression. We will address those further on. But though the bill could be improved, it is a step in the right direction on the issue of child sexual exploitation.

DIGITAL SAFETY OVERSIGHT

If passed, part 1 of the Online Harms Act will create a new Digital Safety Commission to help develop online safety standards, promote online safety, and administer and enforce the Online Harms Act. A Digital Safety Ombudsperson will also be appointed to advocate for and support online users. The Commission will hold online providers accountable and, along with the Ombudsperson, provide an avenue for victims of online harm to bring forward complaints. Finally, a Digital Safety Office will be established to support the Commission and Ombudsperson.

The Commission and Ombudsperson will have a mandate to address any of the seven categories of harm listed above. But their primary focus, according to the bill, will be "content that sexually victimizes a child or revictimizes a survivor" and "intimate content communicated without consent." Users can submit complaints or make other submissions about harmful content online, and the Commission is given power to investigate and issue compliance orders where necessary. Social media services are the primary target of the Online Harms Act.
The Act defines "social media service" as "a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content." Further clarification is provided to include:

– an adult content service, namely a social media service that is focused on enabling its users to access and share pornographic content; and
– a live streaming service, namely a social media service that is focused on enabling its users to access and share content by live stream.

Oversight will be based on the size of a social media service, including the number of users. So, at the very least, the Digital Safety Commission will regulate online harm not only on major social media sites including Facebook, X, and Instagram, but also on pornography sites and live streaming services. Some specifics are provided in Bill C-63, but the bill would grant the government broad powers to enact regulations to supplement the Act. The bill itself is unclear regarding the extent to which the Commission will address online harm besides pornography, such as hate speech. What we do know is that the Digital Safety Commission and Ombudsperson will oversee the removal of "online harms" but will not punish individuals who post or share harmful content.

DUTIES OF OPERATORS

Bill C-63 lays out three overarching duties that apply to any operator of a regulated social media service – for example, Facebook or Pornhub.

1. Duty to act responsibly

The duty to act responsibly includes: mitigating risks of exposure to harmful content, implementing tools that allow users to flag harmful content, designating an employee as a resource for users of the service, and ensuring that a digital safety plan is prepared. This duty relates to harmful content broadly. Although each category of "harmful content" is defined further in the Act, the operator is responsible to determine whether the content is harmful. While it's important for the Commission to remove illegal pornography, challenges may arise with the Commission seeking to remove speech that a user has flagged as harmful.

2. Duty to protect children

The meaning of the duty to protect children is not clearly defined. The bill notes that "an operator must integrate into a regulated service that it operates any design features respecting the protection of children, such as age-appropriate design, that are provided for by regulations." This could refer to age-appropriate design in the sense that children are not drawn into harmful content; it could refer to warning labels on pornography sites; or it could potentially require some level of age-verification for children to access harmful content. These regulations, however, will be established by the Commission following the passage of the Online Harms Act.

The Liberal government says that its Online Harms Act makes Bill S-210 unnecessary. Bill S-210 would require age-verification for access to online pornography. In its current form, however, the Online Harms Act does nothing to directly restrict minors' access to pornography. It would allow minors to flag content as harmful and requires "age-appropriate design" but would not require pornography sites to refuse access to youth. As such, ARPA will continue to advocate for the passage of Bill S-210 to restrict access to pornography and hold pornography sites accountable.
3. Duty to make certain content inaccessible

Finally, Bill C-63 will make social media companies responsible for making certain content inaccessible on their platforms. This section is primarily focused on content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent. ARPA has lauded provincial efforts in British Columbia and Manitoba to crack down on such content in the past year. If such content is flagged on a site and deemed to be harmful, the operators must make it inaccessible within 24 hours and keep it inaccessible.

In 2020, Pornhub was credibly accused of hosting videos featuring minors. Additionally, many women noted that they had requested Pornhub to remove non-consensual videos of themselves and that Pornhub had failed to do so. At the time, ARPA Canada submitted a brief to the Committee studying sexual exploitation on Pornhub. Our first recommendation was that pornography platforms must be required to verify age and consent before uploading content. Second, we recommended that victims must have means for immediate legal recourse to have content removed from the internet.

This duty to make content inaccessible will provide some recourse for victims to flag content and have it removed quickly. Further, the Commission will provide accountability to ensure the removal of certain content and that it remains inaccessible. The Act creates a new bureaucratic agency for this purpose rather than holding companies accountable through the Criminal Code. The Criminal Code is arguably a stronger deterrent. For example, Bill C-270, scheduled for second reading in the House of Commons in April 2024, would make it a criminal offence to create or distribute pornographic material without first confirming that any person depicted was over 18 years of age and gave express consent to the content. Bill C-270 would amend the Criminal Code to further protect vulnerable people. Instead of criminal penalties, the Online Harms Act would institute financial penalties for failure to comply with the legislation. Of course, given the sheer volume of online traffic and social media content and the procedural demands of enforcing criminal laws, a strong argument can be made that criminal prohibitions alone are insufficient to deal with the problem. But if new government agencies with oversight powers are to be established, it's crucial that the limits of their powers are clearly and carefully defined and that they are held accountable to them.

THE GOOD NEWS…

This first part of the Online Harms Act contains some important attempts to combat online pornography and child sexual exploitation. As Reformed Christians, we understand that a lot of people are using online platforms to promote things that are a direct violation of God's intention for flourishing in human relationships. This bill certainly doesn't correct all those wrongs, but it at least recognizes that there is improvement needed for how these platforms are used to ensure vulnerable Canadians are protected. Most Canadians support requiring social media companies to remove child pornography or non-consensual pornography. In a largely unregulated internet, many Canadians also support holding social media companies accountable for such content, especially companies that profit from pornography and sexual exploitation. Bill C-63 is the government's attempt to bring some regulation to this area.
… AND NOW THE BAD NEWS

But while some of the problems addressed through the bill are objectively harmful, how do we avoid subjective definitions of harm? Bill C-63 raises serious questions about freedom of expression. Free speech is foundational to democracy. In Canada, it is one of our fundamental freedoms under section 2 of the Charter. Attempts to curtail speech in any way are often seen as an assault on liberty. Bill C-63 would amend the Criminal Code and the Canadian Human Rights Act to combat hate speech online. But the bill gives too much discretion to government actors to decide what constitutes hate speech.

HARSHER PENALTIES FOR "HATE SPEECH" CRIMES

The Criminal Code has several offences that fall under the colloquial term "hate speech." The Code prohibits advocating genocide, publicly inciting hatred that is likely to lead to a breach of the peace, or willfully promoting hatred or antisemitism. The latter offence is potentially broader, but it also provides several defenses, including:

– the statement was true;
– the statement was a good faith attempt to argue a religious view;
– the statement was about an important public issue meriting discussion, and the person reasonably believed the statement was true.

Bill C-63 would increase the maximum penalties for advocating genocide and inciting or promoting hatred or antisemitism. The maximum penalty for advocating genocide would increase to life in prison instead of five years. The bill would also raise the penalty for publicly inciting hatred or promoting hatred or antisemitism to five years instead of the current two.

Bill C-63 defines "hatred" as "the emotion that involves detestation or vilification and that is stronger than disdain or dislike." It also clarifies that a statement does not incite or promote hatred "solely because it discredits, humiliates, hurts or offends." This clarification is better than nothing, but it inevitably relies on judges to determine the line between statements that are merely offensive or humiliating and those that generate emotions of vilification and detestation.

ARPA Canada recently intervened in a criminal hate speech case involving Bill Whatcott. Whatcott was charged with criminal hate speech for handing out flyers at a pride parade warning about the health risks of engaging in homosexual relations. Prosecutors argued that Whatcott was promoting hatred against an identifiable group by condemning homosexual conduct. This is an example of a person being accused of hate speech for expressing his beliefs – not only for his manner of expressing those beliefs, but also for the content of his beliefs.

NEW STAND-ALONE HATE CRIME OFFENCE

The Criminal Code already makes hatred a factor in sentencing. So, for example, if you assault someone and there is conclusive evidence that your assault was motivated by racial hatred, that "aggravating factor" will likely mean a harsher sentence for you. But the offence is still assault, and the maximum penalties for assault still apply. Bill C-63, however, would add a new hate crime offence – any offence motivated by hatred – to the Criminal Code, and it may be punishable by life in prison. It would mean that any crime found to be motivated by hatred would count as two crimes.

Consider an act of vandalism, for example. The crime of mischief (which includes damaging property) has a maximum penalty of 10 years. But, if you damaged property because of hatred toward a group defined by race, religion, or sexuality, you could face an additional criminal charge and potentially life in prison.
ANTICIPATORY HATE CRIMES?

Bill C-63 would permit a person to bring evidence before a court based on fear that someone will commit hate speech or a hate crime in the future. The court may then order the accused to "keep the peace and be of good behavior" for up to 12 months and subject that person to conditions including wearing an electronic monitoring device, curfews, house arrest, or abstaining from consuming drugs or alcohol. There are other circumstances in which people can go to court for fear that a crime will be committed – for example, if you have reason to believe that someone will damage your property, or cause you injury, or commit terrorism. However, challenges with unclear or subjective definitions of hatred will only be accentuated when determining if someone will commit hate speech or a hate crime.

BRINGING BACK SECTION 13

This is not the first time the government has tried to regulate hate speech. The former section 13 of the Canadian Human Rights Act prohibited online communications that were "likely to expose a person or persons to hatred or contempt" on the basis of their race, religion, sexuality, etc. As noted by Joseph Brean in the National Post, section 13 was passed in 1977, mainly in response to telephone hotlines that played racist messages. From there, the restrictions around hate speech were extended to the internet (telecommunications, including the internet, falls under federal jurisdiction) until Parliament repealed section 13 in 2013. Brean writes that section 13 "was basically only ever used by one complainant, a lawyer named Richard Warman, who targeted white supremacists and neo-Nazis and never lost." In fact, Warman brought forward 16 hate speech cases and won them all.

A catalyst for the controversy over human rights hate speech provisions was a case involving journalist Ezra Levant. Levant faced a human rights complaint for publishing Danish cartoons of Muhammad in 2006. In response to being charged, Levant published a video of an interview with an investigator from the Alberta Human Rights Commission. Then in 2007, a complaint was brought against Maclean's magazine for publishing an article by Mark Steyn that was critical of Islam. Such stories brought section 13 to public attention and revealed how human rights law was being used to quash officially disapproved political views.

Bill C-63 would bring back a slightly revised section 13. The new section 13 states: "It is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination." A few exceptions apply. For example, this section would not apply to private communication or to social media services that are simply hosting content posted and shared by users. So, for example, if someone wanted to bring a complaint about an ARPA post on Facebook, that complaint could be brought against ARPA, but not against Facebook.

If a person is found guilty of hate speech, the Human Rights Tribunal may order the offender to pay up to $20,000 to the victim, and up to $50,000 to the government. This possibility of financial benefit incentivizes people to bring forward hate speech complaints. British Columbia has a similar hate speech provision in its Human Rights Code.
ARPA wrote about how that provision was interpreted and enforced to punish someone for saying that a "trans woman" is really a man. The Tribunal condemned a flyer in that case for "communicat[ing] rejection of diversity in the individual self-fulfillment of living in accordance with one's own gender identity." The Tribunal went on to reject the argument that the flyer was not intended to promote hatred or discrimination, "but only to 'bring attention to what [he] views as immoral behaviour, based on his religious belief as a Christian'." Ultimately, the Tribunal argued that there was no difference between promoting hatred and bringing attention to what the defendant viewed as immoral behaviour.

NO DEFENSES FOR CHRISTIANS?

As noted above, when it comes to the Criminal Code's hate speech offences, there are several defenses available (truth, expressing a religious belief, and advancing public debate). These are important defenses that allow Canadians to say what they believe to be true and to express sincere religious beliefs. But the Canadian Human Rights Act offers no defenses. And complaints of hate speech in human rights law are far easier to bring and to prosecute than criminal charges. Criminal law requires proof beyond reasonable doubt. But under the Human Rights Act, statements that are likely (i.e., a 51% chance, in the Tribunal's view) to cause detestation or vilification will be punishable. So, hate speech would be regulated in two different places, the Criminal Code and the Human Rights Act, the latter offering fewer procedural rights and a lower standard of proof.

Bill C-63 clarifies that a statement is not detestation or vilification "solely because it expresses disdain or dislike or it discredits, humiliates, hurts or offends." But again, the line between dislike and detestation is unclear. Human rights complaints are commonly submitted because of humiliation or offence, rather than any clear connection to detestation or vilification. Section 13 leaves too much room for subjective and ideologically motivated interpretations of what constitutes hate speech. The ideological bias that often manifests is a critical theory lens, which sees "privileged" groups like Christians as capable only of being oppressors/haters, while others are seen as "equity-seeking" groups.

For example, in a 2003 case called Johnson v. Music World Ltd., a complaint was made against the writer of a song called "Kill the Christian." A sample:

Armies of darkness unite
Destroy their temples and churches with fire
Where in this world will you hide
Sentenced to death, the anointment of christ
Put you out of your misery
The death of prediction
Kill the christian
Kill the christian…dead!

The Tribunal noted that the content and tone appeared to be hateful. However, because the Tribunal thought Christians were not a vulnerable group, it decided this was not hate speech. By contrast, in a 2008 case called Lund v. Boissoin, a panel deemed a letter to the editor of a newspaper that was critical of homosexuality to be hate speech. The chair of the panel was the same person in both Johnson and Lund.

Hate speech provisions are potentially problematic for Christians who seek to speak truth about various issues in our society. Think about conversion therapy laws that ban talking about biblical gender and sexuality in some settings, or bubble zone laws that prevent pro-life expression in designated areas. But beyond that, freedom of speech is also important for those with whom we may disagree.
It is important to be able to have public dialogue on various public issues.

GOVERNMENT'S ROLE IN REGULATING SPEECH

This all raises serious questions about whether the government should be regulating "hate speech" at all. After all, hate speech provisions in the Human Rights Act or the Criminal Code have led and could lead to inappropriate censorship. But government also has a legitimate role to play in protecting citizens from harm.

1. Reputational harm and safety from threats of violence

Arguably the government's role in protecting citizens from harm includes reputational harm. Imagine someone was spreading accusations in your town that everyone in your church practices child abuse, for example. That is an attack on your reputation as a group and as individual members of the group – which is damaging and could lead to other harms, possibly even violence. Speech can do real damage. But Jeremy Waldron, a prominent legal philosopher and a Christian, suggests that the best way to think about and enforce "hate speech" laws is as a prohibition on defaming or libeling a group, similar to how our law has long punished defaming or libeling an individual. Such a conception may help to rein in the scope of what we call "hate speech," placing the focus on demonstrably false and damaging accusations, rather than on controversial points of view on matters relating to religion or sexuality, for example.

Hatred is a sin against the 6th commandment, but the government cannot regulate or criminalize emotions per se, or expressions of them, except insofar as they are expressed in and through criminal acts or by encouraging others to commit criminal acts. That's why we rightly have provisions against advocating or inciting terrorism or genocide, or counseling or encouraging someone to commit assault, murder, or any other crime. When the law fails to set an objective standard, however, it is open to abuse – for example, by finding a biblical view of gender and sexuality to constitute hate speech. Regrettably, Bill C-63 opens up more room for subjectivity and ideologically based restrictions on speech. It does nothing to address the troubling interpretations of "hate speech" that we've seen in many cases in the past. And, by putting hate speech back into the Human Rights Act, the bill makes many more such abuses possible. We suspect it will result in restricting speech that is culturally unacceptable rather than objectively harmful.

2. Harm of pornography

As discussed earlier, Bill C-63 does introduce some good restrictions when it comes to online pornography. In our view, laws restricting pornography are categorically different from laws restricting "hate speech," because the former are not designed to, or in danger of, being applied to censor beliefs, opinions, or arguments. Restricting illegal pornography prevents objectively demonstrable harm. Pornography takes acts that ought to express love and marital union and displays them for consumption and the gratification of others. Much of it depicts degrading or violent behaviour. Pornography's harms, especially to children, are well documented. The argument is often made that pornography laws risk censoring artistic expression involving sexuality or nudity. But Canada is very far, both culturally and legally, from censoring art for that reason – and Bill C-63 wouldn't do so. Its objectives as they relate to pornography are mainly to reduce the amount of child pornography and non-consensual pornography easily available online.
CONCLUSION

While the Online Harms Act contains some good elements aimed at combatting online pornography, its proposed hate speech provisions are worrisome. Unfortunately, the federal government chose to deal with both issues in one piece of legislation – this should have been two separate bills. As Bill C-63 begins to progress through the House of Commons, we can continue to support Bills S-210 and C-270, private members' bills which combat the online harms of pornography. Meanwhile, head to ARPACanada.org for action items related to the Online Harms Act.