
HBA students learn about Facebook strategies to stop misinformation

Oct 3, 2018


With new initiatives and a fact-checking partnership, Facebook is getting tough on misinformation and protecting election integrity.

“When it comes to election integrity, we take our responsibility seriously,” said Kevin Chan, Global Director and Head of Public Policy at Facebook Canada. “And that’s why we are devoting significant time, energy and resources to this issue. In the lead-up to the U.S. presidential election in 2016, we were slow to identify the new risks and bad actors, and we were slow to act. We are determined to make this right.”

Chan, HBA ’00, spoke to HBA2 students as part of the Corporations and Society class. He was invited by Ivey Assistant Professor Diane-Laure Arjaliés, the coordinator and one of the instructors of the course. The social media platform, which has approximately 24 million Canadian accounts, is already preparing for the federal election in 2019.

Election integrity

Facebook Canada’s five-part Canadian Election Integrity Initiative is based on a report released by the Communications Security Establishment (CSE) that outlined the potential cyber threats to the next election. The initiative includes a two-year partnership between Facebook Canada and MediaSmarts to promote digital and news literacy, a cyber hygiene guide for Canadian political parties and politicians, and a cyber hygiene training program open to all Canadian federal political parties.

Facebook also launched a third-party fact-checking partnership with Agence France-Presse, engaging Canadian fact-checkers to review news stories on Facebook in French and English and rate their accuracy.

Chan also walked students through the broader public policy debates surrounding Facebook and democracy.

Facebook and filter bubbles

The filter bubble argument holds that, because of Facebook’s News Feed algorithm, individuals see only content that reinforces their pre-existing world views, leading to polarization and societal division. Chan pointed to research showing that approximately 23 per cent of the content in a person’s News Feed comes from a differing point of view.

Legal liability

Online platforms have been likened to digital newspapers, and there are calls to make them legally liable for the content they host, with vetting by editors in the same way as traditional media. This is not something Facebook believes is appropriate, Chan said.

“From the beginning, Facebook has been about giving people voice, and we think people should be allowed to communicate and share information freely with each other without being reviewed and approved by Facebook or some other intermediary, and I think most people would agree with that.”

Beyond the reach of rules and laws

Many argue that Internet platforms are beyond the reach of rules and laws. But Chan said Facebook has a responsibility for what appears on the platform. “That’s why we have strict rules in place, and why we adhere to local laws.”

Government regulations

Facebook’s ad transparency initiative is one example of how the company is working to comply with upcoming legislation in Canada, which would prohibit organizations selling advertising space from knowingly accepting election advertisements from foreign individuals.

Are things better? Or worse?

“The battle against misinformation online is obviously much larger than Facebook, and I can only comment on what we are doing on our platform,” Chan said. “I think it is too early for anyone to be definitive, but I am cautiously optimistic that we are moving in the right direction.”