It’s the final day of Collision Conference, and we’re sad to see it end. We’ve seen some of the best and brightest in the realms of privacy and security, and it’s been a fantastic learning opportunity.
As the proverbial icing on the cake, today saw some outstanding conversations from the likes of Chelsea Clinton, Brittany Kaiser, Katie Moussouris, and Rebecca Parsons across a range of topics: from tackling misinformation and censorship, to data dividends and responsible tech. The first talk that really got us riled up today was Clinton’s discussion of censorship and misinformation. In it, she addressed how social media is playing a vital role in the spread of information (both true and false) regarding both the pandemic and the vaccines. She explained her concerns about the self-accountability of platforms such as Facebook and Twitter, stating:
I would hope that the social media companies would take more responsibility for the role that they’re playing as a public health landscape now in our lives, as well as clearly a marketplace for information. I also think that if they’re not willing to do that, we have to put pressure on their major ad buyers, even take a page out of the ‘Stop Hate for Profit’ campaign last summer (…) in which major civil rights organizations in the United States said that it was unacceptable that you could just monetize hate.
Clinton’s sentiments ring true: we need to hold such platforms accountable for the role they play in weaving the fabric of society, for better or worse. On top of this, the algorithms used by those platforms and their advertising partners must be rigorously scrutinized, as they can generate echo chambers online. Such environments thrive when advertisers show people the products and information they want to see (regardless of truth) in the hopes of making a quick buck, rather than what they need to see (i.e. science-led information and research).
However, we need to ensure that the censorship of factually incorrect information does not impede individuals’ freedom of speech. Among the fundamental principles of democracy are our ability to disagree with one another and an inalienable need for discourse. Legislators must tread carefully as they set regulations for platforms, and must clearly define the consequences of knowingly sharing misinformation online.
Our key takeaways from the conference:
A lot has gone on over the past three days, but a few themes kept cropping up…
We need more privacy
It was the thought on everyone’s lips: people’s information has become a commodity, traded and thrown around with total disregard for the individual. Moreover, everything is gaining “smart” capabilities, creating an interconnected network of devices that, while immensely convenient, presents countless weak points and targets for hackers across the globe. As Alex Stamos rightfully said: “make applications dumb again”!
We need more regulation
Not enough is being done to protect people’s digital privacy and data, and although we’re seeing significant strides made across Europe, these decisions are often uninformed and ambiguous – leaving plenty of wiggle room for companies to work around the legislation. As Katherine Maher and Kaiser noted yesterday, legislators need to be better informed when drafting legislation, so as to produce more rigorous and comprehensive regulations that protect our digital freedoms without sacrificing our privacy.
We need more enforcement
As Katherine Maher discussed yesterday, “regulation without enforcement is nothing”, and we completely agree. It’s all well and good setting the rules, but if no one is following them and there are no consequences for breaking them, then companies will continue to exploit our data at every opportunity.
Collision had an enormous impact on us: a truly inspirational series of talks that we couldn’t be more thankful for. Hopefully, we can attend next year in person!