Social media platforms must abandon algorithmic secrecy


More transparency is required about the algorithms that wield enormous power over billions of people

Algorithmic advances have spurred quantum leaps in myriad human endeavours — including medical devices, models for climate change, financial trading systems, GPS mapping, even online dating.

But while these algorithmic forces impact our daily lives in beneficial ways, they are often inaccessible and mysterious to the average citizen.

In driving our social media platforms, feeding us daily news, family updates and friend suggestions, algorithms have held us rapt for some time. Yet most of us have neither the training nor the faculties to understand how these systems impact us, and the protocols that govern them.

By Frederick Mostert and Alex Urbelis

We must take the word of others about how these arcane systems work and of what they are composed.

The testimony of technology executives called before the UK parliament or US congress to explain algorithmic data processes, or account for data breaches, tells us little about how their algorithms really operate.

And the algorithms animating our social media news feeds are often protected as trade secrets, and not found on a publicly accessible registry such as the US or UK Patent Office.

"Patents work on the basis of sufficient disclosure of an inventor's scientific innovation to the benefit of society," says Tanya Aplin, our colleague at King's College. "Trade secrets, on the other hand, keep the knowhow of formulas and technical developments confidential."

This balance between full disclosure and secrecy sits at the heart of the debate around the use of algorithmic forces.

Critically, these algorithmic systems have no form of community review. Also, an epistemological conundrum compounds the problem: we do not know who knows how these algorithms work.

Welton Chang, chief technology officer at Human Rights First, an advocacy group, says: "Within the labyrinthine structures of social media companies, it is doubtful that there is a department or team with full visibility of a platform's secretive black box of algorithms."

Algorithmic, robotic content has, in large part, assisted and powered election interference, fomented domestic rebellion and facilitated extremism online.

Absolute and unchecked, platforms enabled by algorithms wield enormous power over billions of citizens worldwide.

Frank Pasquale, writing about secretive algorithms in his book The Black Box Society, says: "However savvy absolute secrecy may be as a business strategy, it is doubtful that public policy should be encouraging it."

Protecting algorithmic "secret sauces" via trade secret law has become de rigueur over the past few years. Ironically, the antithesis of this approach — the open-source algorithm — may not only be these algorithms' saving grace, but one possible antidote against secrecy.

By disclosing their formulas for the benefit of society, open-source algorithms allow a cross section of professionals to examine the fundamental principles at play.


Security researchers can determine whether our personal data were put at risk during algorithmic processing. Human rights organisations can help to avoid infringement of our fundamental freedoms. Academics can dig into these systems for bias.

But, until we have some basic understanding of how social media algorithms use our personal data, platforms will always be able to resist accountability and efforts at regulation will be too imprecise to make an impact.

"Users have the right to know what inputs are being made both into the algorithms that choose their content and those used to moderate their content," says Jillian York, author of Silicon Values.

While the underlying algorithms at play in apps remain opaque and inaccessible, a new step in this direction is Apple's App Tracking Transparency programme. This feature returns some measure of control over personal data to users, who can prevent tracking across third-party apps and websites.

Full disclosure and transparency, as opposed to secrecy, form the foundations of liberal democracies. With platforms inextricably linked to our political and democratic processes, it is time to abandon secrecy and mystery in favour of transparency.

Social media platform users must be able to come to their own conclusions about the place of the digital algorithm in their lives.

Open, transparent, fair and accountable algorithmic decision-making processes should form the linchpin of operating principles set for and by platforms and policymakers.

Frederick Mostert is a professor of practice in intellectual property law at King's College, London and a member of the Digital Scholarship Institute.

Alex Urbelis is a partner at the Blackstone Law Group LLP and a member of Human Rights First's Technology Advisory Board.

Copyright notice: The copyright of this article belongs to FT中文網. Without permission, no organisation or individual may reprint, copy or otherwise use all or part of this article; infringement will be pursued.
