James Bridle: The Nightmare Videos of Children's YouTube—and What's Wrong with the Internet Today

I'm James. I'm a writer and artist, and I make work about technology. I do things like draw life-size outlines of military drones in city streets around the world, so that people can start to think and get their heads around these really quite hard-to-see and hard-to-think-about technologies. I make things like neural networks that predict the results of elections based on weather reports, because I'm intrigued about what the actual possibilities of these weird new technologies are. Last year, I built my own self-driving car. But because I don't really trust technology, I also designed a trap for it.
And I do these things mostly because I find them completely fascinating, but also because I think when we talk about technology, we're largely talking about ourselves and the way that we understand the world. So here's a story about technology.

This is a "surprise egg" video. It's basically a video of someone opening up loads of chocolate eggs and showing the toys inside to the viewer. That's it. That's all it does for seven long minutes. And I want you to notice two things about this. First of all, this video has 30 million views.

And the other thing is, it comes from a channel that has 6.3 million subscribers, that has a total of eight billion views, and it's all just more videos like this—30 million people watching a guy opening up these eggs. It sounds pretty weird, but if you search for "surprise eggs" on YouTube, it'll tell you there's 10 million of these videos, and I think that's an undercount. I think there's way, way more of these. If you keep searching, they're endless. There's millions and millions of these videos in increasingly baroque combinations of brands and materials, and there's more and more of them being uploaded every single day. Like, this is a strange world. Right?

But the thing is, it's not adults who are watching these videos. It's kids, small children. These videos are like crack for little kids. There's something about the repetition, the constant little dopamine hit of the reveal, that completely hooks them in. And little kids watch these videos over and over and over again, and they do it for hours and hours and hours. And if you try and take the screen away from them, they'll scream and scream and scream. If you don't believe me—and I've already seen people in the audience nodding—if you don't believe me, find someone with small children and ask them, and they'll know about the surprise egg videos. So this is where we start. It's 2018, and someone, or lots of people, are using the same mechanism that, like, Facebook and Instagram are using to get you to keep checking that app, and they're using it on YouTube to hack the brains of very small children in return for advertising revenue.

At least, I hope that's what they're doing. I hope that's what they're doing it for, because there's easier ways of making ad revenue on YouTube. You can just make stuff up or steal stuff. So if you search for really popular kids' cartoons like "Peppa Pig" or "Paw Patrol," you'll find there's millions and millions of these online as well. Of course, most of them aren't posted by the original content creators. They come from loads and loads of different random accounts, and it's impossible to know who's posting them or what their motives might be. Does that sound kind of familiar? Because it's exactly the same mechanism that's happening across most of our digital services, where it's impossible to know where this information is coming from. It's basically fake news for kids, and we're training them from birth to click on the very first link that comes along, regardless of what the source is. That doesn't seem like a terribly good idea.

Here's another thing that's really big on kids' YouTube. This is called the "Finger Family Song." I just heard someone groan in the audience. This is the "Finger Family Song." This is the very first one I could find. It's from 2007, and it only has 200,000 views, which is, like, nothing in this game. But it has this insanely earwormy tune, which I'm not going to play to you, because it will sear itself into your brain in the same way that it seared itself into mine, and I'm not going to do that to you. But like the surprise eggs, it's got inside kids' heads and addicted them to it. So within a few years, these finger family videos start appearing everywhere, and you get versions in different languages with popular kids' cartoons using food or, frankly, using whatever kind of animation elements you seem to have lying around. And once again, there are millions and millions and millions of these videos available online in all of these kind of insane combinations. And the more time you start to spend with them, the crazier and crazier you start to feel that you might be.
And that's where I kind of launched into this, that feeling of deep strangeness and deep lack of understanding of how this thing was constructed that seems to be presented around me. Because it's impossible to know where these things are coming from. Like, who is making them? Some of them appear to be made by teams of professional animators. Some of them are just randomly assembled by software. Some of them are quite wholesome-looking young kids' entertainers. And some of them are from people who really clearly shouldn't be around children at all.

And once again, this impossibility of figuring out who's making this stuff—like, this is a bot? Is this a person? Is this a troll? What does it mean that we can't tell the difference between these things anymore? And again, doesn't that uncertainty feel kind of familiar right now?

So the main way people get views on their videos—and remember, views mean money—is that they stuff the titles of these videos with these popular terms. So you take, like, "surprise eggs" and then you add "Paw Patrol," "Easter egg," or whatever these things are, all of these words from other popular videos into your title, until you end up with this kind of meaningless mash of language that doesn't make sense to humans at all. Because of course it's only really tiny kids who are watching your video, and what the hell do they know? Your real audience for this stuff is software. It's the algorithms. It's the software that YouTube uses to select which videos are like other videos, to make them popular, to make them recommended. And that's why you end up with this kind of completely meaningless mash, both of title and of content.

But the thing is, you have to remember, there really are still people within this algorithmically optimized system, people who are kind of increasingly forced to act out these increasingly bizarre combinations of words, like a desperate improvisation artist responding to the combined screams of a million toddlers at once. There are real people trapped within these systems, and that's the other deeply strange thing about this algorithmically driven culture, because even if you're human, you have to end up behaving like a machine just to survive.

And also, on the other side of the screen, there still are these little kids watching this stuff, stuck, their full attention grabbed by these weird mechanisms. And most of these kids are too small to even use a website. They're just kind of hammering on the screen with their little hands. And so there's autoplay, where it just keeps playing these videos over and over and over in a loop, endlessly for hours and hours at a time. And there's so much weirdness in the system now that autoplay takes you to some pretty strange places. This is how, within a dozen steps, you can go from a cute video of a counting train to masturbating Mickey Mouse. Yeah. I'm sorry about that. This does get worse. This is what happens when all of these different keywords, all these different pieces of attention, this desperate generation of content, all comes together into a single place. This is where all those deeply weird keywords come home to roost. You cross-breed the finger family video with some live-action superhero stuff, you add in some weird, trollish in-jokes or something, and suddenly, you come to a very weird place indeed.

The stuff that tends to upset parents is the stuff that has kind of violent or sexual content, right? Children's cartoons getting assaulted, getting killed, weird pranks that actually genuinely terrify children. What you have is software pulling in all of these different influences to automatically generate kids' worst nightmares. And this stuff really, really does affect small children. Parents report their children being traumatized, becoming afraid of the dark, becoming afraid of their favorite cartoon characters. If you take one thing away from this, it's that if you have small children, keep them the hell away from YouTube.

But the other thing, the thing that really gets to me about this, is that I'm not sure we even really understand how we got to this point. We've taken all of this influence, all of these things, and munged them together in a way that no one really intended. And yet, this is also the way that we're building the entire world. We're taking all of this data, a lot of it bad data, a lot of historical data full of prejudice, full of all of our worst impulses of history, and we're building that into huge data sets and then we're automating it. And we're munging it together into things like credit reports, into insurance premiums, into things like predictive policing systems, into sentencing guidelines. This is the way we're actually constructing the world today out of this data. And I don't know what's worse, that we built a system that seems to be entirely optimized for the absolute worst aspects of human behavior, or that we seem to have done it by accident, without even realizing that we were doing it, because we didn't really understand the systems that we were building, and we didn't really understand how to do anything differently with it.

There's a couple of things I think that really seem to be driving this most fully on YouTube, and the first of those is advertising, which is the monetization of attention without any real other variables at work, any care for the people who are actually developing this content, the centralization of the power, the separation of those things. And I think however you feel about the use of advertising to kind of support stuff, the sight of grown men in diapers rolling around in the sand in the hope that an algorithm that they don't really understand will give them money for it suggests that this probably isn't the thing that we should be basing our society and culture upon, and the way in which we should be funding it.

And the other thing that's kind of the major driver of this is automation, which is the deployment of all of this technology as soon as it arrives, without any kind of oversight, and then once it's out there, kind of throwing up our hands and going, "Hey, it's not us, it's the technology." Like, "We're not involved in it." That's not really good enough, because this stuff isn't just algorithmically governed, it's also algorithmically policed. When YouTube first started to pay attention to this, the first thing they said they'd do about it was that they'd deploy better machine learning algorithms to moderate the content. Well, machine learning, as any expert in it will tell you, is basically what we've started to call software that we don't really understand how it works. And I think we have enough of that already. We shouldn't be leaving this stuff up to AI to decide what's appropriate or not, because we know what happens. It'll start censoring other things. It'll start censoring queer content. It'll start censoring legitimate public speech. What's allowed in these discourses, it shouldn't be something that's left up to unaccountable systems. It's part of a discussion all of us should be having.

But I'd leave a reminder that the alternative isn't very pleasant, either. YouTube also announced recently that they're going to release a version of their kids' app that would be entirely moderated by humans. Facebook—Zuckerberg said much the same thing at Congress, when pressed about how they were going to moderate their stuff. He said they'd have humans doing it. And what that really means is, instead of having toddlers being the first person to see this stuff, you're going to have underpaid, precarious contract workers without proper mental health support being damaged by it as well.
And I think we can all do quite a lot better than that.

The thought, I think, that brings those two things together, really, for me, is agency. It's like, how much do we really understand—by agency, I mean: how we know how to act in our own best interests. Which—it's almost impossible to do in these systems that we don't really fully understand. Inequality of power always leads to violence. And we can see inside these systems that inequality of understanding does the same thing. If there's one thing that we can do to start to improve these systems, it's to make them more legible to the people who use them, so that all of us have a common understanding of what's actually going on here.

The thing, though, I think most about these systems is that this isn't, as I hope I've explained, really about YouTube. It's about everything. These issues of accountability and agency, of opacity and complexity, of the violence and exploitation that inherently results from the concentration of power in a few hands—these are much, much larger issues. And they're issues not just of YouTube and not just of technology in general, and they're not even new. They've been with us for ages. But we finally built this system, this global system, the internet, that's actually showing them to us in this extraordinary way, making them undeniable. Technology has this extraordinary capacity to both instantiate and continue all of our most extraordinary, often hidden desires and biases, encoding them into the world, but it also writes them down so that we can see them, so that we can't pretend they don't exist anymore. We need to stop thinking about technology as a solution to all of our problems, but think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them.

Thank you very much.

Thank you.

James, thank you for coming and giving us that talk. So it's interesting: when you think about the films where the robotic overlords take over, it's all a bit more glamorous than what you're describing. But I wonder—in those films, you have the resistance mounting. Is there a resistance mounting towards this stuff? Do you see any positive signs, green shoots of resistance?

I don't know about direct resistance, because I think this stuff is super long-term. I think it's baked into culture in really deep ways. A friend of mine, Eleanor Saitta, always says that any technological problems of sufficient scale and scope are political problems first of all. So all of these things we're working to address within this are not going to be addressed just by building the technology better, but actually by changing the society that's producing these technologies. So no, right now, I think we've got a hell of a long way to go. But as I said, I think by unpacking them, by explaining them, by talking about them super honestly, we can actually start to at least begin that process.

And so when you talk about legibility and digital literacy, I find it difficult to imagine that we need to place the burden of digital literacy on users themselves. But whose responsibility is education in this new world?

Again, I think this responsibility is kind of up to all of us, that everything we do, everything we build, everything we make, needs to be made in a consensual discussion with everyone who's avoiding it; that we're not building systems intended to trick and surprise people into doing the right thing, but that they're actually involved in every step in educating them, because each of these systems is educational. That's what I'm hopeful about, about even this really grim stuff, that if you can take it and look at it properly, it's actually in itself a piece of education that allows you to start seeing how complex systems come together and work and maybe be able to apply that knowledge elsewhere in the world.

James, it's such an important discussion, and I know many people here are really open and prepared to have it, so thanks for starting off our morning.

Thanks very much. Cheers.
