
希平方 x ICRT

P.W. Singer: Military Robots and the Future of War


I thought I'd begin with a scene of war. There was little to warn of the danger ahead. The Iraqi insurgent had placed the IED, an Improvised Explosive Device, along the side of the road with great care. By 2006, there were more than 2,500 of these attacks every single month, and they were the leading cause of casualties among American soldiers and Iraqi civilians. The team that was hunting for this IED is called an EOD team—Explosive Ordnance Disposal—and they're the pointy end of the spear in the American effort to suppress these roadside bombs. Each EOD team goes out on about 600 of these bomb calls every year, defusing about two bombs a day. Perhaps the best sign of how valuable they are to the war effort is that the Iraqi insurgents put a $50,000 bounty on the head of a single EOD soldier.

Unfortunately, this particular call would not end well. By the time the soldier advanced close enough to see the telltale wires of the bomb, it exploded in a wave of flame. Now, depending on how close you are and how much explosive has been packed into that bomb, it can cause death or injury. You have to be as far as 50 yards away to escape that. The blast is so strong it can even break your limbs, even if you're not hit. That soldier had been on top of the bomb.

And so when the rest of the team advanced they found little left. And that night the unit's commander did a sad duty, and he wrote a condolence letter back to the United States, and he talked about how hard the loss had been on his unit, about the fact that they had lost their bravest soldier, a soldier who had saved their lives many a time. And he apologized for not being able to bring them home. But then he talked up the silver lining that he took away from the loss. "At least," as he wrote, "when a robot dies, you don't have to write a letter to its mother."

That scene sounds like science fiction, but it is battlefield reality already. The soldier in that case was a 42-pound robot called a PackBot. The chief's letter went, not to some farmhouse in Iowa like you see in the old war movies, but went to the iRobot Company, which is named after the Asimov novel and the not-so-great Will Smith movie, and if you remember, in that fictional world, robots started out carrying out mundane chores, and then they started taking on life-and-death decisions. That's a reality we face today.

What we're going to do is actually just flash a series of photos behind me that show you the reality of robots used in war right now or already at the prototype stage. It's just to give you a taste. Another way of putting it is you're not going to see anything that's powered by Vulcan technology, or teenage wizard hormones or anything like that. This is all real. So why don't we go ahead and start those pictures.

Something big is going on in war today, and maybe even the history of humanity itself. The U.S. military went into Iraq with a handful of drones in the air. We now have 5,300. We went in with zero unmanned ground systems. We now have 12,000. And the tech term "killer application" takes on new meaning in this space.

And we need to remember that we're talking about the Model T Fords, the Wright Flyers, compared to what's coming soon. That's where we're at right now. One of the people that I recently met with was an Air Force three-star general, and he said basically, where we're headed very soon is tens of thousands of robots operating in our conflicts, and these numbers matter, because we're not just talking about tens of thousands of today's robots, but tens of thousands of these prototypes and tomorrow's robots, because, of course, one of the things that's operating in technology is Moore's Law, that you can pack in more and more computing power into those robots, and so flash forward around 25 years, if Moore's Law holds true, those robots will be close to a billion times more powerful in their computing than today.
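As a rough arithmetic aside (my own sketch, not part of the talk): the "close to a billion times more powerful" figure over 25 years implies a specific doubling rate, and a few lines of Python make that implicit assumption explicit:

```python
# Sketch: what doubling period does a billion-fold growth in computing
# power over 25 years imply, and how does the classic 18-month reading
# of Moore's Law compare over the same span?
import math

YEARS = 25
TARGET = 1e9  # "close to a billion times more powerful"

doublings_needed = math.log2(TARGET)                 # ~29.9 doublings
months_per_doubling = YEARS * 12 / doublings_needed  # ~10 months each

print(f"doublings needed: {doublings_needed:.1f}")
print(f"implied doubling period: {months_per_doubling:.1f} months")

# For comparison, doubling every 18 months over the same 25 years:
growth_18mo = 2 ** (YEARS * 12 / 18)
print(f"growth with 18-month doubling: {growth_18mo:,.0f}x")
```

In other words, the billion-fold projection assumes computing power doubles roughly every 10 months; with the more conventional 18-month doubling, 25 years yields about a 100,000-fold increase, well short of a billion.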

And so what that means is the kind of things that we used to only talk about at science fiction conventions like Comic-Con have to be talked about in the halls of power and places like the Pentagon. A robots revolution is upon us.

Now, I need to be clear here. I'm not talking about a revolution where you have to worry about the Governor of California showing up at your door, a la the Terminator. When historians look at this period, they're going to conclude that we're in a different type of revolution: a revolution in war, like the invention of the atomic bomb. But it may be even bigger than that, because our unmanned systems don't just affect the "how" of war-fighting, they affect the "who" of fighting at its most fundamental level. That is, every previous revolution in war, be it the machine gun, be it the atomic bomb, was about a system that either shot faster, went further, had a bigger boom. That's certainly the case with robotics, but they also change the experience of the warrior and even the very identity of the warrior.

Another way of putting this is that mankind's 5,000-year-old monopoly on the fighting of war is breaking down in our very lifetime. I've spent the last several years going around meeting with all the players in this field, from the robot scientists to the science fiction authors who inspired them to the 19-year-old drone pilots who are fighting from Nevada, to the four-star generals who command them, to even the Iraqi insurgents who they are targeting and what they think about our systems, and what I found interesting is not just their stories, but how their experiences point to these ripple effects that are going outwards in our society, in our law and our ethics, etc. And so what I'd like to do with my remaining time is basically flesh out a couple of these.

So the first is that the future of war, even a robotics one, is not going to be purely an American one. The U.S. is currently ahead in military robotics right now, but we know that in technology there's no such thing as a permanent first-mover advantage. In a quick show of hands, how many people in this room still use Wang Computers? It's the same thing in war. The British and the French invented the tank. The Germans figured out how to use it right, and so what we have to think about for the U.S. is that we are ahead right now, but you have 43 other countries out there working on military robotics, and they include all the interesting countries like Russia, China, Pakistan, Iran.

And this raises a bigger worry for me. How do we move forward in this revolution given the state of our manufacturing and the state of our science and mathematics training in our schools? Or another way of thinking about this is, what does it mean to go to war increasingly with soldiers whose hardware is made in China and software is written in India? But just as software has gone open-source, so has warfare. Unlike an aircraft carrier or an atomic bomb, you don't need a massive manufacturing system to build robotics. A lot of it is off the shelf. A lot of it's even do-it-yourself. One of those things you just saw flashed before you was a Raven drone, the hand-tossed one. For about a thousand dollars, you can build one yourself, equivalent to what the soldiers use in Iraq. That raises another wrinkle when it comes to war and conflict. Good guys might play around and work on these as hobby kits, but so might bad guys. This cross between robotics and things like terrorism is going to be fascinating and even disturbing, and we've already seen it start.

During the war between Israel, a state, and Hezbollah, a non-state actor, the non-state actor flew four different drones against Israel. There's already a jihadi website that you can go on and remotely detonate an IED in Iraq while sitting at your home computer. And so I think what we're going to see is two trends take place with this. First is, you're going to reinforce the power of individuals against governments, but then the second is that we are going to see an expansion in the realm of terrorism. The future of it may be a cross between al Qaeda 2.0 and the next generation of the Unabomber. And another way of thinking about this is the fact that, remember, you don't have to convince a robot that they're gonna receive 72 virgins after they die to convince them to blow themselves up.

But the ripple effects of this are going to go out into our politics. One of the people that I met with was a former Assistant Secretary of Defense for Ronald Reagan, and he put it this way, quote: "I like these systems because they save American lives, but I worry about more marketization of wars, more shock-and-awe talk, to defray discussion of the costs. People are more likely to support the use of force if they view it as costless."

Robots for me take certain trends that are already in play in our body politic, and maybe take them to their logical ending point. We don't have a draft. We don't have declarations of war anymore. We don't buy war bonds anymore. And now we have the fact that we're converting more and more of our American soldiers that we would send into harm's way into machines, and so we may take those already lowering bars to war and drop them to the ground.

But the future of war is also going to be a YouTube war. That is, our new technologies don't merely remove humans from risk. They also record everything that they see. So they don't just delink the public; they reshape its relationship with war. There's already several thousand video clips of combat footage from Iraq on YouTube right now, most of it gathered by drones.

Now, this could be a good thing. It could be building connections between the home front and the war front as never before. But remember, this is taking place in our strange, weird world, and so inevitably the ability to download these video clips to, you know, your iPod or your Zune gives you the ability to turn it into entertainment. Soldiers have a name for these clips. They call it war porn. The typical one that I was sent was an email that had an attachment of video of a Predator strike taking out an enemy site. Missile hits, bodies burst into the air with the explosion. It was set to music. It was set to the pop song "I Just Want To Fly" by Sugar Ray.

This ability to watch more but experience less creates a wrinkle in the public's relationship with war. I think about this with a sports parallel. It's like the difference between watching an NBA game, a professional basketball game on TV, where the athletes are tiny figures on the screen, and being at that basketball game in person and realizing what someone seven feet tall really does look like. But we have to remember, these are just the clips. These are just the ESPN SportsCenter version of the game. They lose the context. They lose the strategy. They lose the humanity. War just becomes slam dunks and smart bombs.

Now the irony of all this is that while the future of war may involve more and more machines, it's our human psychology that's driving all of this, it's our human failings that are leading to these wars. So one example of this that has big resonance in the policy realm is how this plays out on our very real war of ideas that we're fighting against radical groups. What is the message that we think we are sending with these machines, versus what is being received in terms of the message?

So one of the people that I met was a senior Bush Administration official, who had this to say about our unmanning of war, "It plays to our strength. The thing that scares people is our technology." But when you go out and meet with people, for example in Lebanon, it's a very different story. One of the people I met with there was a news editor, and we're talking as a drone is flying above him, and this is what he had to say. "This is just another sign of the coldhearted cruel Israelis and Americans, who are cowards because they send out machines to fight us. They don't want to fight us like real men, but they're afraid to fight, so we just have to kill a few of their soldiers to defeat them."

The future of war also is featuring a new type of warrior, and it's actually redefining the experience of going to war. You can call this a cubicle warrior. This is what one Predator drone pilot described of his experience fighting in the Iraq War while never leaving Nevada. "You're going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants, and then you get in the car and you drive home and within 20 minutes, you're sitting at the dinner table talking to your kids about their homework."

Now, the psychological balancing of those experiences is incredibly tough, and in fact those drone pilots have higher rates of PTSD than many of the units physically in Iraq. But some have worries that this disconnection will lead to something else, that it might make the contemplation of war crimes a lot easier when you have this distance. "It's like a video game," is what one young pilot described to me of taking out enemy troops from afar. As anyone who's played Grand Theft Auto knows, we do things in the video world that we wouldn't do face to face.

So much of what you're hearing from me is that there's another side to technological revolutions, and that it's shaping our present and maybe will shape our future of war. Moore's Law is operative, but so is Murphy's Law. The fog of war isn't being lifted. The enemy has a vote. We're gaining incredible new capabilities, but we're also seeing and experiencing new human dilemmas. Now, sometimes these are just "oops" moments, which is how the head of a robotics company described them: you just have "oops" moments. Well, what are "oops" moments with robots in war? Well, sometimes they're funny. Sometimes, they're like that scene from the Eddie Murphy movie "Best Defense," playing out in reality, where they tested out a machine-gun-armed robot, and during the demonstration it started spinning in a circle and pointed its machine gun at the reviewing stand of VIPs. Fortunately the weapon wasn't loaded and no one was hurt, but other times "oops" moments are tragic, such as last year in South Africa, where an anti-aircraft cannon had a "software glitch," and actually did turn on and fired, and nine soldiers were killed.

We have new wrinkles in the laws of war and accountability. What do we do with things like unmanned slaughter? What is unmanned slaughter? We've already had three instances of Predator drone strikes where we thought we got bin Laden, and it turned out not to be the case. And this is where we're at right now. This is not even talking about armed, autonomous systems with full authority to use force. And do not believe that that isn't coming. During my research I came across four different Pentagon projects on different aspects of that.

And so you have this question: Where does this lead with issues like war crimes? Robots are emotionless, so they don't get upset if their buddy is killed. They don't commit crimes of rage and revenge. But robots are emotionless. They see an 80-year-old grandmother in a wheelchair the same way they see a T-80 tank: they're both just a series of zeroes and ones. And so we have this question to figure out: How do we catch up our 20th century laws of war, that are so old right now that they could qualify for Medicare, to these 21st century technologies?

And so, in conclusion, I've talked about what seems the future of war, but notice that I've only used real world examples and you've only seen real world pictures and videos. And so this sets a great challenge for all of us, one we have to worry about well before you have to worry about your Roomba sucking the life away from you. Are we going to let the fact that what's unveiling itself right now in war sounds like science fiction keep us in denial? Are we going to face the reality of 21st century war? Is our generation going to make the same mistake that a past generation did with atomic weaponry, and not deal with the issues that surround it until Pandora's box is already opened up?

Now, I could be wrong on this, and one Pentagon robot scientist told me that I was. He said, "There's no real social, ethical, moral issues when it comes to robots. That is," he added, "unless the machine kills the wrong people repeatedly. Then it's just a product recall issue."

And so the ending point for this is that actually, we can turn to Hollywood. A few years ago, Hollywood gathered all the top characters and created a list of the top 100 heroes and top 100 villains of all of Hollywood history, the characters that represented the best and worst of humanity. Only one character made it onto both lists: The Terminator, a robot killing machine. And so that points to the fact that our machines can be used for both good and evil, but for me it points to the fact that there's a duality of humans as well.
This week is a celebration of our creativity. Our creativity has taken our species to the stars. Our creativity has created works of art and literature to express our love. And now, we're using our creativity in a certain direction, to build fantastic machines with incredible capabilities, maybe even one day an entirely new species. But one of the main reasons that we're doing that is because of our drive to destroy each other, and so the question we all should ask: Is it our machines, or is it us that's wired for war?

Thank you.
