Developments in Computing Technology in 2010

Source: Baidu Wenku | Editor: Jiuxiang News | Time: 2024/04/27 13:35:40

MIT Technology Review article:

Wednesday, December 29, 2010

The Year in Computing


New ways to feed our need for computing speed, novel controllers for our gadgets, and scary security risks all appeared in 2010.

The last 12 months changed the shape and definition of computers, which no longer necessarily involve a keyboard, a monitor, and a mouse. Apple started the year by launching its tablet (The iPad, Like an iPhone, Only Bigger), which soon spawned many imitators (Androids Will Challenge the iPad). Google started the year by showing off the most powerful smart phone yet (Google Reveals its New Phone) and ended it with a personal computer that relies entirely on the Web, by way of Chrome OS (The Browser Takes All).


Another new category of computers grew out of the industry's obsession with adding computing power to television. Google's ambitious but troubled effort (Google TV Faces Some Prime-Time Challenges) joined a more established apps-for-TV scheme from Yahoo (Yahoo Brings Apps to TVs) and a stripped-down entrant from Apple (Apple Shows a Facebook Rival and Apple TV 2.0). All put Web-streamed content and social networking at the heart of their strategies, trying to connect living-room viewing with online friends (Making TV Social, Virtually).


The new kinds of computers required new kinds of controls. 2010 saw enhancements to touch technology, such as a way to simulate the sensation of texture on a flat screen (Touch Screens that Touch Back) and a more powerful version of the laptop track pad (Upgrading the Laptop's Touch Pad). New physical interfaces were also introduced, such as Microsoft's technology for gestural control (Hackers Take the Kinect to New Levels) and a prototype device that the user controls by tapping a forearm (Putting Virtual Controls on Your Arm). More speculative projects showed that it's possible to control a cell phone with your eyes (Eye Tracking for Mobile Control) or brain (Mobile Phone Mind Control).


All these innovations were made possible by continuing advances in the power and compactness of computer components. One route that both Intel (Computing at the Speed of Light) and IBM (Electricity and Light in One Chip) explored was to try to overcome the limitations of electricity by developing computers that run on light instead. Another radical idea, realized by a startup, was to create chips that work with probabilities, not 1s and 0s, an approach that could speed cryptography and other statistical calculations (A New Kind of Microchip).
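The probability-based idea can be illustrated with a toy sketch (not the startup's actual chip design, which the article does not detail): logic gates that operate on the probability that a bit is 1, rather than on hard 0/1 values, assuming the inputs are statistically independent.

```python
# Toy illustration of probability processing: gates that combine
# probabilities that a bit is 1, instead of hard 0/1 values.
# Assumes the two inputs are statistically independent.

def p_and(pa: float, pb: float) -> float:
    """Probability that both inputs are 1."""
    return pa * pb

def p_or(pa: float, pb: float) -> float:
    """Probability that at least one input is 1."""
    return pa + pb - pa * pb

def p_not(pa: float) -> float:
    """Probability that the inverted input is 1."""
    return 1.0 - pa

# Two noisy observations say a bit is 1 with 80% and 60% confidence.
both = p_and(0.8, 0.6)    # 0.48
either = p_or(0.8, 0.6)   # 0.92
```

A chip that manipulates such probabilities natively can evaluate statistical computations, such as decoding error-corrected data, without converting everything back to deterministic bits at each step.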


Meanwhile, Apple (What's Inside the iPad's Chip?) and the Chinese government (China: a New Processor for a New Market) each took chip design in a new direction. Apple is striving to make chips for the iPad that balance portability and power, and China to make computing power available inexpensively to parts of the huge country that are as yet unwired.


But while Moore's Law held, exponentially increasing the computing power that can fit into a given space, our power supplies haven't improved so fast. That puts a premium on less-energy-intensive ways to use computing, and it motivated research showing that Wi-Fi on mobile devices uses much more power than necessary (How Wi-Fi Drains Your Cell Phone). Intel demonstrated that chips allowed to make more errors use significantly less power and still get the job done (Intel Prototypes Low-Power Circuits). And a way to cut the power use of desktop computers by an average of 60 percent was introduced, achieved by putting a virtual copy of a desktop computer on a cloud server (PCs that Work While They Sleep).


A relatively new feature of computers, whether these are smart phones or TVs, is the cloud—the distant servers whose ample computing resources and storage space are accessed over the Internet. The cloud seems sure to become significantly more useful. Two startups showed that the cloud can enable small devices to act like much bigger, more powerful ones (Cloud Services Let Gadgets Punch Above Their Weight). The security worries that come with entrusting all your data to others also inspired cryptographers to hone a method that could let servers work with your data without being able to read (and potentially leak) it (Computing with Secrets, but Keeping Them Safe).
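A full homomorphic-encryption scheme of the sort the cryptographers were honing is far too involved for a short sketch, but the flavor of "a server computes on data it cannot read" can be shown with additive secret sharing, a simpler, related technique. The function names below are illustrative, not from any particular library.

```python
import random

MOD = 2**61 - 1  # toy modulus for the illustration

def share(x: int) -> tuple[int, int]:
    # Split x into two shares; each share on its own is a uniformly
    # random value and reveals nothing about x.
    r = random.randrange(MOD)
    return r, (x - r) % MOD

def add_on_servers(xs: tuple[int, int], ys: tuple[int, int]) -> tuple[int, int]:
    # Each "server" holds one share of each number and adds locally,
    # never seeing the underlying data.
    return (xs[0] + ys[0]) % MOD, (xs[1] + ys[1]) % MOD

def reveal(shares: tuple[int, int]) -> int:
    # Only by recombining both shares does the owner recover the result.
    return sum(shares) % MOD

result = reveal(add_on_servers(share(10), share(32)))  # 42
```

Neither server alone learns the inputs or the sum; only the data owner, holding both result shares, can reconstruct the answer.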


Google invented a new kind of cloud service when it made rudimentary AI available to all (Google Offers Cloud-Based Learning Engine). Researchers also tackled some of the logistical challenges to cloud computing, and came up with ways to easily move desktop software into the cloud (Drag and Drop Into the Cloud) and to compare the abilities of different cloud providers (Pitting Cloud Against Cloud).

Of course, even computers that incorporate the best of these ideas are still likely to crash. Fortunately, the last year brought new ideas that may make future machines more reliable. One new system can automatically diagnose a PC's problems (Software Works Out What's Troubling a PC); another can learn computer maintenance and repair by watching how an expert tunes a system (Software that Learns by Watching). A Stanford research project showed that building chips with transistors dedicated to spotting problems can create more reliable hardware (Speedier Bug Catching).


Security flaws, too, are universal, even unto the computer systems in cars (Is Your Car Safe From Hackers?) and ATMs (How to Make an ATM Spew Out Money), both of which can be compromised remotely, researchers demonstrated. New ideas about boosting security came from other researchers who bravely installed malware on a high-performance research computer (Raising a Botnet in Captivity) and from a company that can add computer smarts to the plastic in people's wallets (A Credit Card with a Computer Inside).