In 1982, Atari controlled eighty per cent of the video-game market. A year later, as its C.E.O. stood accused of insider trading, demand for video games had fallen so much that the company dumped fourteen trucks’ worth of merchandise in a New Mexico landfill and poured cement over the forsaken games to prevent local children from salvaging them.
It’s been more than thirty years since what’s known as the video-game crash of 1983, and the children of that era have grown up, a lingering passion for video games notwithstanding. On Saturday, several hundred of them returned to the landfill in New Mexico with heavy machinery and a team of archaeologists to excavate the site and revisit a defining moment in video-game history.
The landfill in Alamogordo, New Mexico, about ninety miles north of El Paso, is the gaming world’s Roswell. This is partly, perhaps, because of its proximity to the real Roswell, but also because they’re both rumored to be hiding aliens: the dump was said to hold more than three million copies of the famously awful Atari adaptation of Steven Spielberg’s “E.T.: The Extra-Terrestrial.” Unwanted products routinely end up in landfills, but the sudden desert burial of countless games, associated with an iconic movie and adapted for an iconic console, was material colorful enough for a legend. Saturday’s dig confirmed that the legend is true, although the archaeologists on hand estimated that the E.T. cartridges numbered in the thousands, not the millions.
“It was pretty tense right up until we found the cartridges,” Mike Burns, the C.E.O. of the marketing company Fuel Entertainment, told me after the dig. The excavation project was Burns’s brainchild, and his staff spent eighteen months securing permits from local officials and funding from Microsoft’s Xbox Entertainment Studios, which in turn hired the producers Simon and Jonathan Chinn to make a documentary about the dig.
Until Saturday, no one knew for sure what was buried in Alamogordo. The Times and local media recorded fourteen trucks dumping merchandise from Atari’s El Paso factory in September, 1983, but Atari representatives didn’t allow anyone to get close enough to see exactly what was being buried. The E.T. guess was a good one, though, because Atari had produced far more of the games than people wanted. The fans who traveled to Alamogordo to witness the excavation clearly wanted the games to be there. They cheered when workers unearthed the first Atari artifact of the day, a joystick, and they cheered more loudly when Zak Penn, who is directing Microsoft’s documentary, approached the crowd, held up an E.T. cartridge, and announced, “We found something.” The crew filled bucket after bucket with games—yes, some copies of E.T., but also more commercially successful titles like Centipede, Space Invaders, and Asteroids. Planners had hoped to insert a recovered game into a preserved Atari console and play it on site, but a dust storm during the dig persuaded them to save that experiment for a more sterile environment than an open landfill in the desert.
Alamogordo is a symbol of the crash of ’83, but Atari’s troubles began a year earlier, in 1982. The company had been a classic Silicon Valley startup. It even employed Steve Jobs for a time (although, according to Walter Isaacson’s biography of Jobs, he was asked to work the night shift after his co-workers complained about his abrasive personality and body odor). But Atari hadn’t yet figured out the precise timing required to successfully transition from one generation of consoles to the next. Today’s console makers have settled into a predictable rhythm, typically releasing new machines every five or six years, enough time for customers to trust that a “next-generation” system will be genuinely superior to the one it replaces. In the early eighties, the process was far less orderly: manufacturers issued new systems much more frequently, and their newness often hinged on gimmickry, such as a built-in screen or a different kind of controller, rather than major technical milestones. The Atari 5200 console, released in 1982, thus cannibalized sales of the older Atari 2600 but didn’t offer enough technical improvements to persuade many people to trade up. Even as Atari promoted the 5200, it flooded the market with games for the 2600. In addition, company executives were overly optimistic, producing, for instance, twelve million copies of a Pac-Man game for the 2600 even though only ten million people owned that console. They ended up selling seven million copies, making it the best-selling video game in history at the time, but the five million unsold copies were an embarrassment. Worse still, the game was riddled with glitches; many customers demanded refunds.
The E.T. game, also built for the 2600 console, fared even worse: Atari sold just one and a half million copies of the five million it had shipped to stores. At the time, Atari was a subsidiary of Warner Communications, which also owned Warner Bros. Pictures and a number of other media properties. As Ray Kassar, then Atari’s C.E.O., recounts in Steven Kent’s “The Ultimate History of Video Games,” Warner’s C.E.O., Steve Ross, insisted that the game be on shelves in time for Christmas, 1982, which meant it had to be designed in just six weeks (compared to a typical lead time in those days of six months or more). The resulting game was primitive, despite the best efforts of its programmer, Howard Scott Warshaw, who attended Saturday’s excavation to explain to fans exactly why the game was so bad.
Neither of these failures appeared in Warner’s earnings reports until December 7, 1982, when Atari announced that it expected a ten- to fifteen-per-cent increase in sales in its fourth quarter, rather than the fifty-per-cent increase it had predicted earlier in the year, a shortfall that video-game historians have attributed in large part to the failure of those two games. The next day, Warner’s stock tumbled, and Kassar was accused of insider trading when it was revealed that he had sold five thousand of his shares just minutes before releasing the bad news. (He was later cleared by the Securities and Exchange Commission.) Atari’s resulting collapse formed the core of the industry-wide crash of ’83. Warner broke Atari into three units—home computing, game consoles, and arcade gaming—and sold them off in 1984 and 1985. What was left of Atari’s once-mighty video-game division limped along from owner to owner for another two decades before finally declaring bankruptcy last year. The company’s later incarnations never came close to regaining the market dominance of the early eighties.
It’s often said that video games might have died forever with Atari, remembered only as a short-lived fad, if Nintendo hadn’t come to the United States to rescue them two years later. But video games are no more a fad than the Internet is; the crash lasted as long as it did only because none of Atari’s competitors in the U.S. console market—including Coleco, Mattel, and a half-dozen others—offered sufficiently diverting hardware or games to fill the void. Instead of stepping up to claim Atari’s former market share, they were dragged down with the company to ruin, and it wasn’t until Nintendo’s arrival, in late 1985, that the industry began to show new signs of life. Nintendo, which had been planning before the crash to offer Atari worldwide distribution rights outside Japan for its Famicom console (later known in the United States as the Nintendo Entertainment System), now saw that it could compete with the diminished Atari. The company came to dominate the video-game world for the rest of the eighties and much of the nineties. Both of its major challengers—Sega and Sony—were Japanese, too.
After Atari’s collapse, its successors released new consoles at a slower pace, finally settling into a five-to-six-year cycle by the time the sixth generation of hardware arrived, in the early aughts. This generation included Nintendo’s GameCube, Sony’s PlayStation 2, and Microsoft’s Xbox—the first American-made console to gain significant market share since the days of Atari. Perhaps it’s appropriate that Microsoft is the company doing the excavating in Alamogordo, like a young man visiting his grandfather’s grave.
Photograph: Fuel Entertainment.
When Angela Ahrendts became the C.E.O. of Burberry, in 2006, the company had an unusual problem: its brand was too blatant. Burberry’s signature pattern, a beige, red, and black plaid, which had formerly been a symbol of exclusivity, was being widely copied—a parody of luxury. It even became associated with a derogatory term in Britain: the “chav,” or working-class delinquent, who wreaked havoc while wearing a knockoff Burberry cap. Ahrendts reined in the brand, taking back some licenses that had been issued to partners in other countries, toning down the plaid, and choosing designs that were sleeker and more sophisticated. For high-end products, subtlety sells.
This week, Ahrendts will join Apple as its head of retail. With her pedigree, she has a chance to solve tech companies’ fashion dilemma: how to create wearable technologies that people actually want to wear. (Ahrendts also has considerable experience in China, where Apple currently has thirteen stores and plans to triple that number within two years.) Although Apple stores are still a big draw, devotees are becoming impatient for new products worth lining up for; on Wednesday, the company said that it sold forty-three million iPhones during its most recent quarter, but the iPhone has been around for seven years now, and Apple’s last major new product, the iPad, was released four years ago. Rumors have been building about an iWatch, which would be Apple’s first attempt at a computer disguised as a fashion accessory. (Apple declined to comment.)
If Ahrendts’s appointment is a signal that Apple is seriously considering building technology into clothes and accessories, a newish category of tech products known as “wearables,” the company wouldn’t be alone in Silicon Valley. Last year, investors put four hundred and fifty-eight million dollars into companies that make wearable products like fitness trackers and baby monitors; last month, Facebook agreed to pay two billion dollars to buy Oculus VR, a virtual-reality technology firm whose Oculus Rift gaming headset looks like a scuba mask with a metal plate bolted on the front. Companies love the idea of wearable technology because the constant stream of data such devices collect would be a bonanza for marketers, measuring what people are doing every second, even while they’re asleep.
Yet—surprisingly or not—customers are reluctant to strap still-bulky computers to their foreheads and wrists. One columnist noted that hundreds of Samsung’s new Galaxy Gear watches were for sale on eBay. A recent survey indicated that one-third of Americans who buy a wearable device stop using it within six months. Google Glass raises widespread privacy concerns, partly because its design is so intrusive. And earlier this month, Nike laid off some of the employees working on its FuelBand fitness-tracker bracelet; a Nike design director on the FuelBand project joined Apple last fall, increasing speculation that the two companies will collaborate on a new product.
A major problem with wearable technologies—and one that Ahrendts is in a good position to fix—is that they are too conspicuous. The engineers who design them delight in advertising the fact that they’re wearing the hot new device. But outside Silicon Valley, displaying the cutting-edge equivalent of a BlackBerry holster isn’t chic. When people slip on Google Glass, they resemble the character Seven of Nine from “Star Trek: Voyager,” who had cybernetic implants in her face, signs that she once was subsumed into the dehumanizing Borg.
These sorts of constant reminders that users are plugged in could explain why so many people are reluctant to adopt Google Glass and similar technologies: a new Pew Research Center study shows that more than half of Americans think life will change for the worse if many people wear or implant technologies that constantly provide information about the world around them. (Women are particularly hesitant.) Google seems to recognize this issue. It is partnering with Luxottica, the maker of Ray-Ban and Oakley sunglasses, to design a more stylish Glass.
Ahrendts also seems to understand the potential appeal of integrating tech and fashion. Old-world fashion brands tend to be skeptical of technology—some Parisian couture houses still employ “petites mains” to hand-sew their elaborate gowns—but Burberry, under Ahrendts, embraced it. Two years ago, the company opened a flagship store on Regent Street in London, with the transcendent—or dystopian—mission of “seamlessly blurring physical and digital worlds,” as the company’s Web site put it. When a customer walks toward a handbag or sweater, a radio-frequency I.D. tag embedded in the item cues one of a hundred digital screens throughout the store to play a video with more details about, for example, the fabric or the stitching. A sweeping staircase and gleaming white marble bring to mind a neo-Victorian Apple store. Just substitute trench coats for iPhones.
In the twentieth century, designers took two distinct approaches to imagining the future of fashion. The first approach tended toward metallic, geometric, quasi-robotic styles: think Pierre Cardin’s space suits of the nineteen-sixties, inspired by the first moon landing. The first generation of high-tech wearables looks a lot like what those designers predicted. But the designers of the past had another vision, too, and this one could be a big, untapped market: making clothes better serve their original purpose of keeping people warm, dry, and protected. One designer, in 1939, envisioned that, decades in the future, women would wear an electric belt that would adapt the body to unpredictable weather changes. It’s an attractive idea for anyone who has sweltered on the subway, then spent the rest of the day shivering in an air-conditioned office. Along those lines, a group of M.I.T. graduates has designed a ninety-five-dollar dress shirt that borrows from NASA’s space suits—not the bulky styles themselves, but the technology in their materials—to store heat away from the wearer when it’s too hot outside, then return it when temperatures cool. It isn’t hard to imagine Apple using its technological prowess to weave computers right into clothes, especially if it draws on the fashion sense of Ahrendts and Paul Deneve, the former chief executive of Yves Saint Laurent, whom Apple hired last summer to focus on special projects.
There’s another term for wearable technology that might give a better sense of its future: “intimate computing,” which evokes a product that is sensual and tactile, personal and discreet. Om Malik, a tech blogger who popularized the term, wrote about Apple’s hiring of Ahrendts: “This new intimate computing era means that Apple has to stop thinking like a computer company and more like a fashion accessory maker, whose stock in trade is not just great design but aspirational experience.” That will be Ahrendts’s mission: more Burberry, less Borg.
Photograph: CBS Photo Archive/Delivered by Online USA/Getty.