
Tuesday, April 23, 2013

The rise and fall of AMD: A company on the ropes

The Athlon 64 was AMD's high point, and it's been largely downhill since then.



Athlon 64, and AMD's competitive peak

The conclusion of our two-part series on AMD. Part one covered AMD's attempts to transform itself from a second-source supplier of Intel designs into a chipmaking powerhouse in its own right.
Overall, the Opteron's architecture was similar to K7’s but with two key differences. The first was that the CPU incorporated the system’s memory controller into the chip itself, which greatly reduced memory latency (albeit at the cost of some flexibility; new CPUs had to be introduced to take advantage of things like dual-channel memory and faster memory types like DDR2). This showed that AMD saw the benefits of incorporating more capability into the CPU itself, an instinct that would inform the later purchase of GPU maker ATI Technologies.
The K8's biggest benefit for servers, though, was its 64-bit extensions. The extensions enabled AMD's chips to run 64-bit operating systems that could address more than 4GB of memory at a time, but they didn't sacrifice compatibility or speed when running then-standard 32-bit operating systems and applications. These extensions would go on to become the industry standard, beating out Intel's alternative 64-bit Itanium architecture—Intel even licensed the AMD64 extensions for its own compatible x86-64 implementation. (Itanium, by contrast, could run x86 code only with an immense performance penalty.)
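To make that 4GB limit concrete, here's a minimal C sketch (our illustration, not anything from AMD or Intel): the same source built as 32-bit code can't even express an allocation larger than its 4GB address space, while an x86-64 build can. Compile it both ways—for example, with gcc -m32 and gcc -m64—to compare.

    /* Pointers are 4 bytes on a 32-bit build and 8 bytes on an x86-64
       build, and only the latter can describe an object bigger than 4GB. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>

    int main(void)
    {
        printf("pointer size: %zu bytes\n", sizeof(void *));

        unsigned long long five_gb = 5ULL << 30;   /* 5GB */
        if (five_gb > SIZE_MAX) {
            /* 32-bit build: size_t tops out just below 4GB. */
            printf("a 5GB object can't even be expressed here\n");
        } else {
            void *big = malloc((size_t)five_gb);
            printf("5GB allocation %s\n", big ? "succeeded" : "failed");
            free(big);
        }
        return 0;
    }

On a 64-bit OS with enough memory, the allocation goes through; a 32-bit build of the same program hits the wall before malloc is ever called.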
The K8 architecture was successful on the desktop in the form of the Athlon 64 lineup, but it was the Opteron server variants that brought AMD real success in the high-margin market. By the time Intel introduced dual-core Xeons based on the company's Core architecture in September of 2006, AMD had snapped up an estimated 25 percent of the server market. AMD continued to iterate successfully on K8 for a few years, performing several architecture tweaks and manufacturing process upgrades and even helping to usher in the multicore era of computing with the Athlon 64 X2.
The Opteron CPU, and the K8 architecture upon which it was based, helped AMD break into some new and lucrative markets.
Despite technical successes, AMD's financial situation had become precarious. Processor unit sales were falling, and margins on most chips dropped quickly after 2000. AMD also had problems with producing too much inventory; in the second half of 2002, AMD actually had "to limit shipments and to accept receipt of product returns from certain customers," it announced, because the chips it made weren't selling fast enough. The company had a net loss of $61 million in 2001, $1.3 billion in 2002, and $274 million in 2003.
What was sucking away the company's money? It was those darned fabs, just as Raza had feared. In the company's 2001 10-K, AMD estimated, "construction and facilitation costs of Dresden Fab 30 will be approximately $2.3 billion when the facility is fully equipped by the end of 2003." There was also a $410 million investment in AMD Saxony, the joint venture and wholly owned subsidiary that managed the Dresden fab.
By the following year, AMD upped its estimated costs to fund Dresden to $2.5 billion and added that by the end of 2001, it had invested $1.8 billion. The estimated costs continued to rise, as per the 2003 10-K: "We currently estimate that the construction and facilitation costs of Fab 30 will be $2.6 billion when it is fully equipped by the end of 2005. As of December 29, 2002, we had invested $2.1 billion in AMD Saxony." That same year, AMD plowed ahead with a new Dresden fab ("Fab 36"), investing $440 million into it by the end of the year.
The money for these huge investments all relied on AMD's ability to sell chips, and AMD's ability to sell chips was made easier by its competitive edge over Intel. Unluckily for AMD, Intel didn't take this challenge lying down.

Intel resurgent

Intel's Core 2 Duo took the wind out of AMD's sails, and the company has never quite recovered.
AMD's high point was, in most respects, one of Intel's lowest. "Clearly [AMD] had a very competitive product in Opteron in particular," Intel spokesperson Bill Calder told Ars, "and there was a lot of consternation inside of Intel and a lot of work going around trying to correct the problem and trying to counter not only in the market but in the press. At the time, there was quite a bit of focus on the competitive threat from AMD, but it was also very much a rallying call inside of Intel."
Even as AMD was beating Intel soundly with the Opteron server parts, the AMD64 extensions, and the Athlon desktop parts, Intel was sowing the seeds that would eventually grow into one of the company's most resounding successes: the Core architecture. By 2003, it was becoming clear that the NetBurst architecture that powered the Pentium 4 wasn't performing as well as the company had hoped—Intel had hoped to push the chips' clock speeds all the way up to 10GHz, but even at 4GHz, the Pentium 4's heat and power consumption were causing reliability problems. These same heat and power issues also made NetBurst ill-suited for use in the growing laptop segment. Rather than modify the Pentium 4's architecture to work better in laptops, the company went back to the drawing board and assigned a small team in Israel to work on a project known as Banias. This chip would later go on to be known as the Pentium M, the basis of Intel's successful Centrino marketing push (Centrino bundled a Pentium M processor, an Intel chipset, and Intel 802.11b and 802.11g wireless adapters).
Pentium M didn't start from scratch; it instead went back to Intel's Pentium III architecture and modified it to increase performance and efficiency. Pentium M also refined power-saving technologies like SpeedStep, which dynamically adjusted the CPU's clock speed and voltage depending on how heavily the chip was being used.
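The adjust-to-demand mechanics are easy to observe on modern systems. As a rough illustration (ours, and Linux-specific—this reads the generic cpufreq interface, not Intel's SpeedStep internals), the kernel exposes a CPU's frequency limits and its current, dynamically chosen clock speed through sysfs:

    /* Rough sketch: watch dynamic frequency scaling via Linux's cpufreq
       sysfs files. Assumes /sys/devices/system/cpu/cpu0/cpufreq exists;
       illustrative only, not a description of SpeedStep itself. */
    #include <stdio.h>

    static long read_khz(const char *path)
    {
        FILE *f = fopen(path, "r");
        long khz = -1;
        if (f && fscanf(f, "%ld", &khz) != 1)
            khz = -1;
        if (f)
            fclose(f);
        return khz;
    }

    int main(void)
    {
        printf("hardware min: %ld kHz\n",
               read_khz("/sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_min_freq"));
        printf("hardware max: %ld kHz\n",
               read_khz("/sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq"));
        printf("current:      %ld kHz\n",
               read_khz("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"));
        return 0;
    }

Run it on an idle machine and again under load, and the current frequency climbs toward the maximum—the same demand-driven behavior SpeedStep brought to the Pentium M.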
The CPU was such a success for Intel in laptops that, when the NetBurst architecture's time was up, the company set about adapting the Pentium M's architecture for desktops and servers as well. It ramped up the Pentium M's clock speed, added 64-bit extensions (licensed, of course, from AMD), and added a second CPU core—the basic ingredients for the Core 2 Duo. (The original Core Duo and Core Solo were sold only in laptops and lacked 64-bit extensions; Core 2 Duo was this architecture's first foray into non-mobile form factors.)
This Core architecture accomplished several important goals: it gave Intel a fast, power-efficient 64-bit Xeon in the server market to stem Opteron's tide; it took back the symbolically important performance crown in the desktop market; and it was much more power-efficient than AMD's laptop chips just as laptops began to outsell desktops for the first time. (AMD's power consumption in laptops became competitive only recently, with 2011's Llano and 2012's Trinity parts.)
The Core architecture hit AMD where it hurt, but the biggest damage to AMD's long-term health came from Intel's execution strategy. Beginning around the same time, Intel moved to a system of smaller but aggressively timed processor updates that it called "tick-tock."
Every year, Intel would introduce a new processor lineup—the "ticks" would gently tweak a CPU architecture and move it to a smaller, lower-power manufacturing process, while the "tocks" would remain on the established manufacturing process and introduce more drastic architectural changes. This system limits the risk that a new process or architecture will run into significant problems during the manufacturing stage, and new processor iterations can be introduced so quickly that a competitor with a superior architecture won't necessarily be able to stay on top for years, as AMD did with K8.
Neither Core nor any subsequent Intel architecture left AMD behind single-handedly, but Core 2 kicked off a relentless string of well-executed Intel CPUs. While AMD's CPUs continued to improve, over time they were shut out of the high-end market once more and forced to compete mainly on price, mirroring the company's early struggles. It also didn't help that, just as Intel was churning out its best products in years, AMD was trying to swallow another company whole.

The ATI acquisition

AMD bought ATI, but the two companies didn't mix well. In this 2009 photo, years after the acquisition, the "ATI Cube" in front of the company's building is finally dismantled. The ATI branding would be discontinued entirely a year later.
Ruiz explained to Ars that, when he became the CEO of the company in 2002, AMD was simply trying to do too many things. "When I joined the company we were involved in memory, in logic, in microprocessors, communication products—there was quite a bit of stuff going on,” he said. “And so I felt that the talented people were just spread a bit too much, and perhaps one of the reasons they couldn't figure out which way to kick the ball was that there were just too many things that the company was trying to do. One of the first things I tried to do was to pare down the product line."
A focus on a few product lines helped AMD succeed for several years, but it didn't last. While no single event brought AMD down from the peaks of the mid-2000s, the loss of this focus played no small part in its fall. Case in point: the company's 2006 purchase of Canadian graphics company ATI Technologies. The logic behind the ATI purchase was fundamentally sound—AMD saw a future where more than just the memory controller was integrated into the CPU, but it didn't have the graphics and chipset experience to make that future happen easily.
According to both reports at the time and to Ruiz's own book, Nvidia was considered a potential acquisition target first, since the company had plenty of graphics experience and some of the best chipsets for AMD's K7- and K8-based CPUs. But Jen-Hsun Huang, Nvidia's outspoken CEO, wanted to run the combined company—a non-starter for AMD's leadership. The discussion then turned to ATI, which AMD eventually bought for $5.4 billion in cash and stock in October 2006.
Jen-Hsun Huang, the co-founder and CEO of Nvidia, apparently insisted on being CEO of a combined AMD-Nvidia company—a deal that never came to pass.
"It was worth it," wrote Ruiz, "even if AMD shares sunk as Wall Street griped that we had paid too much and as our investors worried about the new debt. We had taken on $2.5 billion in financing to pay for the purchase, but we had shed so much debt in the past couple of years that I believed AMD could handle it. With leading-edge graphics technology in our portfolio, we would be able to offer integrated graphics solutions to OEMs just as Intel had been doing. And ATI’s technology was simply better than what Intel had to offer."
Though the idea was sound, the two companies were never integrated well. "The vision to bring the two companies together, or at least to get a video component business, a graphics component business for AMD made sense," former AMD marketing manager Ian McNaughton told Ars. "How they went about executing that plan failed."
“This was an acquisition treated like a merger,” McNaughton continued. “So the AMD people were AMD people; the ATI people were ATI people. They continued to live in separate buildings, they continued to report into separate structures, they continued to wear their logos. It took a long time to change the mentality, and I don't think it ever really did change."
Differences in culture divided loyalties within the combined company, to the point that another former AMD staffer told us that some employees saw themselves as members of either a “green” AMD (the CPU side) or a “red” AMD (the GPU side). Those employees often prioritized the needs of their division’s individual products rather than the combined products that ATI had been purchased to help build. This led to delays. Reportedly, three versions of the combined “Fusion” chips were produced before one was deemed market-ready, by which point the chips were much later than initially promised.
The internal issues also distracted the company from the very real engineering issues affecting products already in AMD's pipeline at the time of the ATI purchase—2007's "Barcelona" Opteron processor arrived late and didn't meet performance expectations, due in part to a nasty bug (the short-term fix for which sapped performance by an additional ten percent or so). The situation essentially repeated itself in 2011, when the "Bulldozer" architecture also arrived late and with less performance than promised. In both cases, AMD stumbled when it badly needed a win, and Intel's solid execution during this period threw AMD's problems into even harsher relief.
"The next resulting four years of lack of innovation [after the ATI purchase] was probably one of those hangovers of that failed acquisition," McNaughton told Ars, "because they didn't come down and lay down the law and say 'this is our roadmap and our vision and let's start executing to the roadmap,' like you have to do when you have 6,000 engineers on staff."

Fight fair

Intel and AMD had a history of competition in the courtroom as well as the marketplace. The companies had bad blood going back decades—in the 1980s, to appease IBM, which wanted a second source for the chips in its PCs, both AMD and Intel supplied processors to Big Blue. AMD agreed to abandon its own design and instead build x86 chips to specifications provided by Intel. As AMD saw it, Intel played dirty and didn't honor that contract—in its legal complaint, it alleged that the information Intel provided was "deliberately incomplete, deliberately indecipherable, and deliberately unusable by AMD engineers."
This conflict finally ended after five years of arbitration, culminating in a 1990 decision. As the New York Times reported at the time, the retired judge handling the arbitration called Intel's behavior "a classic example of the breach of the covenant of good faith and fair dealing: preaching good faith but practicing duplicity." But he "characterized Advanced Micro's behavior as 'unremitting vindictiveness accompanied by a large dollop of opportunism.' He also said Intel's behavior did not excuse Advanced Micro from having to come up with acceptable products to trade. 'The problem is that AMD assumes a somewhat romanticized factual situation which, like Camelot, never existed,' he said in the ruling."
In the mid-2000s, AMD came to believe that Intel was unfairly out to sabotage it in the marketplace once more, using money and clout to beat back AMD's technological superiority. Ruiz describes the company's view in his book:
Toshiba had accepted a hefty payment from Intel in 2001 on the promise that it wouldn’t use AMD processors. The “market development funds” totaled between $25 million and $30 million per quarter—a sum Toshiba executives likened to “cocaine” because it was a deal they just couldn’t quit.
Intel had bought Hitachi’s exclusivity as well. Whereas AMD had been shipping 50,000 Athlon chips to Hitachi in the first and second quarters of 2002, by the third quarter AMD’s shipments suddenly fell to zero.
NEC’s stance was especially disappointing. By the third quarter of 2002, AMD had won 84 percent of NEC’s Japanese consumer desktop business—a substantial achievement given our historical position as number two in the global semiconductor market. Looking at notebooks and desktops together, we supplied 40 percent of the company’s microprocessor needs. That would end shortly after Intel agreed to pay NEC more than ¥3 billion per quarter, as long as NEC would give 90 percent of its business to Intel and strictly limit its dealings with AMD. By 2003, AMD’s share of NEC’s consumer desktop business had slid to nearly zero too.
NEC went so far as to tell us firsthand about its agreement with Intel, which dictated that AMD’s share of NEC’s Japanese market had to be held to single digits. Globally, AMD’s share of NEC business would fall from 40 percent to 15 percent.
Amid all this activity, it seemed that whenever we took one step forward, we stumbled two steps back. This was particularly frustrating because AMD had become the market leader in technology; we had been expecting advances, so the sudden retreats seemed undeserved. I knew from the start that AMD’s fight for market share was going to be an uphill battle. But I held faith that the market would right itself in time.
AMD had already filed a similar complaint with the European Commission back in 2000, and Japan's Fair Trade Commission found that Intel violated antitrust rules there. After consulting with a noted political science professor at the University of California, Berkeley, AMD decided in 2005 to file an antitrust lawsuit in the US. AMD's internal codename for this lawsuit was "Slingshot," a David-and-Goliath reference reflecting AMD's desire to vanquish its much larger competitor in court (and it's where Ruiz's book gets its name).
AMD’s former CFO, Fran Barton, told Ars that Intel was "just not going to let AMD in. There’s a thing—Microsoft is reputed to have done it, Intel is reputed to have done it, when they get to near-monopoly, certain executives take advantage knowing that it’s hard to prove, and when you do get caught it will be five or eight years down the road.” But Barton admitted that Intel also had “cost advantages over us from scale.”
“AMD is never going to win going head-to-head with Intel,” Stacy Rasgon, a financial analyst at Sanford Bernstein, told Ars. “Intel can sleep for a while, [but it has] a tremendous amount of resources—AMD kicked the sleeping giant.”
For well over a decade, AMD has said in its own SEC filings that Intel "dominates" the processor industry and has attributed its own shortcomings to Intel's ability to outflank it at every turn. Case in point: as of this writing, AMD's market capitalization is just $1.9 billion, while Intel's, at $106 billion, is roughly 56 times larger. (As of the end of 2012, AMD's debt and capital lease obligations were $2 billion. By contrast, Intel holds about $23 billion in long-term debt obligations.) Even if Intel had not acted the way it did, AMD faced an uphill battle—and that slope has simply gotten steeper as Intel has grown larger and as AMD's smaller competitors (read: ARM licensees) have grown more numerous.
“They’ve put themselves in this corner of the marketplace, and it’s an odd one, they admit it," Craig Stice, an analyst with IHS Global Insight, told Ars. “But at the same time, you never get the sense that they are making strides for attempting to get themselves out of that corner. You never get the intention that they want to be bigger than Intel. They seem happy in their little corner to an extent and that piece of the market has been dwindling from them.”

Going global with GlobalFoundries

More than a year after the ATI deal closed, AMD couldn't execute well, couldn't sell enough of its products, and was straining financially. AMD ended 2007 over $5 billion in debt and lost $3.3 billion on the year, its worst single-year loss in 15 years.
AMD was therefore finding it difficult to keep up with the costs of running its own chip fabrication facilities. The company needed to keep upgrading its existing fabs in Germany, but it was also looking to build a new facility up in New York state. Especially in light of the ATI purchase, the company’s coffers just couldn’t support the stress.
By the summer of 2008, AMD’s top brass were looking for a way to get help from some of the world’s richest financiers, including Mubadala Development, the investment company for the government of Abu Dhabi. Established in 2002, Mubadala has become a massive investment vehicle, now worth over $48 billion. Through personal connections between AMD executives and government officials in the emirate, AMD found itself in a position to sell its fabs to a new company that came to be called GlobalFoundries. It was one of Ruiz’s last acts as CEO.
The idea behind the strategy, called “Asset Smart” at the time, was that the newly fab-less AMD could concentrate on architectural design while the newly AMD-less fabs could seek business from other fab-less companies, taking advantage of capacity that AMD couldn’t use by itself and defraying the cost of improving the facilities.
Incredibly, Ruiz managed to seal the deal by bluffing the Abu Dhabi investors, making them believe that there were other potential investors when there really weren't.
For starters, we made certain that all along Abu Dhabi believed they weren’t the only ones with skin in the game. We never let on—not in nearly two years of talks—that they were the lead bidder. Not even when, toward the end, they were the only bidder. In addition to our conversations with other countries around the world, especially Brazil, we had engaged early on with Saudi Arabia, although talks never went as far or as deep as they had with Abu Dhabi. But I repeatedly dropped hints to Abu Dhabi that the Saudis wanted to be involved (and indeed the negotiations had gone on for at least a year).
Over a breakfast in Italy with [Mubadala CEO Khaldoon Al Mubarak]—who was in the country to meet with Fiat executives—I even made the suggestion that Abu Dhabi and Saudi Arabia consider a joint venture, possibly even building a factory on their shared border. Being from a borderland myself, I liked the sound of it.
Khaldoon nipped that idea in the bud. “No, Hector,” he said. “We’re going to do this ourselves.”
Nothing like a little national pride to motivate negotiations, I thought.
That’s how we played it through all of a difficult 2007—when Abu Dhabi made that initial $608 million investment—and on into 2008, when the industry was suffering a brutal downturn and AMD’s stock plummeted from around $20 a share to $4.
As Ruiz tells it, “closing the deal with Abu Dhabi required that I leave AMD for the new foundry company.” He stepped down as CEO and left the company as of July 2008, after seven consecutive quarterly losses. Ruiz was replaced by Dirk Meyer, leader of the team that had designed the K7 architecture.
Khaldoon Al Mubarak (center) is the CEO of Mubadala Development, the investment company for the Government of Abu Dhabi.
When the deal closed, Intel pounced on this opportunity to gain some leverage in the AMD suit, arguing that the cross-licensing agreement for the x86 instruction set—without which AMD couldn't make or sell x86 chips—applied only to a combined AMD that controlled both the design and the manufacture of those chips. An AMD that contracted manufacturing out to a third party was not authorized to manufacture those processors, Intel said.
Despite AMD’s frustrations, allegations, and lawsuits, it ultimately decided to bury the hatchet with Intel near the end of 2009. While AMD maintained all along that it was not in violation of the cross-licensing agreement with Intel, a new version of the agreement that specifically allowed manufacturing of x86 chips at non-AMD fabs was part of the final settlement, suggesting that this was at least part of what encouraged AMD to settle. It also didn't help that, by 2009, AMD could no longer argue that it was being shut out of the market despite having a technically superior product, as it could in 2005 when the suit was originally filed.
AMD also got $1.25 billion in cash from Intel, plus assurances that Intel wouldn’t engage in anticompetitive practices. The settlement was widely seen as a success for AMD, though the money amounted to little more than a slap on the wrist for Intel, which reported profits of $2.3 billion in the fourth quarter of 2009 alone. While AMD technically came out on top, all it got was a one-time cash infusion and a guarantee that its x86 license wouldn’t disappear. All Intel ultimately had to give up to make the suit go away was a portion of one quarter’s profits and a promise not to do a bunch of stuff that it maintains it wasn’t doing anyway.
From suit to settlement, this entire saga showed how AMD’s technical slip-ups impacted everything else the company was doing. AMD's inability to execute made it more difficult to argue that Intel was holding it back from market success. The execution problems also made it necessary to sell off the fabs, weakening AMD's bargaining position with Intel.

In a bad way

AMD's problems have not gone away. Despite its best efforts in engineering, its share of the PC and server markets continues to slip; despite new products intended for use in Ultrabooks and tablets, its presence in these market segments is negligible. Many computers equipped with AMD CPUs are confined to the low end of the market, where margins are low and the sales numbers are especially sensitive to the gradually slowing demand for PCs.
Former CFO Barton is not convinced that the company has much of a future. "[Even without the lawsuit against Intel,] it wouldn't have mattered," he said. "[Sanders] took his shot, and the game's been played."
Ruiz also doesn't feel bullish on AMD's present chances. He argues that his strategy, while it may have had short-term downsides, would have worked out in the end. "I don't care who you are, if you're a semiconductor company you're going to compete with Intel whether you like it or not," he told Ars in a phone interview in March 2013. "Now if you ask me what is AMD all about? It's not obvious. I can't tell you. When I was there, it was obvious. We were going to compete with Intel and we were going to take market share away from them."
Ruiz's replacement lasted less than three years as CEO and was in turn replaced in late 2011 by current CEO Rory Read. Read had previously been president of Lenovo and before that had been at IBM for over 20 years. He hasn't yet been able to retool AMD; the company’s most recent 10-K SEC filing notes ominously that “approximately 85 percent of our business is focused on the legacy PC portions of the market, projected to have slowing growth over the next several years.”
On Wall Street, AMD has few friends—the stock price has fallen from around $8 at the beginning of 2012 to just above $2 a share today. “Until fairly recently, I was one of the most bullish people on [Wall Street],” said Stacy Rasgon, an analyst at Sanford Bernstein. The turning point for Rasgon came in late 2010, when AMD had trouble shipping its highly anticipated chip, the “Fusion” 32nm chip codenamed Llano.
“They had big problems ramping the yields,” Rasgon said. “I remember if I go back to July 2010 I was the only guy on the Street that was bullish [on AMD]. I was on the [quarterly investors’] call and they had a lot of issues. They said we fixed the problems and Llano is now going to ramp faster and the stock was up 20 percent the next day. Then they pre-announced [the company’s next earnings], saying they’re not going to make it. This was October 2010, and I downgraded the stock not because I didn’t believe the structural story anymore. They had no clue if it was a three-week or a six-month problem.”
In the end, it turned out to be closer to the latter: Llano didn’t ship until April 2011.
“[AMD is] slashing operating expenses and laying off engineers,” Rasgon added. “Last year was [cutting] fat, this year is meat and bones. Their core business is basically collapsing. The client PC [market] is down 13 percent. I was never worried about liquidity, I’m worried about them now. They are faced with a PC market that is not dead, but is not growing anymore.”

Fight for the future

AMD acknowledges its struggles. "Of course, talk is cheap—we need to deliver not just on the product front but also show progress in our financials," said Michael Silverman, an AMD spokesperson. But the company believes it has stanched the bleeding, and AMD officials espouse full confidence in a new strategy that doesn't rely on trying to "out-x86" Intel—by the end of 2013, CEO Rory Read wants 20 percent of the company's revenue to come from other markets.
AMD also retains a sizable portion of both the PC and server markets, though both have shrunk since the company's heyday, and the company has rolled out relatively consistent improvements to core products like Trinity and Steamroller. The graphics division continues to put out gaming GPUs that perform competitively with those from Nvidia. Indeed, the graphics division produced one of the company's rare bits of recent good news: AMD will supply both the GPU and CPU for Sony's PlayStation 4 and is widely expected to do the same for Microsoft's next Xbox. AMD also supplies the GPU for the outgoing Wii and the struggling Wii U. Game consoles are a relative drop in the bucket compared even to the dwindling PC market (Microsoft has sold a little over 70 million Xbox 360s since 2005, and the PC market can generally meet or beat that number in a single quarter), but having its silicon in all three of the major contenders is a chunk of change and a good bit of publicity for AMD.
AMD has essentially missed the boat on tablets and smartphones, though, mostly because of its own short-sighted decisions: the company's low-power Bobcat architecture didn't arrive in time to take advantage of the height of the netbook craze, and AMD hasn't been able to drum up much OEM interest in the architecture for Windows 8 tablets. In 2009, AMD sold its Imageon mobile graphics unit to Qualcomm for a mere $65 million; Qualcomm soon managed to turn this tech into the Adreno GPUs included in its immensely popular Snapdragon processors.
On the desktop side, it appears that the company is focusing on quiet but competent execution—its chips' performance and power usage remain stuck a year or two behind Intel's, but AMD has made steady and on-schedule improvements to the initially disappointing Bulldozer architecture. Still, it's now clear that this is no longer enough.
One market that AMD is really trying to corner is relatively untested today: high-density micro-servers. The company now bets that server rooms at companies like Facebook and Twitter will need to handle a huge quantity of tasks with relatively low processing requirements, and to that end AMD purchased an outfit called SeaMicro in early 2012. This purchase, at least in theory, does two things: it gives AMD an opportunity to demonstrate and sell its upcoming ARM-based Opteron chips to potential customers, and it gets AMD in the business of selling server hardware directly to Web companies rather than to hardware vendors like Dell or HP.
SeaMicro has been working on its “Freedom Fabric,” the software side of the equation, for several years now, and among the ARM licensees in the game today AMD probably has the longest history in the server market. Micro-servers could be a chance for AMD to get a toehold in a lucrative, high-margin market, something the company’s balance sheet desperately needs.
Will it work? Possibly—but AMD has to overcome deeply ingrained problems. "AMD became a large, slothy company and were careless about capital management," said Atiq Raza when asked about the company's chances. "Now, no matter what they do, they've got a great team, they've got good management, unfortunately I think it's very late."
Time may be short, but AMD is used to playing the underdog.
"This path that we’re on... we made the big bet [by purchasing graphics card maker] ATI and big engineering to optimize our engineering such that we’re cranking out all this IP on a single chip that we can mass commercially produce or in a custom configuration for someone who wants to do something unique, like Sony," John Taylor, the director of global business marketing at AMD, told Ars. "We believe it puts us in a place for the first time where we can create a place for sustained differentiation. I have a tremendous amount of faith in the new leadership team at AMD. I can’t remember one day at my time at AMD that didn’t feel urgent."


