Processor makers regularly exaggerate the performance of their chips (remember Intel’s obsession with clock speed?), but AMD is learning that there are limits to what you can claim. It’s facing a class action lawsuit accusing the company of misleading buyers about the number of cores in its Bulldozer-based CPUs. AMD would advertise that a given processor had eight cores, for example, when it effectively had four — each core in AMD-speak was really half of a module, and couldn’t operate independently. As such, that Bulldozer part couldn’t handle as many simultaneous instructions as you’d expect from a true eight-core design. That was bound to be a disappointment if you were a performance junkie expecting eight-way computing in your gaming PC or server.
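The dispute boils down to shared resources: in Bulldozer, the two "cores" in a module share a front end and a single floating-point unit. A toy Python model of how advertised and effective parallelism diverge (simplified numbers for illustration, not AMD's actual microarchitectural behavior):

```python
# Toy model of the core-count dispute: an FX-8350-style part has 4 modules,
# each exposing 2 integer cores that share one front end and one FPU.
MODULES = 4
ADVERTISED_CORES = MODULES * 2  # what the marketing said: 8

def effective_parallelism(fp_heavy: bool) -> int:
    """Threads that can make full progress at once (simplified model)."""
    # Integer workloads can use both cores per module; floating-point-heavy
    # workloads bottleneck on the one shared FPU per module.
    return MODULES if fp_heavy else ADVERTISED_CORES

print(effective_parallelism(fp_heavy=False))  # 8 integer threads
print(effective_parallelism(fp_heavy=True))   # 4 when the shared FPU is the limit
```

This is the crux of the plaintiffs' argument: for some workloads the chip behaves more like a quad-core than the eight-core part on the box.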
Via: Ars Technica
Source: Legal Newsline
Qualcomm has been the biggest player in the mobile SoC market for the past few years, powering a wide selection of Android smartphones and tablets with its range of high and mid performance processors. However, the loss of Samsung as a major customer, troubles with its high-end Snapdragon 810 processor and the growth in cost-effective smartphones has left Qualcomm to rethink their market position with a major workforce restructure.
Compounding Qualcomm’s problems, the company has posted poor financial results for the year so far and is now under investigation by the European Commission regarding predatory pricing tactics in the mobile market. With chinks appearing in Qualcomm’s armor, is there a mobile SoC manufacturer out there who can step up to claim pole position?
Samsung may seem like the most obvious contender, having topped the benchmarks with its Exynos 7420 processor this year. The high-end market is the most lucrative, with higher margins on both the handsets and the processors inside them. As a result, the decline in flagship smartphone sales has hurt Qualcomm’s revenue this year.
Samsung, on the other hand, has seen big gains in its semiconductor business lately due to its own manufacturing capability. The company’s products have ended up benefiting from its cutting edge processor designs and manufacturing processes. Samsung beat Qualcomm to 14/16nm this year, as Qualcomm is reliant on TSMC’s manufacturing facilities.
The trouble with Samsung is that its Exynos line-up of mobile chips has remained virtually exclusive to Samsung phones. Only a handful of manufacturers, such as Meizu, which is rumored to use Samsung’s leading Exynos 7420 in its MX5 Pro, have made regular use of Exynos processors. However, Samsung is gradually building up a portfolio of modern chips that might appeal to a larger range of manufacturers, from the low-end quad-core Exynos 3470, through the last-generation octa- and hexa-core Exynos 5 series, up to its high-end Exynos 7 range.
However, production capacity is also potentially an issue here, with much of its supply used up on its own handsets. New orders from Apple for a new iPhone chip could use up the rest of its manufacturing space. Samsung has been making efforts to invest in additional manufacturing facilities and has been able to eliminate its reliance on Qualcomm for its high-end phones this year, so perhaps the next step is to begin selling to additional OEMs.
If Samsung is Qualcomm’s biggest competitor in the high-end market, then MediaTek is hot on Qualcomm’s heels in the mid-tier. The MediaTek brand has long been synonymous with low-cost mobile products, but the semiconductor company has been rolling out vastly improved mid-range processors over the past couple of years and has grabbed itself a notable portion of this market too.
MediaTek has been at the forefront of big.LITTLE ARM SoC designs, which has resulted in a range of mid-range octa-core processors capable of competitive performance at a fraction of Qualcomm’s costs.
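The idea behind big.LITTLE is straightforward: pair high-performance cores with efficiency cores on one chip and route work according to demand. A minimal sketch of that dispatch decision (hypothetical core names and threshold for illustration; real schedulers, such as Linux's energy-aware scheduling, are far more sophisticated):

```python
# Hypothetical big.LITTLE clusters, modeled on a 2015-era octa-core design:
# a performance cluster (Cortex-A57) and an efficiency cluster (Cortex-A53).
BIG = ["A57-0", "A57-1", "A57-2", "A57-3"]
LITTLE = ["A53-0", "A53-1", "A53-2", "A53-3"]

def pick_cluster(load: float) -> list[str]:
    """Route a task by estimated load (0.0 = idle, 1.0 = sustained peak).
    The 0.5 threshold is an illustrative choice, not a real kernel value."""
    return BIG if load > 0.5 else LITTLE

print(pick_cluster(0.9)[0])  # heavy work lands on a performance core
print(pick_cluster(0.1)[0])  # background work stays on an efficiency core
```

Keeping light tasks on the small cores is where the cost and power savings come from, which is how MediaTek hits competitive performance at lower prices.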
Additional features, such as higher resolution support and built-in 4G LTE data connectivity, have also helped MediaTek level the playing field with Qualcomm in the mid-tier. MediaTek’s latest high-performance Helio processors, such as the deca-core X20, also offer OEMs something to think about when building higher-end products.
Due to the low cost nature of its SoCs, MediaTek has been unaffected by the lack of demand for flagship smartphones and is currently capitalizing on the huge growth in emerging markets, such as China and India. As the next billion smartphone users come online, they may be more familiar with MediaTek than Qualcomm, which could secure the company a significant long-term share in these markets.
However, MediaTek’s problem remains its links to the Chinese government and past controversy regarding various security issues. While things have changed over the past few years, MediaTek’s reputation is perhaps its biggest barrier to challenging Qualcomm in more markets around the world.
If Nvidia’s recent financial results are anything to go by, the company is much more interested in the automotive industry these days than competing with the big names in the mobile SoC market.
Nvidia’s flagship Tegra X1 SoC boasts a cutting edge CPU design based on ARM’s Cortex range and GPU technology from the desktop Maxwell set of graphics cards. The SoC also competes with Qualcomm on display, camera, and audio features, something which not every other SoC developer can claim to do.
While its Tegra chips offer impressive gaming capabilities for its SHIELD console and tablet, Nvidia doesn’t have a portfolio capable of contesting the low and mid segments of the market, and is more focused on 3D performance than efficient smartphone designs these days. However, when it comes to tablets, Nvidia is still a compelling choice when stacked up next to Qualcomm’s processors.
Given how long it has taken Intel to push its processor technology into just a handful of smartphones, the company is unlikely to suddenly leap into pole position. However, Intel is looking to expand into the entry level market, and may be able to steal some share away from MediaTek and Qualcomm, provided the price is right.
Intel is finally looking to push its SoFIA processors, recently rebranded as the Atom X3, into mobile devices this year, along with its “Cherry Trail” Atom X5 and X7 processors.
The Atom X3 pairs an integrated 3G modem with its processor cores, which might make it a more compelling chip for the low end market. However, with many regions and product categories already moving over to 4G, Intel is still a considerable way behind rivals such as MediaTek and Qualcomm.
Furthermore, its X5 and X7 remain without an integrated modem, leaving them mostly targeted at the tablet market. The Atom X3 range is targeted at phones with a retail value of less than $200 where margins are much smaller, so I don’t think that Qualcomm will be too worried.
AMD is perhaps a wild card here. The company has the manufacturing legacy, the CPU and the graphics technology to make a major play for the mobile market, but has so far kept an even greater distance than Intel from the big mobile players.
While the company may be better known for its higher TDP A-Series of laptop processors and GCN GPUs these days, AMD is also a big player in the server business, with multi-core server SoCs built from familiar ARM Cortex CPUs.
The AMD Opteron A-Series was one of the first ranges to make use of quad and octa-core Cortex-A57 CPUs, which you can find in modern mobile SoCs. That’s pretty much where the similarities with mobile chips end, but the company has the experience and know-how to put a mobile product out there if it wanted.
In 2016 or early 2017, depending on how well the company sticks to its schedule, AMD is expected to release its first custom 64-bit ARMv8 CPU core, codenamed K12. This is expected to be built on a 14nm manufacturing process and is rumored to be targeted at embedded applications, notebooks, Chromebooks and perhaps even Android based devices, such as tablets. That being said, servers are expected to remain the primary market for K12.
The company’s latest roadmap showcases products that will bring its GCN graphics technology down to SoCs that fit within a 2W power budget, which is right in the mobile sweet spot. It will be interesting to see if AMD can provide GPU performance that competes with energy-efficient mobile designs from ARM, Imagination Technologies and Nvidia.
However, AMD doesn’t appear to have much of an interest in the smartphone market, in either the premium or the cost-effective segment. Recent interviews suggest that the company is betting big on mid-range laptops returning to popularity in the near future as people look for more productive computing solutions.
We probably won’t see AMD make a major play for smartphones any time soon, but perhaps continued pressures on the laptop market may force the company to revisit its approach in the future.
While many other mobile SoC developers have been improving their product line-ups in 2015, Qualcomm may only be undergoing a temporary lull. The chip giant has its new high-end Snapdragon 820 SoC lined up for next year, which may see the company claw back ground from Samsung and reinstate itself as the performance king.
Qualcomm’s new Snapdragon 212, 412 and 612 fill out a portfolio that continues to offer something for every tier of the market, and the company remains at the cutting edge of modem, ISP and wireless mobile technology. Whether or not someone eventually overtakes Qualcomm remains to be seen, but the company will likely continue to be a major player in the mobile market for many years to come.
Since the rise of 3D graphics cards, the inexorable trend in PC gaming has been around getting bigger, better and faster. That led to a culture of PC gamers obsessing over frame rates and doing whatever it took to push their hardware as much as possible. But now that even relatively affordable graphics cards can hit a silky smooth 60 fps at 1080p, there’s only one big mountain left to climb: 4K gaming. And that’s exactly what a powerhouse card like AMD’s new Radeon R9 Fury X ($650) is poised to tackle. The only problem? 4K gaming still isn’t worth your time and money.
The Radeon R9 Fury X is the sort of thing that’s built expressly to make PC gamers salivate. While the card itself is relatively minimalist with a jet-black design, once it’s turned on you get a blingy glowing “Radeon” logo and LEDs that show off how hard the GPU is working. But, most impressively, the card also has an external water cooler attached, which takes the place of a rear fan in your computer case. It’s not the first video card to ship with water cooling, but it’s an impressive setup nonetheless (although it will make installing the card a bit more complex). It’s also worth noting that the R9 Fury X’s direct competitor, NVIDIA’s GTX 980 Ti, ships with air cooling. That’s a sign of much more power-efficient hardware. (I would have liked to compare the two cards directly, but I’m still waiting on review hardware from NVIDIA.)
While the R9 Fury X can achieve speeds of up to 1050MHz out of the box, its water cooling setup could lead to some decent overclocking potential down the line. I didn’t want to risk harming my loaner card from AMD, but initial overclocking attempts by AnandTech led to modest (75MHz) gains. With some more tweaking, though — especially going beyond the limits AMD implements in its desktop software — I wouldn’t be surprised if you could reach higher speeds. Then again, given how fast the card is already (it also packs in 4GB of “high-bandwidth memory” RAM), you might not want to bother with the whole mess of overclocking.
On my gaming rig — which consists of a 4GHz Core i7-4790K CPU, 16GB of 2400MHz DDR3 RAM and a 512GB Crucial MX100 SSD on an ASUS Z97-A motherboard — the R9 Fury X didn’t break a sweat when gaming in 1080p with every setting on high. No surprise there (and if that’s all you’re looking for, consider the plethora of sub-$300 cards out there). But once I started testing out games in 4K (with a Samsung UE590 monitor loaned by AMD), the card truly started to shine. Both The Witcher 3: Wild Hunt and Batman: Arkham Knight got around 35 fps on average with high-quality settings, and while that might not sound like much, the fact that they’re both beyond 30 fps is a decent show of progress from last year’s cards. It means you can actually play those games in 4K without any noticeable stuttering.
But enough of the numbers: How do games look in 4K? For the most part, pretty darn great. In The Witcher 3, in particular, I was able to make out even finer detail in character models, their clothing and the overall environment. But I also quickly realized that minor bump in fidelity wasn’t worth giving up the 1080p at 60 fps I was used to, which looks a lot smoother. Moving The Witcher’s Geralt of Rivia around the game’s incredibly detailed environments felt less jerky and more lifelike at 60 fps than in 4K. Basically, it’s hard to get used to lower frame rates when 60 fps was the ideal I’d been striving toward for years. There were also occasions where games dipped below 30 fps, which was hard to stomach on a $650 video card.
On a broader level, 4K isn’t really worth the investment for most PC owners; 4K monitors are still relatively expensive, starting at around $400 to $500 for 27-inch models (1080p screens are around half that), and their panels typically aren’t as high-quality as lower resolution screens. Some 4K monitors only offer 30Hz refresh rates, which limits your gaming to 30 fps and leaves little room for graphics upgrades down the line. (The monitor I’m using advertises 60Hz 4K, but I’ve been unable to reach that with multiple cables.) And, perhaps most damning, Windows 7 and 8 still aren’t well-suited to 4K screens. You’d have to upgrade to Windows 10, which offers much better high-resolution scaling, for a decent 4K experience.
I found that gaming at a 2,560 x 1,440 (WQHD) resolution was the best compromise between fidelity and frame rate. It’s sharper than 1080p (which runs at 1,920 by 1,080), and the R9 Fury X was able to reach 60 fps in that resolution easily. You’ll still pay a premium for WQHD displays, but models like the Dell UltraSharp U2715H (which our friends at The Wirecutter recommend as the best 27-inch monitor) sport high-quality IPS panels, so they’ll look a lot better than many 4K monitors. Plus, 2,560 x 1,440 on a 27-inch monitor is also a usable resolution for desktop work — no microscope required.
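The fidelity-versus-frame-rate trade-off is easiest to see in raw pixel counts. A quick back-of-the-envelope calculation over the standard resolutions:

```python
# Pixels the GPU must shade each frame, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "WQHD":  (2560, 1440),
    "4K":    (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.2f}x 1080p)")
```

4K is a full 4x the pixel workload of 1080p, while WQHD is only about 1.78x — under half of 4K’s load — which is why a card that manages ~35 fps in 4K can comfortably hold 60 fps at 2,560 x 1,440.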
At this point, 4K gaming feels like the worst aspects of PC gaming: expensive and counterintuitive, with radically diminishing returns. It’s a badge of honor if you have a system that can actually play games in 4K, and nothing more. It could eventually become commonplace for gaming, especially as VR headsets demand more pixels, but for now you’d be better off trying to get the highest frame rate you can with a lower resolution.
You can get desktop PC displays that are curved, super-wide and gaming-friendly, but all three at once? That’s tricky. Thankfully, Acer thinks it has an answer. The company has just launched the 34-inch XR341CK in the US, giving you a curvy, 21:9 aspect ratio LCD with AMD’s anti-tearing FreeSync tech built-in. So long as you have a fast-enough gaming rig (including newer AMD graphics, if you want FreeSync), you’ll get an extra-immersive canvas for your first-person shooters and racing sims.
You won’t get 4K (this is “just” a 3,440 x 1,440 LCD), but you’ll still find DisplayPort input, Mini DisplayPort, HDMI 2.0 and a USB 3.0 hub. There’s also a 14W speaker system if the screen takes up the free space you’d normally use for audio gear. This monster monitor will cost $1,099 when it ships in July — no small potatoes, but potentially worth it if you’d otherwise get multiple displays to achieve the same all-encompassing effect.
Source: Acer (PRWeb)
Only a few weeks after NVIDIA debuted its latest high-end card, the GTX 980 Ti, AMD is now showing off its latest wares. And if you’re looking for a powerful video card, your decision just got a lot more complicated. Leading the pack is AMD’s new R9 Fury X, a liquid-cooled powerhouse with the company’s new “Fiji” GPU design and high-bandwidth memory (HBM) technology. At $649, it’s going head-to-head with NVIDIA’s 980 Ti. But if you don’t need all that power, there’s also the Radeon R9 390X ($429), R9 390 ($329), and R9 380 ($199), all of which offer DirectX 12 support (making them ideal for Windows 10) and enough power to let you game in 4K (though we’d imagine that’d be a stretch with the cheaper entry). And if you’re just looking for something affordable, there’s also the R7 370 ($149) and R7 360 ($109), which are more focused on delivering solid 1080p gaming.
Just like NVIDIA, AMD now has new cards for just about every gaming price point. If you’ve got the dough and care about getting as much graphics power as possible, the R9 Fury X is made for you. If you want the most bang for your buck, the $329 R9 390 might be your best investment (it’s also the same price as NVIDIA’s GTX 970). AMD isn’t divulging technical details around the new cards yet, but we’re expecting to hear more later this week.
LG will be the first to sell a 4K monitor with AMD FreeSync technology, beating models from rival Samsung by a nose. The technology in its 27-inch 27MU67 is similar to NVIDIA’s G-Sync, matching monitor and GPU refresh rates to eliminate tearing, stutter and other gaming issues — as long as you have a compatible AMD graphics card or chip. Like Samsung, LG makes its own panels and it shows in the specs. The IPS screen supports a 40 to 60Hz adaptive refresh range with 9.7 milliseconds of input lag, while being decent for color pros with 99 percent Adobe RGB coverage and 10-bit interpolated color. It’ll come calibrated out of the box later this month at select retailers for $599 — quite a drop in price from last year.
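What FreeSync (and G-Sync) actually fixes is the mismatch between when the GPU finishes a frame and when a fixed-refresh panel scans it out. A small simulation of that, assuming a GPU rendering at a steady 45 fps on a 60Hz panel with v-sync (illustrative model, not a measurement of this monitor):

```python
import math

GPU_MS = 1000 / 45    # a new frame is ready every ~22.2 ms
SCAN_MS = 1000 / 60   # a fixed 60Hz panel refreshes every ~16.7 ms

ready = [n * GPU_MS for n in range(6)]
# Fixed refresh with v-sync: each frame waits for the next scanout.
fixed = [math.ceil(t / SCAN_MS - 1e-9) * SCAN_MS for t in ready]
# Adaptive sync: the panel refreshes the moment the frame is ready.
adaptive = ready

fixed_gaps = [round(b - a, 1) for a, b in zip(fixed, fixed[1:])]
sync_gaps = [round(b - a, 1) for a, b in zip(adaptive, adaptive[1:])]
print(fixed_gaps)  # uneven mix of 16.7 ms and 33.3 ms gaps: judder
print(sync_gaps)   # even ~22.2 ms gaps: smooth motion
```

Those alternating 16.7/33.3 ms gaps are what you perceive as stutter; letting the panel wait for the GPU evens them out without tearing.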
With AMD’s new sixth-generation A-series processors, laptops in the $400 to $700 range could soon become far more capable. Formerly code-named “Carrizo,” the new chips promise twice the gaming performance of Intel’s Core i7, thanks to their built-in Radeon graphics. They’re the first mainstream processors with hardware decoding for H.265/HEVC video, the successor to the current H.264 standard, which offers far better compression and support for 4K resolutions. And they’ll also pack in up to 12 compute cores (four CPU and eight GPU), which basically means they’ll be able to handle whatever you throw at them. Why focus on mainstream laptops? AMD notes that it’s the largest segment of the PC market by revenue and volume sold, so it makes sense for a company that’s traditionally focused on value to show it some love.
AMD’s basically gunning hard for the gaming and media crowd with its sixth-generation chips. Naturally, they include its dual graphics technology, which can summon the power of Radeon R7 graphics alongside their built-in graphics for a 42 percent jump in frame rates. They also combine game performance with hardware video encoding, which should make for much smoother game streams on Twitch. Battery-wise, AMD says the new chips will last twice as long as their predecessors, even when watching video or gaming.
While it all sounds good on paper, AMD will face some stiff competition from Intel’s new Broadwell-H chips, which were announced yesterday. Those chips also double graphics performance and boost overall media performance. Intel’s chips seem pretty expensive though, ranging from $244 to $623, and while we don’t know the pricing of AMD’s new wares yet, they’re usually cheaper than Intel. So there’s a good chance AMD can keep up the value fight. We’ll find out for sure once the new chips hit the market. AMD says computers featuring the sixth-gen A-series will start shipping in June, while Intel expects its chips to hit computers within two months.
Intel isn’t the only chip giant championing battery life over performance this year. AMD has revealed Carrizo, a processor range that’s focused heavily on extending the running time of performance-oriented laptops. While there will be double-digit boosts to speed, there’s no doubt that efficiency is the bigger deal here. The new core architecture (Excavator) is just 5 percent faster than its Kaveri ancestor, but it chews up 40 percent less energy at the same clock rate — even the graphics cores use 20 percent less juice.
Not that this is the only real trick up AMD’s sleeve. Carrizo is the first processor to meet the completed Heterogeneous System Architecture spec, which lets both the CPU and its integrated graphics share memory. That lets some tasks finish faster than they would otherwise (since you don’t need as many instructions), and it could provide a swift kick to both performance and battery life in the right conditions. You’ll also find dedicated H.265 video decoding, so this should be a good match for all the low-bandwidth 4K videos you’ll stream in the future.
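The benefit of that shared-memory design can be sketched with a toy cost model (illustrative numbers, not AMD benchmarks): a discrete-style pipeline pays to copy a buffer into GPU memory and copy the result back, while an HSA-style pipeline lets the GPU compute on the CPU’s buffer directly.

```python
def discrete_cost(items: int, copy: float = 1.0, compute: float = 0.5) -> float:
    """Copy to GPU memory, compute, then copy the result back."""
    return items * (copy + compute + copy)

def hsa_cost(items: int, compute: float = 0.5) -> float:
    """CPU and GPU share one address space: no copies, just compute."""
    return items * compute

n = 1_000_000
print(discrete_cost(n) / hsa_cost(n))  # copies dominate for short GPU tasks
```

With these made-up unit costs the copies account for most of the total, which is why eliminating them can speed up exactly the small, frequent tasks that were never worth offloading to a GPU before.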
The new chip is pretty promising as a result. With that said, its creator will undoubtedly be racing against time. Carrizo is expected to reach shipping PCs in the second quarter of the year, or close to Intel’s mid-year target for its quad-core Broadwell processors. You may find shiny new AMD and Intel chips in PCs at around the same time — that’s good news if you’re a speed junkie, but it’s not much help to AMD’s bottom line.
Microsoft may have already cut the Xbox One’s price to $349 for the holidays, but there are hints that the game console may get a permanent price drop before too long. An AMD chip design manager recently updated his LinkedIn resume (since made private) with word that he worked on a more efficient, “cost-reduced” version of the Xbox One’s processor. There aren’t any clues as to when this spruced-up silicon will arrive, but recent AMD roadmap leaks suggest that its first CPU architecture based on the technology will ship in 2015. In other words, you could be buying a cheaper Xbox by this time next year.
Processor upgrades are nothing new in the console world. They’re crucial for both price cuts and smaller, cooler-running systems that aren’t so noisy. However, there’s added urgency this time around — the PlayStation 4 has been outselling the Xbox One due in part to its initial price advantage, and lower prices should both help Microsoft stay competitive and leave more money in your wallet.
Source: Mosen (Beyond3D Forums)
It feels like just yesterday that AMD brought in Rory Read to turn around its ailing fortunes, but today there’s another changing of the guard. The chip designer has announced that chief operating officer Lisa Su is its new CEO, effective immediately; Read will stick around as an advisor until the end of the year. The company isn’t going into detail about the reasons behind the shift, but it does say that Read has been planning a succession with the board of directors. It’s an “ideal time” for Su to take the reins, the board’s Bruce Claflin says.
Su hasn’t said much about what she’ll do so far, but she hints that she’s likely to continue a strategy of crafting chips for “diverse” categories (think game consoles and mobile devices) that Read used to great success. Whatever the new CEO does, she likely has the right background for it. Su spent five years heading up technology development at chipmaker Freescale before she joined AMD in 2012, and she spent 13 years in semiconductor- and business-related positions at IBM before that. It’s safe to presume that she knows a thing or two about the importance of super-efficient silicon.
Via: New York Times