Anandtech

This channel features the latest computer hardware related articles.

AMD Radeon R9 285 Review: Feat. Sapphire R9 285 Dual-X OC

Wed, 2014-09-10 11:00

Last month AMD held their 30 years of graphics celebration, during which they announced their next Radeon video card, the Radeon R9 285. Designed to be AMD’s new $249 midrange enthusiast card, the R9 285 would be launching on September 2nd. In the process the R9 285 would be a partial refresh of their R9 280 series lineup, supplying it with a new part that would serve to replace their nearly 3-year-old Tahiti GPU.

The R9 285 is something of a lateral move for AMD, the likes of which we very rarely see in this industry. The R9 285’s immediate predecessor, the R9 280 (vanilla), has been on the market with an MSRP of $249 for nearly 4 months now. Meanwhile the R9 285 is not designed to be meaningfully faster than the R9 280 – in fact if you looked at the raw specifications, you’d rightfully guess it would be slower. Instead the R9 285 is intended to serve as a sort of second-generation feature update to the R9 280, replacing it with a card at the same price and roughly the same performance level, but with 3 years’ worth of amassed feature updates and optimizations.

Categories: Tech

LaCie d2 Thunderbolt 2 DAS Review

Wed, 2014-09-10 09:55

Seagate's premium storage brand, LaCie, has been introducing a wide variety of Thunderbolt 2 products since late last year. Today, we are seeing the launch of a hybrid direct-attached storage (DAS) unit with both USB 3.0 and Thunderbolt 2 connections in the d2 Thunderbolt 2. The differentiating aspect is the availability of a full-speed PCIe SSD add-on which adds another storage module at the expense of the USB 3.0 port. We took the unit for a spin using our Windows-based Thunderbolt 2 setup. Read on to see how the unit performs.

Categories: Tech

Acer Announces Two Frameless Monitors: the UHD 27" S277HK and WQHD 25" H257HU

Wed, 2014-09-10 05:00

On Tuesday in Taiwan, Acer announced two monitors that might be worth a look for anyone looking to put a couple of multi-monitor setups together, or interested in an attractive design combined with high resolution. The first is the S277HK, which is a 27” UHD/4K model, and the second is the H257HU which is a 25" WQHD model.

The S277HK is the first 4K monitor with a frameless design according to Acer. In addition to the 3840x2160 resolution for the IPS panel, the 27” model also has DTS surround sound though Acer does not go into specifics on how the audio is achieved. With an asymmetric stand and aluminum bezel, the S277HK certainly looks as premium as the specs would indicate. Connectivity is DVI, HDMI 2.0, and DisplayPort 1.2.

S277HK (left) and H257HU (right),
images courtesy of TechPowerUp

The second monitor announced is the H257HU, which also features the frameless design, but the IPS panel is slightly lower resolution at 2560x1440 (WQHD). This monitor also features DTS sound, a round base, and DVI, HDMI 2.0, and DisplayPort 1.2 inputs.

'Frameless' is a little bit of a misnomer, as typically all monitors will have some sort of edge bezel. Among the smallest-bezel monitors on sale, sub-6mm is quite common, although specialist models exist that might go smaller. The H257HU, judging from the sole small image we have had access to, looks like it has a larger screen-to-edge distance, despite the 'frameless' moniker.

The IPS panels on these devices also include features to assist with eye strain including a Flicker-less technology to reduce screen flicker, a blue light filter which Acer claims helps with long term eye damage, and a Low Dimming technology to allow the backlight to be set as low as 15% for low light environments. Both monitors also include “ComfyView” to assist with screen reflections.

Both models will be available globally starting in Q4 2014. Refresh rates, color accuracy, and prices have not been announced at this time.

Source: Acer

Categories: Tech

Analyzing Apple’s A8 SoC: PowerVR GX6650 & More

Wed, 2014-09-10 02:00

With their iPhone keynote behind them, Apple has begun updating some of their developer documentation for iOS to account for the new phone. This of course is always a fun time for tech punditry, as those updates will often include information on the hardware differences in the platform, and explain to developers the various features that different generations of hardware can offer developers.

To that end we have compiled a short analysis of the A8 SoC based on these documents and other sources. And we believe that at this point we have a solid idea of the configuration of Apple's latest SoC.

Apple SoC Specifications
       Apple A6               Apple A7                Apple A8
CPU    Swift @ 1.3GHz (x2)    Cyclone @ 1.3GHz (x2)   Enhanced Cyclone @ 1.4GHz (x2)?
GPU    PVR SGX543MP3          PVR G6430               PVR GX6650
RAM    1GB LPDDR2             1GB LPDDR3              1GB LPDDR3?
A8’s GPU: Imagination PowerVR Series6XT GX6650

On the GPU front, this year appears to be especially bountiful. After being tipped off to an update to Apple’s Metal Programming Guide, we can now infer with near certainty what the A8 GPU is.

New to this edition of the Metal Programming Guide is a so-called iOS_GPUFamily2, which joins the existing iOS_GPUFamily1. We already know that the iOS_GPUFamily1 is based around Imagination’s PowerVR Series 6 G6430 GPU, so the real question is what does iOS_GPUFamily2 do that requires a separate family? The answer, as it turns out, is ASTC, the next-generation texture compression format that is being adopted by GPU vendors over the next year or so.

Imagination’s PowerVR Series6 family of GPUs predates ASTC and as a result iOS_GPUFamily1 does not support it. However we know that Imagination added support for it in their Series6XT designs, which were announced at CES 2014. Coupled with the fact that Apple’s documentation supports the idea that all of their GPUs are still TBDR (and thus PowerVR), this means that the GPU in the A8 must be a Series6XT GPU in order for ASTC support to be present.

This leaves the question of which of Imagination’s four Series6XT designs Apple is using. Imagination offers a pair of 2 core designs, a 4 core design (GX6450), and a 6 core design (GX6650). Considering that Apple was already using a 4 core design in the A7, we can safely rule out the 2 core designs. That leaves us with the GX6450 and GX6650, and to further select between those options we turn to Apple’s A8 performance estimates.

Apple SoC Evolution
      CPU Perf   GPU Perf   Die Size   Transistors   Process
A5    ~13x       ~20x       122mm2     <1B           45nm
A6    ~26x       ~34x       97mm2      <1B           32nm
A7    40x        56x        102mm2     >1B           28nm
A8    50x        84x        89mm2      ~2B           20nm

The A8 is said to offer 84x the GPU performance of the iPhone 1, while last year Apple stated that the A7 offered 56x the iPhone 1’s performance. As a result we can infer that the A8 must be 1.5x faster than the A7, a nice round number that makes it easier to determine which GPU Apple is using. Given Apple’s conservative stance on clockspeeds for power purposes and the die space gains from the 20nm process, accounting for a 50% performance upgrade is easily done by replacing a 4 core G6430 with the 6 core GX6650. At equal clockspeeds the GX6650 should be 50% faster on paper (matching Apple’s paper numbers), leading us to strongly believe that the A8 is utilizing a PowerVR Series6XT GX6650 GPU.
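
As a quick sanity check on that reasoning, here is a minimal back-of-the-envelope sketch in Python. The 56x and 84x figures are Apple's published baselines relative to the original iPhone; the core counts are those of the G6430 and GX6650, and equal clockspeeds are assumed:

```python
# Apple's published GPU performance figures, relative to the original iPhone
a7_gpu_vs_iphone1 = 56.0
a8_gpu_vs_iphone1 = 84.0

# Implied generational gain of the A8's GPU over the A7's
gpu_gain = a8_gpu_vs_iphone1 / a7_gpu_vs_iphone1
print(f"A8 GPU vs. A7 GPU: {gpu_gain:.2f}x")          # 1.50x

# Paper core-count ratio of a 6-core GX6650 vs. the A7's 4-core G6430,
# assuming equal clockspeeds
core_ratio = 6 / 4
print(f"GX6650 vs. G6430 cores: {core_ratio:.2f}x")   # 1.50x, matching the figure above
```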

Once the iPhone 6 is out and Chipworks can photograph the SoC, this should be easy to confirm. If Apple is using a GX6650 then the die size of the GPU portion of the A8 should be very similar to the die size of the GPU portion of the A7. Otherwise if it is the 4 core GX6450, then Apple should see significant die size savings from using a 20nm fabrication process.

A8’s CPU: A Tweaked Cyclone?

Though we typically avoid rumors and leaks due to their high unreliability, after today’s presentation by Apple we have just enough information on the A8’s CPU performance to go through the leak pile and start picking through the leaks. From that pile there is one leak in particular that catches our eye, due to the fact that it matches Apple’s own statements.

On Monday night a supposed Geekbench 3 score of the iPhone 6 was posted. In this leak the iPhone 6 was listed as having a single-core score of 1633 points and a multi-core score of 2920 points. Curiously, these values are almost exactly 25% greater than the Geekbench 3 scores for the iPhone 5S (A7), which are 1305 points and 2347 points respectively.

What ties all of this data together is that in their iPhone 6 presentation, Apple informed viewers that the iPhone 6 is 25% faster than the iPhone 5S. This data was later backed up with their latest CPU performance graph, which put the iPhone 6 at a score of 50x versus a score of 40x for the iPhone 5S.

Given Apple’s data, it looks increasingly likely that the leaked Geekbench 3 results for the iPhone 6 are in fact legitimate. The data leaked matches Apple’s own performance estimates, and in fact does so very well.

In that case we can infer a couple of points about the A8’s CPU, starting with the clockspeed. Given no other reason to doubt this data at the moment, and given Apple’s preference for low-clocked SoCs, the 1.4GHz reading appears legitimate, which would make it a 100MHz increase over the 1.3GHz A7 found in the iPhone 5S.

However the fact that it’s only a 100MHz increase also means that clockspeeds alone cannot account for the full 25% performance gain that Apple is promoting and that these Geekbench results are reporting, as 1.4GHz is only a roughly 8% clockspeed increase over 1.3GHz. This in turn means that there must be more going on under the hood to improve the A8’s CPU performance beyond clockspeed alone, which rules out a straight-up reuse of Apple’s Cyclone CPU.

Since Apple already had a solid ARMv8 architecture with Cyclone, there’s no reason to believe that they have thrown out Cyclone so soon. However this does strongly suggest that Apple has made some unknown revisions to Cyclone to further boost its single-threaded (Instruction Level Parallelism) performance. What those tweaks are remains to be seen, as we would need to benchmark the A8 in depth to even try to determine what Apple has changed, but for the moment it looks like we’re looking at an enhanced or otherwise significantly optimized version of Cyclone. And given Apple’s already high ILP, squeezing out another 16% or so would be a significant accomplishment at this time, especially for only a year’s turnaround.
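
To make the arithmetic behind that ~16% figure explicit, here is a small sketch in Python; it uses the leaked single-core Geekbench 3 scores (treated as unconfirmed) and the assumed 1.3GHz/1.4GHz clockspeeds:

```python
# Leaked Geekbench 3 single-core scores (unconfirmed)
iphone_5s_score = 1305
iphone_6_score = 1633

# Assumed CPU clockspeeds
a7_clock_ghz = 1.3
a8_clock_ghz = 1.4

overall_gain = iphone_6_score / iphone_5s_score   # ~1.25x, matching Apple's 25% claim
clock_gain = a8_clock_ghz / a7_clock_ghz          # ~1.08x from frequency alone
per_clock_gain = overall_gain / clock_gain        # ~1.16x must come from the architecture

print(f"Overall: {overall_gain:.2f}x, clock: {clock_gain:.2f}x, per-clock: {per_clock_gain:.2f}x")
```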

1GB of RAM

Last but not least, the apparent validity of the Geekbench 3 leak means that one last piece of information on the A8 can apparently be confirmed: the earlier rumors about it being paired with 1GB of RAM are true. Unfortunately Apple’s official product image of the A8 is of no help here – it’s clearly a doctored version of the A7 image based on the product numbers attached – but this information is consistent with earlier rumors based on leaked images of the real A8, which had also suggested the SoC contained 1GB of RAM. Again this is based on what we believe is a sound assumption that the Geekbench 3 leak is accurate, since it so closely matches Apple’s own CPU performance estimates, but at this point we don’t have any substantial reason to doubt the data.


Image Courtesy Macrumors

The good news is that this is going to be the easiest aspect of the iPhone 6 to confirm, since diagnostic apps will be able to query the phone for the RAM amount. So one way or another we should know for sure come September 19th.

Categories: Tech

Quick Thoughts on Apple Watch

Tue, 2014-09-09 17:16

While I'm still unsure on what wearables should actually do, I managed to get some photos of the Apple Watch. Unfortunately I wasn't really able to find any units available for a hands-on, and as far as I can tell it wasn't possible to actually try any of the software yet. However, based upon what I've seen Apple brings at least a few great ideas to the table. The digital dial/crown is definitely one of them, as it opens up the door to all kinds of new possibilities for navigation that are currently either impractical or impossible for wearables that don't have this hardware feature. In addition, Apple's strong emphasis on personalization with two sizes, three editions, and six watch bands is something that all OEMs should pay attention to. Finally, the dedicated SoC for the Apple Watch is something that is absolutely necessary to enable a good user experience as space is so critical on these wearables. There's also no question that Apple has done a great job of focusing on industrial and material design, as it looks like all three versions of the watch have premium materials and excellent fit and finish. While it isn't clear what display is used, it seems likely that it's an OLED display judging by the amount of black in some of the watchfaces, although ambient lighting in the demo area made it hard to tell whether this was the case.

However, my reservations are largely similar to concerns that I have with all wearables. Ultimately, the Apple Watch must provide utility that's strong enough to make me turn around and get it if I forget it. As-is, I don't really think that even the Apple Watch has that level of utility, even if it is excellently executed. Of course, this is also based upon a demo unit that I wasn't able to touch or use.

Of course, a few concerns remain, mostly in the area of battery life as it seems that only the Pebble line of wearables can really deliver enough battery life to not worry about charging a wearable on any sort of regular schedule. At any rate, I've attached a gallery of photos below for those interested in seeing all the various combinations of watches that Apple will make.

Gallery: Apple Watch

Categories: Tech

IDF 2014: Intel Edison Development Platform Now Shipping

Tue, 2014-09-09 17:00

As part of today’s IDF 2014 keynote, Intel has announced that their Edison development platform is now shipping.

First announced back at CES, Edison is a development platform for Intel’s burgeoning Internet of Things development initiative. At just over the size of a postage stamp and containing a dual-core Atom CPU and a Quark CPU, Intel is hoping to jump-start development of tiny internet-connected x86 devices by providing a complete platform for developers to use in their devices. Edison in this sense is closer to a platform on a chip than a system on a chip; it not only contains the processors and RAM, but also the wireless components, right on down to the antennas.

Intel Edison Development Platform
CPU          Dual-Core Silvermont Atom @ 500MHz + Quark @ 100MHz
RAM          1GB LPDDR3 (2x32bit)
WiFi         2.4/5GHz 802.11a/b/g/n, BT 4.0
Storage      4GB eMMC
I/O          SD + UART + SPI + GPIO + USB 2.0 OTG
OS           Yocto Linux v1.6
Dimensions   35.5 x 25 x 3.9 mm

Fabbed on Intel’s 22nm process, the Edison platform contains an interesting setup that pairs a dual-core Silvermont Atom running at 500MHz with a Quark running at 100MHz. The Atom in this sense is the primary processor for the SoC, while the Quark serves as an embedded microcontroller responsible for running other functions on the platform. Intel pairs these processors up with 1GB of LPDDR3 in a 2x32bit configuration and 4GB of eMMC NAND for storage. Wireless meanwhile is provided by a Broadcom 43340, which offers dual-band 802.11a/b/g/n and Bluetooth 4.0.

The Edison platform itself can be further attached to additional development boards to expand its functionality and access its I/O. Intel is demonstrating both a simple USB board and a more complex development board. And as Edison offers Arduino compatibility, it can also take advantage of the current Arduino development ecosystem.

Finally, Intel tells us that the Edison module itself is expected to retail for around $50. Meanwhile the breakout board kit and the Arduino kits will sell for $60 and $85 respectively.

Categories: Tech

IDF 2014: Intel Demonstrates Skylake, Due H2’2015

Tue, 2014-09-09 16:30

Taking place this week alongside the consumer electronics clamor is the annual Intel Developer Forum (IDF) at the Moscone Center in San Francisco. Though it has been and continues to be first and foremost a developers conference, IDF also offers Intel a chance to unveil new products, and in more recent editions to discuss and promote their plans for further breaking into the mobile market.

Diving right into the subject of Intel’s Core microarchitecture, with the Broadwell based Core M already in the process of launching, Intel is giving developers and the public a look at what comes after Broadwell. Already on Intel’s roadmaps for some time, Intel took to the stage at IDF14 to formally announce their next-generation Skylake architecture and to demonstrate its status.

Intel's Tick-Tock Cadence
Microarchitecture   Process Node   Tick or Tock   Release Year
Conroe/Merom        65nm           Tock           2006
Penryn              45nm           Tick           2007
Nehalem             45nm           Tock           2008
Westmere            32nm           Tick           2010
Sandy Bridge        32nm           Tock           2011
Ivy Bridge          22nm           Tick           2012
Haswell             22nm           Tock           2013
Broadwell           14nm           Tick           2014
Skylake             14nm           Tock           2015

In Intel terminology Skylake is the Tock to Broadwell’s Tick, offering a new microarchitecture atop the 14nm process first introduced with Broadwell. As is the case with every Core update, for Skylake Intel is shooting for significant increases in performance, power efficiency, and battery life. Since Skylake is built on the same 14nm process as Broadwell, Skylake is primarily an exercise in Intel’s architecture development capabilities, with its gains needing to come from optimizations in design rather than significant manufacturing improvements.             

At roughly a year out from launch Intel is not saying anything about the architecture or design at this time, but they are using IDF to showcase that Skylake is up and running. Demonstrating this, Intel showcased a pair of Skylake development systems. The first was a traditional open laboratory testbed running 3DMark, used to showcase that the GPU and CPU portions of Skylake were running and performing well. The second demonstration was a completed laptop that was playing back 4K video, an early version of the hardware Intel will be shipping as the software development vehicle for developers next year.

Alongside their demonstration, Intel also announced a rough timeline for the volume production and availability of Skylake. Volume production will take place in H2’2015, with product availability slated for later in the year. With Broadwell being behind schedule due to a slower than planned bring-up of their 14nm process, there has been some question over what would happen with Skylake and Intel clearly wanted to address this head-on.

Consequently a big part of Intel’s message on Skylake is that the next generation CPU is already up and running and is in a healthy state, apparently unfazed by the earlier 14nm delays that dogged Broadwell. At the same time the H2’2015 launch date for Skylake means that it’s going to be out roughly a year after the first Broadwell parts, which means Intel still intends to adhere to their roughly 1 year product replacement cadence.

Categories: Tech

Understanding Dual Domain Pixels in the iPhone 6 and iPhone 6 Plus

Tue, 2014-09-09 16:20

In the launch announcement, Apple stated that their new display has dual domain pixels, which improve viewing angles. Unfortunately they dropped the subject at that, which leaves a lot of room for confusion. Anyone that does a cursory search through Google will only find references to this type of display in monitors for medical use or similar technology used by IBM monitors.

However, dual domain pixels are actually not as complicated as they seem. In fact, this is a display technology I remember seeing with the announcement of LG’s AH-IPS technology back in 2011. For those interested in the technical definition, dual domain pixels refers to the fact that the electrodes in the pixels aren’t all aligned. Instead, they’re skewed when viewed from the perspective of the lines defined by the rectangular edges of the display. Because these subpixels are skewed, it’s possible to compensate for uneven lighting that occurs because each individual subpixel is viewed at a different angle, which causes a change in color and a faster fall-off of contrast.

At any rate, this is easiest to explain with a photo. Above, we see the pixel layout of the iPhone 5. This is the standard rows/columns of pixels, and not really news to anyone that knows how displays work. Let's look at a dual domain arrangement next.

Anyone that has tried the HTC One (M7) or One X will probably understand the effect of this change as these phones have had this type of skewed subpixel format to get better viewing angles and less color shifting with changes in viewing angles. This can carry some risk though as black backgrounds may have some color shifting towards purple instead of yellow/blue, which can look strange but is quite subtle in my experience. There's really not too much in the way of disadvantages, so I look forward to seeing how Apple's new displays do in our tests.

Categories: Tech

Hands on with the iPhone 6 and iPhone 6 Plus

Tue, 2014-09-09 16:15

When it comes to the iPhone 6, one of the most immediate impressions will definitely be the industrial and material design. Going back to the launch of the original iPhone 5 one of the immediate impressions that we had was that the iPhone 5 felt incredibly light and thin. If nothing else, the same is true of the iPhone 6.

While the new iPhone 6 isn’t lighter than the iPhone 5, it feels incredibly thin compared to the iPhone 5s I had on hand for comparisons. In fact, the iPhone 6 feels a lot like the HTC 8X in terms of the thickness of the edge, but without the strongly sloped back to increase the size of the phone in the hand.

The size itself is also a key feature, and as I suspected the iPhone 6 feels very much like the One (M7) in size, which I still find to be a great fit and easily used with one hand. While it’s definitely possible for the iPhone 6 to be a bit bigger without being impossible to reasonably use with one hand, it manages to hit a good balance between ease of use with one hand and display size for media consumption.

Of course, the iPhone 6 Plus isn’t really easy to use with one hand, as just the 77.8mm width makes it difficult to reach across the display horizontally, much less diagonally. It is definitely easy to hold with one hand though, and the rounded display feels great.

In terms of the design of the device, it’s clear that Apple had to break some trends that seemed to be present in previous iPhones. For one, the noticeable camera bump came from a need to maintain and/or improve camera quality while simultaneously driving down z-height overall, so there seems to have been an industrial design trade-off here for the sake of functionality. There’s also the relatively thick plastic lines which are a departure from previous designs but seem to be necessary for NFC capabilities. I’m personally unsure how I feel on these two design elements, but they may be an issue for some.

Looking past the size of the iPhone 6, there are a lot of noticeable subtle changes to the device compared to the iPhone 5s. In terms of low-hanging fruit, the side-mounted power button definitely helps with keeping a firm grip while turning on the phone, and I didn’t find any real issues when trying to turn the phone on or off. The slightly curved glass that helps to make for a smooth transition when swiping off the edge of the display is also a nice touch, although I’m concerned about the implications that this has for drop resilience and screen protectors. This is mostly based upon my past experience with such 2.5D displays, as traditional PET screen protectors generally don’t adhere properly to curved surfaces and Android phones with this type of cover glass tended to suffer from shattered displays more readily.

There are some changes that are subtle enough that I’m not sure whether they’re simply a product of production variance. In the iPhone 6 and 6 Plus that I tried, I noticed that the home button seemed to be closer to the display when compared to the iPhone 5s, and that the feel of the button was a bit more positive, although the click is still relatively subdued compared to the volume and power buttons.

One of the highlight features of the iPhone 6 Plus is optical image stabilization, but it appears that it doesn’t run during preview so it was hard to see just how much accommodation the system has and how it works. Although the announcement seemed to suggest that the module moves vertically and horizontally, it seems more likely that we’re looking at a VCM that shifts the lenses around to compensate for horizontal and vertical motion.

Overall, it was rather hard to really notice any difference in responsiveness as the iPhone 5s almost never stuttered or hesitated in my experience. The iPhone 6 similarly had no such issues when casually trying various features but a full review may show that this changes when used in real world situations.

Unfortunately, many of the features that Apple has implemented in this latest iteration seem to follow the same pattern as they aren’t easily demonstrated. For example, seeing exactly what Apple means by dual domain pixels effectively requires a microscope to clearly see what Apple is talking about, and really seeing a difference in color shifting, along with improved maximum contrast requires a dark room with little stray light.

Gallery: iPhone 6 and 6 Plus Hands On

Categories: Tech

Apple iOS 8: Available September 17th

Tue, 2014-09-09 11:50

Alongside the launch of the iPhone 6 family, Apple is also prepping for the launch of iOS 8. The iPhone 6 family will of course ship with iOS 8 as their base OS; meanwhile Apple has announced that iOS 8 will be available as an upgrade for compatible devices on September 17th, 2 days before the iPhone 6 ships.

As Apple has already announced iOS 8 back at WWDC 2014, we won’t spend too much time recapping it here. Notable new features for iOS 8 include the low level Metal graphics API, new keyboard functionality, notification enhancements, and hands-free Siri.

Meanwhile for compatibility, Apple has confirmed that iOS 8 will be coming to A5 and higher devices. This includes the iPhone 4S, iPad 2, and iPod Touch 5th Gen, along with their respective successors including the iPhone 5 and iPad Mini. In practice this means that iOS 8 will run on anything iOS 7 ran on other than the iPhone 4, which was the sole A4 device to run that OS.

Categories: Tech

Apple Announces the Apple Watch; Available Next Year

Tue, 2014-09-09 11:06

Apple has thrown their hat into the wearable ring with the Apple Watch, which tries to bring a better user experience to the watch without simply adapting iOS and the multi-touch gestures that we're familiar with on the iPhone.

There's a single crystal sapphire display, a digital dial crown that acts as a home button and a scroll system. There's also a strong emphasis on haptic feedback which allows for linking of watches to share notifications by sending taps in any possible pattern. This is done by using a force sensitive touchscreen, which is a method of navigating along with the scrolling dial. This allows for subtle communication that doesn't rely on obvious sound or gestures. It's also possible to send taps based upon pulse/heart beat.

There are IR lights and sapphire lenses on the back of the watch for heart rate monitoring, and the back also serves as a magnetic-alignment wireless charging system. The watch's timekeeping is accurate to within 50 milliseconds at any time.

In order to support this watch, Apple has also designed a custom SoC called S1, likely for battery life and sensor integration and reduction of board area.

There are six different straps that are easily exchanged. The sport band has multiple colors and is some kind of rubber. There's a leather sports strap which has multiple magnets to ensure that the fit works correctly. There's also a traditional leather strap and a stainless steel link bracelet. There's also a stainless steel mesh band that is infinitely adjustable. There are also two versions of each watch edition, one larger and one smaller.

The Apple Watch also has NFC and will work with Apple Pay.

There are actually three variants though: the standard Apple Watch, the Watch Sport, and the Watch Edition, which has an 18 karat gold casing. The Sport edition has an anodized aluminum casing.

Furthermore the watch will also come in two different case sizes to account for different wrist sizes (essentially his & her watch sizes). These sizes are 38mm and 42mm tall respectively.

The Apple Watch must be paired with an iPhone to work properly. It starts at $349 USD and will go on sale early 2015.

Categories: Tech

Apple Announces A8 SoC

Tue, 2014-09-09 10:45

As expected from this year’s iPhone keynote, Apple has announced a new member of their internally developed family of ARM SoCs.

The latest SoC, dubbed A8, is Apple’s first SoC built on the 20nm process, and among the first SoCs overall to be built on this process. Apple notes that it has 2 billion transistors and is 13% smaller than the A7, which would give it upwards of twice as many transistors as the A7 and would put the die size at about 89mm2.
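
As a rough check on those numbers, a short sketch in Python (the ~102mm² A7 die size is from the table below, and the transistor counts are Apple's approximate figures):

```python
# Known/stated figures
a7_die_mm2 = 102          # approximate A7 die size
shrink = 0.13             # Apple: the A8 is 13% smaller than the A7
a7_transistors = 1.0e9    # "over 1 billion" (approximate)
a8_transistors = 2.0e9    # Apple's stated ~2 billion

a8_die_mm2 = a7_die_mm2 * (1 - shrink)
density_gain = (a8_transistors / a8_die_mm2) / (a7_transistors / a7_die_mm2)

print(f"Estimated A8 die size: {a8_die_mm2:.0f} mm^2")          # ~89 mm^2
print(f"Implied transistor density gain: {density_gain:.1f}x")  # ~2.3x with the move to 20nm
```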


Image courtesy The Verge

From a performance perspective Apple is promising 25% faster CPU performance than A7. As is usually the case with Apple, they aren’t talking about the underlying CPU core – though this is a problem we’re working to rectify – so it remains to be seen how much of this is due to CPU architectural upgrades and how much is from clockspeed improvements afforded by the 20nm process. Apple just introduced their 64bit Cyclone core last year, so it stands to reason that just a year later and with the transition to 64bit already completed, A8 packs a CPU that is similar to Cyclone.

Apple SoC Evolution
      CPU Perf   GPU Perf   Die Size   Transistors   Process
A5    ~13x       ~20x       122mm2     <1B           45nm
A6    ~26x       ~34x       97mm2      <1B           32nm
A7    40x        56x        102mm2     >1B           28nm
A8    50x        84x        89mm2      ~2B           20nm

Meanwhile Apple is being even less specific about the GPU, but from their published baseline performance comparisons against the iPhone 1, the A8 is said to be 84x faster on graphics. This compares to a published figure of 56x for the A7, which implies that the A8’s GPU is 1.5x faster than the A7’s. Given Apple’s conservative stance on clockspeeds for power purposes and the die space gains from the 20nm process, it seems likely that Apple has upgraded from a 4 core PowerVR GPU to a 6 core PowerVR GPU, likely the flagship GX6650, which would account for the 50% performance gain right there.

Finally, Apple notes that the A8 is designed to be 50% more energy efficient than the A7. Some of these efficiency gains would come from the 20nm process; however, a gain this large indicates that Apple has also done additional work at the architectural level to improve efficiency, as smaller manufacturing nodes alone do not see these kinds of gains.

Update: We have posted our initial A8 analysis here

Categories: Tech

Apple Announces iPhone 6 and iPhone 6 Plus; Available September 19th

Tue, 2014-09-09 10:30

Today, Apple is launching the new iPhone 6 and iPhone 6 Plus. One of the first changes is that the new iPhone 6 has a 4.7" 1334x750 display, and the iPhone 6 Plus has a 5.5" 1920x1080 (1080p) display. The thickness of the 4.7" model is 6.9mm and the 5.5" model is 7.1mm.

The displays will have higher contrast, better peak brightness, and better viewing angles according to Apple. This suggests that the iPhone display has a chevron shape to its subpixels to improve viewing angles. The glass also has a 2.5D curve similar to the HTC One X and Samsung Galaxy S3 for a smooth feel when swiping off the edge of the phone.

iOS is also adapted to the new, larger iPhone 6 Plus display by adding landscape views for many native applications that are two pane. In addition, in order to work with the differing resolutions Apple has added a "desktop-class scaler" to avoid incompatibility issues with applications that aren't aware of the new displays. This in turn implies that Apple is not doing integer scaling/doubling in all cases, and that we'll see fractional scaling used. These displays are known as Retina HD. There's also a one-hand mode for the iPhone 6 Plus in order to deal with the larger display size.

The new A8 powers both of these iPhones, and has 2B transistors compared to 1B in the A7. It's also built on 20nm, but it's unclear whether this is TSMC, Samsung, or both.

Apple also claims 25% higher CPU performance on the A8 and is also emphasizing that this new SoC can do better sustained performance over time compared against other smartphones. The GPU is supposed to be a GX6650.

Image Courtesy Engadget

Apple is also emphasizing battery life on the iPhone 6 and how it compares to the iPhone 5s. WiFi browsing battery life is slightly increased over the 5s while LTE browsing battery life is unchanged; meanwhile the iPhone 6 Plus improves to 12 hours for both WiFi and LTE.

There's also a new M8 coprocessor which makes use of a new barometer sensor to measure relative air pressure and compute distance and elevation for better fitness tracking, which is used for the health application in iOS 8.

Apple is also finally introducing MDM9x25 with carrier aggregation and VoLTE. This means that there's a dual transceiver solution in the iPhone 6 devices. On the same line, Apple is finally adding 802.11ac to its smartphones and has worked on enabling seamless WiFi calling that goes from WiFi to cellular networks.

On the camera side, we see a new 8MP sensor which adds phase detection auto focus (PDAF) for faster autofocus, touted to be able to focus at up to 2x the speed. There's also better local tone mapping and better noise reduction, in addition to the PDAF system that was first seen in the Samsung Galaxy S5. Panoramas can now be up to 43MP in total resolution, and a better gyroscope reduces stitching errors.

There's a brand new ISP in the A8 SoC as well, which is likely to be named the H7 ISP if we follow from the A7. Furthermore there's one feature that the iPhone 6 Plus does have that the iPhone 6 doesn't have for the camera, which is optical image stabilization (OIS) to reduce shaking. It appears that the entire module is floating instead of just a VCM-based lens stabilization system. There's also a timelapse feature. The PDAF also helps with continuous AF in video that avoids all of the breathing effects that come with conventional contrast-based focus.

The front facing camera also gets a better sensor, an f/2.2 aperture, single photo HDR, HDR video, and burst shot.

Both will launch with iOS 8, which has QuickType that we've talked about at the WWDC keynote in addition to Extensibility which allows for TouchID auth in third party apps.

Also new to the iPhone 6 family is Near Field Communication (NFC) hardware, which is being used to power Apple's new payment system, Apple Pay. The iPhone 6 family utilizes an encrypted secure element (likely on the NFC chip itself) and credit cards are added through Passbook and validation for a purchase can be done using TouchID.

Apple has addressed security concerns by saying that Apple cannot know what is purchased and the cashier cannot see the credit card number or any other information. Online payment is also handled by Apple Pay, which is a one-touch solution using TouchID and a one-time number from the secure element. Groupon, Uber, Target, Panera, MLB, and Apple Store applications are all already supporting this system. Another example cited was OpenTable, which allows one to pay for a dinner check through an app. The system launches in the US in October as an update and will have an API open to all developers to implement in their applications.

              Apple iPhone 5s                             Apple iPhone 6                                               Apple iPhone 6 Plus
SoC           Apple A7                                    Apple A8                                                     Apple A8
Display       4-inch 1136 x 640 LCD                       4.7-inch 1334 x 750 LCD                                      5.5-inch 1920 x 1080 LCD
WiFi          2.4/5GHz 802.11a/b/g/n, BT 4.0              2.4/5GHz 802.11a/b/g/n/ac, single stream, BT 4.0, NFC        2.4/5GHz 802.11a/b/g/n/ac, single stream, BT 4.0, NFC
Storage       16GB/32GB/64GB                              16GB/64GB/128GB                                              16GB/64GB/128GB
I/O           Lightning connector, 3.5mm headphone        Lightning connector, 3.5mm headphone                         Lightning connector, 3.5mm headphone
Size / Mass   123.8 x 58.6 x 7.6 mm, 112 grams            138.1 x 67 x 6.9 mm, 129 grams                               158.1 x 77.8 x 7.1 mm, 172 grams
Rear Camera   8MP iSight, 1.5µm pixels, True Tone Flash   8MP iSight, 1.5µm pixels, True Tone Flash                    8MP iSight, 1.5µm pixels, True Tone Flash, OIS
Front Camera  1.2MP f/2.4                                 1.2MP f/2.2                                                  1.2MP f/2.2
Price         $99 (16GB), $149 (32GB) on 2-year contract  $199 (16GB), $299 (64GB), $399 (128GB) on 2-year contract    $299 (16GB), $399 (64GB), $499 (128GB) on 2-year contract

There are new silicone and leather cases with gold, silver, and space gray. The iPhone 6 starts at the usual $199 for 16GB, $299 for 64GB and $399 for 128GB. The iPhone 6 Plus comes in the same colors at $299 for 16GB, $399 for 64GB and $499 for 128GB. The iPhone 5s is now $99 on a 2-year contract, and the iPhone 5c is free. The new phones will ship on September 19th and preorders begin on September 12th.

Categories: Tech

Nokia Lumia 930 Review

Mon, 2014-09-08 12:00
Nokia has once again refreshed its Windows Phone lineup with the release of the Lumia 930, which is the spiritual successor to the Lumia 920 which first launched with Windows Phone 8.0 way back in November 2012. But like the Lumia 630, it takes cues from more than just the Lumia with the closest model number. The Lumia 930 is an interesting combination of many of the other Nokia Windows Phone designs from over the years all wrapped up into a striking package that certainly gives it a new take on the polycarbonate bodies of all of the higher end Lumia devices over the years.
Categories: Tech

GIGABYTE Server Releases Seven C612 Series Workstation and Server Motherboards

Mon, 2014-09-08 09:31

We had the big consumer Haswell-E CPU launch just over a week ago, and today marks the release of the Xeon counterparts. Johan’s large deep-dive into what makes Haswell-EP tick is well worth a read. Alongside the CPUs come the motherboards, and the GIGABYTE Server division sent us over some info regarding their entrants into this category.  As GIGABYTE Server now sells direct to the end-user via retailers like Newegg, their hardware and packaging is coming under more scrutiny and we are getting one or two of these models in for review.

The two main features GIGABYTE is promoting with this launch start with DDR4-2133 support in 1 DIMM per channel (DPC), 2 DPC and 3 DPC configurations. This makes sense given that the CPUs have DDR4-2133 support, however GIGABYTE is claiming that other motherboards will have reduced limits at higher DPC counts. The second aspect is almost a transfer of IP from the consumer group – updating the BIOS without a CPU or DRAM installed. This can be done over the IPMI 2.0 management interface or via the command line over the network.

Unfortunately I have no clue how to decipher the motherboard names, but at release there will be a single workstation motherboard, the MW50-SV0:

Perhaps surprisingly we still see an mSATA here rather than an M.2 port. Also, the TPM header is slap bang in the middle of the first two PCIe slots, making any GPU + TPM user lose that first PCIe slot.

The server motherboards are all 2P, and GIGABYTE Server’s rep indicated to me that this generation is going to have extensive use of the mezzanine type-T slot for the add-in cards. Due to the ATX dimensions of this MD30-RS0, we are limited to 1 DPC in a stacked arrangement, but all 10 SATA 6 Gbps ports are here. Note that if a user wants to equip both SAS drives and a double slot PCIe device here, they will be out of luck.

The MU70-SU0 is a single socket workstation motherboard, using the ATX standard but also a narrow LGA2011-3 socket to allow for 3DPC. The arrangement should allow two large PCIe devices along with mSATA and full storage capabilities. This motherboard also has four GbE ports based on Intel’s I210 line of controllers.

The MD50-LS0 aims more to be a storage board unless riser cables are used, because in this configuration of CPUs and PCIe slots it will be impossible to add in large compute cards.  We move up to the larger SSI EEB form factor here, with onboard ports for the Type-T slot. Note there are no extra power connectors for the PCIe slots, perhaps suggesting a non-compute nature.

We actually have the MD60-SC0 in for testing, showing a dual narrow socket combined with 2 DPC memory. The CPU sockets are inline rather than staggered, and the board comes with a 10GbE QSFP+ port supporting dual output via a splitter cable. When setting up this board the other day I found the power connector placement a little odd, and the PCIe arrangement again is not conducive to large PCIe devices, but the layout does allow for directed airflow in a server over the hot components (including that chipset and QSFP+).

The MD70-HB0 is a bit more of a rearrangement of the MD60-SC0, with the larger version of the LGA2011-3 sockets now inline top to bottom and 2 DPC channel support. The networking is provided by an Intel X540 instead, giving two 10GbE Base-T LAN ports. I have been wondering about the consumer adoption of 10GbE Base-T on motherboards and given the heatsink here to cool it, I am not surprised that it has not made it over yet.

The last motherboard in the list, aside from supporting 3DPC and 10GbE Base-T, breaks the pattern by supporting an M.2 slot. It looks like only 2280 type devices are supported here, and it is unclear if this is PCIe 2.0 x2, x4, or PCIe 3.0 x4. The MD80-TM0 is also an odd size, using a proprietary 305x395mm form factor.

We have no official confirmation on pricing and exact release dates as of yet, although GIGABYTE is sampling for review. We have the MD60-SC0 on hand with some Xeons, so keep your eyes peeled for that article.

Categories: Tech

Intel Xeon E5 Version 3: Up to 18 Haswell EP Cores

Mon, 2014-09-08 09:30

Intel's new Xeon is here, and once again it has impressive specs. The 662 mm² die supports up to eighteen cores, contains two integrated memory controllers, and comes with up to 45MB of L3 cache. We have four different SKUs to test and compare, with each other as well as with the previous generation. We've also added some new real world application testing, so join us for one of our biggest server CPU reviews ever.

Categories: Tech

PCIe SSD Faceoff: Samsung XP941 (128GB & 256GB) and OCZ RevoDrive 350 (480GB) Tested

Fri, 2014-09-05 12:00

We are currently on the verge of the PCIe transition. Nearly every SSD controller vendor has shown or at least talked about their upcoming PCIe designs, and the first ones should enter the market in early 2015. In the meantime, there are a couple of existing PCIe drives for early adopters, namely the Samsung XP941 and Plextor M6e, and a variety of RAID-based PCIe SSDs like the OCZ RevoDrive 350. We already reviewed the 512GB Samsung XP941 in May and found it to be the fastest client SSD on the market, but today we are back with the 128GB and 256GB models along with OCZ's RevoDrive 350.

Categories: Tech

Intel’s Core M Strategy: CPU Specifications for 9mm Fanless Tablets and 2-in-1 Devices

Fri, 2014-09-05 08:30

Continuing our coverage of Intel’s 14nm technology, another series of press events held by Intel filled out some of the missing details behind the strategy of their Core M platform. Core M is the moniker for what will be the Broadwell-Y series of processors, following on from Haswell-Y, and it will be the first release of Intel’s 14nm technology. The drive to smaller, low powered fanless devices that still deliver a full x86 platform as well as performance beyond that of a smartphone or tablet is starting to become a reality. Reducing the size of the CPU package in all dimensions to allow for smaller devices, including reducing the z-height from 1.5mm to 1.05mm, is also part of Intel’s solution, giving a total die area 37% smaller than Haswell-Y.

The first wave of three Core M parts will all be dual core flavors, with HD 5300 graphics and all within a 4.5W TDP. For Core M Intel is no longer quoting the SDP terminology due to the new design.

Intel Core M Specifications
                            Core M-5Y70   Core M-5Y10a   Core M-5Y10
Cores / Threads             2 / 4         2 / 4          2 / 4
Base Frequency / MHz        1100          800            800
Turbo Frequency / MHz       2600          2000           2000
Processor Graphics          HD 5300       HD 5300        HD 5300
IGP Base Frequency / MHz    100           100            100
IGP Turbo Frequency / MHz   850           800            800
L3 Cache                    4 MB          4 MB           4 MB
TDP                         4.5 W         4.5 W          4.5 W
LPDDR3/DDR3L Support        1600 MHz      1600 MHz       1600 MHz
Intel vPro                  Yes           No             No
Intel TXT                   Yes           No             No
Intel VT-d/VT-x             Yes           Yes            Yes
Intel AES-NI                Yes           Yes            Yes

The top of the line processor will be called the Core M-5Y70, which is a bit of a mouthful but the name breaks down similarly to Intel’s main Core series. ‘5’ is similar to i5, giving us a dual-core processor with Hyper-Threading; ‘Y’ is for Broadwell-Y; and ‘70’ gives its position in the hardware stack.

The CPU will leverage both processor graphics and CPU Turbo Boost, allowing each of them to turbo at different times and different rates depending on the workload and overall power usage. Of particular interest is that the 5Y70 features a base clock of 1.1 GHz, with turbo for both single-core and dual-core use listed as up to 2.6 GHz. The new HD 5300 GPU similarly has a 100 MHz base frequency with an 850 MHz turbo. The 5Y70 is different from the other two models in both clock speeds and features, as it will be part of Intel’s vPro program and also supports Intel TXT.

The other two processors in the stack are the 5Y10a and 5Y10, with dual-core + HT configurations and 800 MHz base frequency with turbo up to 2.0 GHz. There doesn't appear to be any major difference between the two parts, though Intel's presentation notes that the 5Y10 supports "4W Config Down TDP" (cTDP Down). The graphics is clocked slightly lower on the turbo, giving 800 MHz.

It's interesting to note that Intel informed us that the 1k unit pricing will be the same for all three processors: $281. Obviously these chips are going to end up in hybrids, tablets, and laptops that come pre-built, so the actual pricing will vary by OEM and whatever deals they have with Intel. But in general, Intel seems to be saying that OEMs can choose any of the three chips based on their power/thermal targets.

The HD Graphics 5300 is the new processor graphics and as part of the brief behind Core-M, a die shot was supplied with the important areas marked:

In the processor graphics section of the shot above, there appear to be 12 repeated units, with each representing two EUs (Execution Units). In our dive into the architecture in early August, it was stated that Broadwell uses 8 EUs per sub-slice, with the minimum configuration being three sub-slices, making 24 EUs in total. This comes in combination with an increase in the L1 cache and samplers relative to the number of EUs, allowing for 25% more sampling throughput per EU.
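
A quick tally of that sub-slice math, assuming the minimum Broadwell graphics configuration described above:

```python
# Broadwell graphics organization as described above
eus_per_subslice = 8
subslices = 3
total_eus = eus_per_subslice * subslices   # 24 EUs

# What the die shot appears to show: 12 repeated blocks of 2 EUs each
visible_blocks = 12
eus_per_block = 2

assert total_eus == visible_blocks * eus_per_block  # both come out to 24
print(f"HD 5300: {total_eus} EUs")
```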

Intel's Tick-Tock Cadence
Microarchitecture   Process Node   Tick or Tock   Release Year
Conroe/Merom        65nm           Tock           2006
Penryn              45nm           Tick           2007
Nehalem             45nm           Tock           2008
Westmere            32nm           Tick           2010
Sandy Bridge        32nm           Tock           2011
Ivy Bridge          22nm           Tick           2012
Haswell             22nm           Tock           2013
Broadwell           14nm           Tick           2014
Skylake             14nm           Tock           2015

The fundamental architecture of the GPU does not change from Haswell, albeit on a smaller process node. The GPU is confirmed as supporting DirectX 11.2, OpenGL 4.2 and OpenCL 2.0, with UltraHD (3840x2160) supported at 24 Hz through HDMI. This opens up possibilities of fanless tablets with UHD panels.

One of the main graphs Intel was pushing in their briefing was this one, indicating what power is required for a fanless tablet:

For a chassis of 7, 8 or 10mm to have a maximum skin temperature of 41C at load, the TDPs shown above are required depending on the chassis size in order to go fanless. The first batch of 4.5W Core M processors aims at either the 11.6-inch, 8mm thick fanless tablet design indicated in the graph above, or similarly a 10.1-inch, 10mm thick tablet. Intel wants Core M to have a range of possible TDPs based on increasing or decreasing the frequency as required for a thin fanless tablet.

Intel is going to support extended docking functionality, especially with its business partners to allow features such as WiGig and additional I/O. Intel is also bringing a new 802.11ac design in the form of AC 7265, a lower powered version of the 2T2R 7260 for tablets. This will also support WiDi 5.0, and overall the platform aims to offer 1.7 hours longer battery life. Intel got to this ‘+1.7’ hour number with a reference design compared to a clocked-down Haswell-Y. I would like to point out that despite these numbers, a clocked-down part usually represents moving outside the optimal efficiency window, especially when dealing with low powered tablets.

Intel used the above slide in their presentations and drew particular attention to the power consumption of the audio during HD video playback (the orange bar on the top comparison). As part of Core M, Intel is reducing power consumption of the audio segment of the system from 100s of milliwatts down to single-digit milliwatts by integrating an audio digital signal processor (DSP) onto the die. This is what Intel refers to as its Smart Sound Technology, and it is designed to shift the majority of the audio processing onto a configured part of the die which can process it at lower power.

If you think you’ve heard of something like this before, you have: AMD’s TrueAudio sounds remarkably similar in its implementation and its promotion. We asked Intel if this new DSP for Broadwell would have a configurable API similar to TrueAudio, however we are still waiting on an official response to this.

The platform controller hub layout was also provided, showing USB 3.0 support along with SATA 6 Gbps and four lanes of PCIe:

The PCH is also designed to be dynamic with power, meaning that disabling features on a design could yield a better-than-expected increase in battery life. The design will support NFC, and it is worth noting that the two USB 3.0 ports are in a mux configuration which may limit bandwidth. With a number of PCIe lanes in tow however, there are a number of controllers that could be used to expand functionality in a design.

Intel will be showing off the Core M at IFA in Berlin this week, with over 20 designs based on Core M from OEMs in the known pipeline – including designs like ASUS’ Transformer Book T300 Chi announced back at Computex. The T300 Chi was specified as a 12.5-inch fanless tablet in a 7.3mm thick design, with LTE support and a 2560x1600 display. With the 12.5-inch size and 7.3mm thickness, it sounds like the T300 Chi will be configuring the Core M CPU down to around 4W in order to keep the 41C maximum skin temperature. Intel also listed a number of other Core M devices due at IFA.

The CPUs will be in volume production before the end of the year (we have seen differing reports on whether volume production has started already or is just about to), with systems from ~5 OEMs available in Q4, starting in late October. Intel lists both consumer and business designs for this timeframe, however volume availability is expected in Q1 2015.

Categories: Tech

Note 4 and Mate 7 side-by-side

Fri, 2014-09-05 05:44

We already saw some pictures that Josh took during his briefing just a few days ago, but I still felt that maybe the Note 4 could use a closer look. Today I managed to talk to a representative at IFA to get some better close-up shots of the device.

The Note 4's new design is really striking. As Josh already mentioned in his hands-on, the improved feeling that Samsung managed to achieve with the new metal chassis is worlds apart from the plastic designs found in its predecessors. I was still skeptical until I got to hold it with my own hands: this is indeed a premium device.

I also had the opportunity to compare it to the Ascend Mate 7. There's definitely a size difference here that is noticeable when you hold both devices. Because the Mate 7 is a tad wider than the Note 4, it isn't as easy to handle. I'd say Samsung made a very good choice in staying at 5.7" and not going larger. One aspect that was immediately visible was the difference in screen quality. The Note 4 was a lot brighter, and due to the large size of the screens, the difference in resolution between 1080p and 1440p was quite noticeable to me.

The two devices are basically the same thickness, and you won't notice too much in that regard other than the change from the grippy soft-touch plastic of Samsung's pleather back to the smoother aluminium of the Mate 7.

One thing Josh told me he omitted was taking a picture of the Note 4 without the back cover. Here we see that Samsung changed the layout a bit and is no longer employing the stacked SIM+microSD slot holder that is common in current Samsung devices, instead separating them again as in older devices.

Some readers in the comments section were asking about how the new S-Pen handled. I tried to play around with it for a bit and noticed no problems with it, as it performed without issue. I couldn't try out the angle-sensitivity of the pen as the stock apps did not support it, and the Samsung representative did not know how to showcase it.

Interestingly, the model I handled was an N910F with a Snapdragon 805. This could mean that we won't see any Exynos models in western LTE markets such as Europe yet again, which would mean that the Snapdragon variants would be the most widely reviewed and distributed. We're still waiting on Samsung to officially release any information regarding the Exynos 5433 SoC, as well as a break-down of models and their availability per region.

Yesterday I was impressed by Huawei's new design and build quality. Today I'm actually torn between it and the Note 4, and can say that Samsung has stepped up the quality of their design as well.

Gallery: Note 4 vs Ascend Mate 7 side-by-side

Addendum: 

Samsung also had a lot of Galaxy Alphas on display in their hall. Since this was the first time I saw the phone in person, I wanted to share my thoughts on it too. The device provides a great alternative for people who dislike 5"+ phones, as it's very light and thin. I was concerned about the 720p PenTile screen, but it seems that this display employs one of the newer generation PenTile matrices, and the pattern was not as visible as I thought it would be. I didn't bother with taking too many pictures as it was a tethered device and there were no press kits available. The Galaxy Alpha is definitely a device to look forward to if that's the form factor you desire in a phone.

Categories: Tech

Dell Previews 27-inch ‘5K’ UltraSharp Monitor: 5120x2880

Fri, 2014-09-05 00:16

UHD is dead. Not really, but it would seem that displays bigger than UHD/4K will soon be coming to market. The ability to stitch two regular-sized outputs into the same panel is now being exploited even more, as Dell announced the new ‘5K’ UltraSharp 27-inch display during its Modern Workforce livestream. The ‘5K’ name comes from the 5120 pixels horizontally, but this panel screams as being two lots of 2560x2880 in a tiled display.

5120x2880 at 27 inches comes out at 218 PPI for a total of 14.7 million pixels. At that number of pixels per inch, we are essentially looking at a larger 15.4-inch Retina MBP or double a WQHD ASUS Zenbook UX301, and it seems right for users wanting to upgrade their 13-year-old IBM T220 to something a bit more modern.
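
For those wondering how the 218 PPI figure falls out of the resolution and panel size, a quick sketch:

```python
import math

horizontal_px, vertical_px = 5120, 2880
diagonal_in = 27.0

total_pixels = horizontal_px * vertical_px            # 14,745,600 (~14.7 million)
diagonal_px = math.hypot(horizontal_px, vertical_px)  # pixel count along the diagonal
ppi = diagonal_px / diagonal_in                       # ~218 pixels per inch

print(f"{total_pixels:,} pixels, {ppi:.0f} PPI")
```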

Displays Sorted by PPI
Product                    Size / in   Resolution   PPI   Pixels
LG G3                      5.5         2560x1440    534   3,686,400
Samsung Galaxy S5          5.1         1920x1080    432   2,073,600
HTC One Max                5.9         1920x1080    373   2,073,600
Apple iPhone 5S            4           640x1136     326   727,040
Apple iPad mini Retina     7.9         2048x1536    324   2,777,088
Google Nexus 4             4.7         1280x768     318   983,040
Google Nexus 10            10          2560x1600    300   4,096,000
Lenovo Yoga 2 Pro          13.3        3200x1800    276   5,760,000
ASUS Zenbook UX301A        13.3        2560x1440    221   3,686,400
Apple Retina MBP 15"       15.4        2880x1800    221   5,184,000
Dell Ultrasharp 27" 5K     27          5120x2880    218   14,745,600
Nokia Lumia 820            4.3         800x480      217   384,000
IBM T220/T221              22.2        3840x2400    204   9,216,000
Dell UP2414Q               24          3840x2160    184   8,294,400
Dell P2815Q                28          3840x2160    157   8,294,400
Samsung U28D590D           28          3840x2160    157   8,294,400
ASUS PQ321Q                31.5        3840x2160    140   8,294,400
Apple 11.6" MacBook Air    11.6        1366x768     135   1,049,088
LG 34UM95                  34          3440x1440    110   4,953,600
Korean 27" WQHD            27          2560x1440    109   3,686,400
Sharp 8K Prototype         85          7680x4320    104   33,177,600

Dell has been pretty quiet on the specifications, such as HDMI or DisplayPort support, though PC Perspective is reporting 16W integrated speakers. If the display is using tiling to divide up the transport workload over two outputs, that puts the emphasis squarely on two DP 1.2 connections. There is no mention of frame rates as of yet, nor intended color goals.

Clearly this panel is aimed more at workflow than gaming. This is almost double 4K resolution in terms of pixels, and 4K can already bring the majority of graphics cards to their knees, but we would imagine that content producers and prosumers are the intended market. Word is that this monitor will hit the shelves by Christmas, with a $2500 price tag.

Source: Dell

Gallery: Dell Previews 27-inch ‘5K’ UltraSharp Monitor: 5120x2880

Categories: Tech