Anandtech
GIGABYTE BRIX GB-BXi7-4500 Review: Intel Core i7 in a UCFF PC
Over the last couple of years, the ultra-compact form factor (UCFF) has emerged as one of the bright spots in the troubled PC market. Intel kickstarted the category with their Sandy Bridge NUC kits in early 2013. Recognizing the popularity of this segment, other vendors also began to promote similar products. GIGABYTE targets this market segment with an extensive lineup of products under the BRIX brand. Late last year, GIGABYTE sent us their high-end vanilla BRIX, the GB-BXi7-4500. Unlike Intel's Haswell NUCs (with an i5-based SKU at the top end), this BRIX brings a Haswell i7 ULV processor into the UCFF market. Read on to find out what an i7 CPU can deliver in this form factor.
ECS Unveils Z97I-Drone: More Mini-ITX for Haswell
Our recent review of $140 Z97 mini-ITX boards left us wanting more. There is plenty of potential in the mini-ITX form factor even at the low end price points, so when we saw ECS’ planned Z97I-Drone at Computex this year it looked like it would provide an interesting look into the platform. Today the motherboard is unveiled, and should be heading to market shortly.
ECS’ design feature of late has centered on the ‘L33T Gaming’ branding, which means giving each of their motherboards names indicative of various gaming concepts. ‘Machine’ and ‘Domination’ have been used previously in conjunction with the ‘Gank’ name for Intel’s mainstream chipsets, whereas ‘Aggro’ is reserved for future AMD motherboards. ‘Drone’ makes a lot of sense in this context, although it was used on an ATX Z87 motherboard previously.
ECS is keen to promote the Gaming USB port, supporting a report rate of 1000 Hz, along with their ‘Hybrid Power’ which incorporates a digital power design. In terms of features we get five SATA 6 Gbps ports, an Intel I218-V network solution, an M.2 slot for WiFi/Bluetooth only, ALC1150 based audio, six USB 3.0 ports, two fan headers and ECS’ upgraded BIOS package.
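For context on that 1000 Hz figure, a minimal sketch of what the report rate means for input latency, assuming the rate simply corresponds to how often the host polls the device:

```python
# A 1000 Hz report rate means the host polls the device once per millisecond,
# versus 8 ms at the 125 Hz rate many standard USB mice default to. This is a
# back-of-the-envelope conversion; real-world latency also depends on the
# device firmware and driver.
def polling_interval_ms(report_rate_hz: float) -> float:
    return 1000.0 / report_rate_hz

print(polling_interval_ms(1000))  # 1.0 ms
print(polling_interval_ms(125))   # 8.0 ms
```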
The board puts the socket in the bottom right corner, up against the DRAM and PCIe slot meaning that large backplates or DRAM might interfere with large CPU cooling solutions. Similar to GIGABYTE, ECS is expanding the area around the screw holes to double the radius so errant screwdrivers do not start taking off capacitors when the board is being fitted into a case. The five SATA 6 Gbps ports are all in a row at the top, which unfortunately means the cable at the top is blocked in when locking cables are used.
The 24-pin ATX connector is well placed at the edge of the motherboard, although the 8-pin for the CPU is awkwardly placed just to the left of center, meaning any form of cable management goes out of the window. Also the two fan headers are in this area, surrounded by the SATA cables and the CPU power cable, making it difficult to organize.
The rear IO uses a block of four USB 2.0 ports (which I like), with video connectivity provided by a DVI-D, HDMI and DisplayPort. Four USB 3.0 ports, an Intel I218-V network port and audio jacks from the Realtek ALC1150 round it off.
One of the interesting elements here is the M.2 WiFi-only solution, where users will have to source their own M.2 card. Storage is disabled here, presumably in firmware, but also due to the lack of space on the motherboard. So far I have only seen M.2 WiFi on some GIGABYTE X99 models, using Intel’s 7260-AC solution. But the lack of a card here means that the ECS Drone should be around or below the $140 mark. We are awaiting full pricing details from ECS and we should be getting the board in for review as well, so stay tuned for that.
Gallery: ECS Unveils Z97I-Drone: More Mini-ITX for Haswell
Additional: We have just received word from ECS. This motherboard will have an MSRP of $110.
AMD Appoints Dr. Lisa Su as President & CEO, Rory Read Steps Down
As part of AMD’s broader restructuring efforts, back in June the company announced a business reorganization that would see the company organized into two major groups, the Computing and Graphics Business Group, and the Enterprise, Embedded and Semi-Custom Business Group. Furthermore at the time AMD promoted Dr. Lisa Su to the position of Chief Operating Officer (COO), a position that previously had been unfilled at AMD for some time.
Now 4 months later it turns out that Lisa’s time as COO will be a short one. Today AMD has announced that effective immediately, current CEO Rory Read will be stepping down. In his place Lisa is being promoted to President and CEO of the company, making her the 5th CEO in the company’s history.
With Lisa’s previous promotion to COO, she had essentially already been promoted to AMD’s second-in-command under Rory, so this is a straightforward promotion over at AMD. More significantly, given the resurrection of the COO post there has been good reason to suspect that Lisa was bound for a promotion to CEO sooner than later. And now that AMD has finally promoted Lisa to CEO, they are confirming that the above was exactly the plan, and that Rory has been preparing her for the CEO role for some time.
For his part, Rory's retirement signifies that AMD's transition is nearly complete and that his role is coming to an end. Rory was brought on in 2011 to restructure and stabilize the company after its struggles late in the last decade and at the start of this one, with an emphasis on diversifying the company beyond its traditional (and troubled) x86 and graphics products. While AMD remains significantly invested in those products, they now have a sizable business presence in other fields/technologies such as ARM processors and semi-custom IP designs, which as part of AMD and Rory's plans ensures the company isn't overexposed to any single business. And on the financial side AMD is unfortunately still operating at a loss, but if all goes according to plan that should be coming to an end this year.
Though AMD has never called Rory a transitional CEO, his overhaul of AMD over the last 3 years, followed now by his stepping down as CEO, cements the notion that Rory was brought on board to execute the necessary restructuring rather than to lead the company in the long term. AMD is still in the process of developing some of the silicon that will be the basis of these business plans – including the x86 and ARM versions of the K12 processor – so it will be a bit longer yet until the company can fully execute on their ambidextrous plans, but on the business and development side they have completed the necessary changes to allow that. With those changes behind them AMD is now ready to move out of their transitional phase and into their new position as a diversified IP designer, which is what has led to Rory's retirement and Lisa's promotion.
Internally for AMD and its product lineups, Lisa’s promotion should not result in significant changes. She has already been overseeing much of AMD throughout her career there – first as SVP of Global Business Units and then as COO – meaning that although the CEO is changing, the person overseeing much of AMD’s product lineup is not. Working alongside AMD’s CTO, Mark Papermaster, AMD’s product leadership is more or less unchanged.
Meanwhile the transition from Rory to Lisa means that AMD is also once again being led by an engineer (and a very capable one at that), which AMD leadership is treating as a great strength going forward. Lisa holds a doctorate in electrical engineering from MIT and has previously held positions at IBM and Texas Instruments; her background in semiconductor research and development is much of the reason she joined the company at the SVP level in 2012. Being the CEO is about business as much as it is about technology, but with AMD's business situation settled by Rory, this should give Lisa a chance to settle in and focus on driving and improving AMD's technological situation, which is ultimately what will make or break the company. AMD now has a number of very capable engineers leading the company at multiple levels, including Lisa, CTO Mark Papermaster, and K12 designer Jim Keller, so the company should be in a good position going forward.
Finally, this promotion means that AMD’s executive lineup has been slightly shuffled once more. The COO position was recreated for Lisa and now it seems just for Lisa; it will not be filled now that she is CEO, and those responsibilities will be staying with her. Meanwhile AMD is also noting that while Rory is stepping down as CEO effective immediately, he will be staying with the company in an advisory role to help see out the company through the rest of 2014.
HTC Announces the Desire EYE
Today at their Double Exposure event in New York, HTC has announced the Desire EYE. The major feature on the Desire EYE is its 13MP front facing camera, which should hopefully take some of the best selfies of any smartphone. The internals are also an interesting story. While devices sporting HTC's Desire branding are typically more budget oriented, the Desire EYE packs significantly more powerful hardware than other Desire devices. It has a 5.2" 1080p IPS display, and Qualcomm's Snapdragon 801. BoomSound continues to be present with two TFA9887 speaker amps. The full list of specifications has been laid out below.
HTC Desire EYE Specifications

SoC: MSM8974ABv3 2.26 GHz Snapdragon 801
RAM/NAND: 2 GB LPDDR3, 16GB NAND + microSDXC
Display: 5.2" 1920x1080 IPS LCD
Network: 2G / 3G / 4G LTE (Qualcomm MDM9x25, UE Category 4 LTE)
Dimensions: 151.7 x 73.8 x 8.5 mm, 154 grams
Rear Camera: 13MP, 1.12 µm pixels, 1/3.06" CMOS (Sony IMX214), F/2.0, 28mm (35mm equiv) lens
Front Camera: 13MP, 1.12 µm pixels, 1/3.06" CMOS (Sony IMX214), F/2.2, 22mm (35mm equiv) lens
Battery: 2400 mAh (9.12 Whr)
OS: Android KitKat with HTC Sense 6
Connectivity: 802.11a/b/g/n + BT 4.0, USB 2.0, GPS/GNSS, DLNA, NFC
SIM Size: NanoSIM
Owing to its 5.2" display, the Desire Eye has a larger height and width than HTC's M8 and E8 phones. At 8.5mm it's 1.35mm thinner than both of them, and at 154 grams its mass sits between the lighter E8 and heavier M8. Also similar to the E8 and M8 is its hardware platform, which pairs Qualcomm's MSM8974AB at 2.26GHz with 2GB of LPDDR3 RAM. The finish on all color variants is matte plastic throughout, and the two-tone "Doubleshot" finish makes for an even better design than what we saw in the E8. The power button has also been moved to the side, and we see a new two-stage camera button. The latter is definitely useful, but the pressure needed to trigger the second stage is a bit too high in my experience.
The Desire EYE also sports water resistance, rated for immersion in 1 meter of water for up to 30 minutes. HTC has gone the extra mile here by implementing water resistance without annoying plastic flaps that are easily torn off. Instead, we see rubber gaskets around key areas and "short protection" on the USB port. 802.11ac isn't present, and the battery is on the small side, but overall the Desire EYE has solid specifications.
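Capacity ratings in mAh and Whr are related through the battery's nominal cell voltage. A quick sketch of the conversion; the 3.8 V figure is inferred from HTC's own numbers (9.12 Whr / 2400 mAh), not a published spec:

```python
# Convert a battery's mAh rating to watt-hours at a nominal cell voltage.
# 3.8 V is an assumption back-calculated from HTC's quoted figures.
def mah_to_wh(capacity_mah: float, nominal_voltage: float = 3.8) -> float:
    return capacity_mah / 1000 * nominal_voltage

print(mah_to_wh(2400))  # Desire EYE: 9.12 Whr
print(mah_to_wh(820))   # RE camera (covered below): ~3.12 Whr
```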
As I mentioned in the beginning, the major marketing and selling point for the phone is definitely the front facing camera. The Desire EYE uses the same Sony IMX214 sensor for the front and rear cameras. IMX214 is formidable as a sensor for rear cameras, and this is the first time we're seeing it also implemented as a front-facing camera. By using two separate cameras instead of a rotating upper section like on the Oppo N1, HTC has been able to customize the cameras for their intended use case. The rear camera has a wider F/2.0 aperture and a longer 28mm focal length than the front-facing camera. The front-facing camera is optimized for a wider 87 degree field of view with its shorter 22mm focal length. Both cameras have a max ISO of 3200 and a max exposure time of 1/9s. In some casual testing it seems that there is a bit of color noise in low light, but it's otherwise well-suppressed. Detail is definitely good, but there are noticeable sharpening artifacts at the base settings, which is a bit disappointing. The rear camera seems to have detail similar to the Butterfly 2, which is comparable to most 13MP cameras in flagships today.
HTC is including a number of software features that add additional camera functionality, collectively called the HTC EYE Experience. HTC's face tracking keeps the user's face in focus and crops the image to frame it. Up to four people can be tracked and framed at the same time. HTC's Split Capture feature combines simultaneously taken photos or videos from the front and back cameras into a single image or video. Voice Selfie allows the front camera to take a photo when the user smiles, and to record a video when the user says "action" or "rolling".
HTC is also bringing over features that were introduced earlier this year with the Desire 820. Face fusion allows the user to merge their face with that of their friend or another person. Live makeup allows the user to adjust the level of skin smoothing with a live preview before capture.
Some of the HTC EYE Experience features will roll out to the following HTC devices in the coming months:
- HTC One (M7)
- HTC One (M8)
- HTC One E8
- HTC One Mini
- HTC One Mini 2
- HTC One max
- HTC Desire 816
- HTC Desire 820
- HTC Butterfly 2
HTC plans to roll out the Desire EYE on carriers in Europe, the Middle East, Africa, Asia, and the United States from October onward. Pricing is said to be around that of the One (E8), and the Desire EYE should supersede the One (E8) in those markets.
HTC Announces RE, an Action Camera: Hands On
Today is HTC's camera-centric Double Exposure event in New York. In addition to announcing the new Desire EYE, HTC is announcing a new device that occupies its own space in HTC's portfolio. It's a small camera called the RE. As far as cameras go, the RE is fairly unique. At first glance it looks like it could be something like a flashlight, but the large glass circle is really the cover for a 16MP camera housed inside. The device is shaped so it can be held and operated with a single hand. At 65.5 grams it's also very light. MicroSD is used exclusively for storage, with an 8GB card shipping by default and support for up to 128GB MicroSDXC. An 820 mAh (3.116Wh) battery provides up to 1200 captured photos or 100 minutes of continuous video recording.
The RE is the opposite of what HTC has done with their cameras in the smartphone space; HTC was one of the first companies to put manual controls for white balance, ISO, and shutter speed in their camera application. HTC is also keen to emphasize that the RE doesn't compete with GoPros. With the RE, HTC is trying to create a highly automatic photography experience where all that's required is the press of a button. To do this, HTC has eliminated as many buttons, toggles, and controls as possible. The RE has no on/off button. Instead, a sensor in the grip detects when the camera is being held and turns it on; the grip detection is handled by an MCU, likely a Cortex-M part. The only buttons are the shutter button on the back where the user's thumb is placed, and a slow-mo video toggle on the front underneath the lens where the user's index finger is placed. There are no dials, and no viewfinder or LCD.
Composition is an essential part of photography. To allow users to take photos without any viewfinder, HTC has used a 16MP 1/2.3" CMOS sensor with an F/2.8 aperture and a very wide 146 degree field of view, equivalent to a 17mm lens. The camera's wide field of view means that as long as users have a rough idea of what is being captured and point in the direction of the subject, they can crop and align photos after they are taken without cutting off any essential details. In practice, the quality is passable, although hurt by the wide field of view, which reduces peak resolution, and by the half second or so of shutter lag. The gallery below has sample photos from the RE, although HTC cautioned that both the hardware and software were non-final and that there may be significant differences in the final product.
Gallery: HTC RE Sample Shots
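As a rough sanity check on the field of view claims for both the RE and the Desire EYE, the diagonal field of view of a rectilinear lens can be estimated from its 35mm-equivalent focal length. This is a back-of-the-envelope sketch, not HTC's methodology; the only constant assumed is the 43.27mm full-frame sensor diagonal:

```python
import math

# Diagonal FOV of an ideal rectilinear lens from its 35mm-equivalent focal
# length. Ultra-wide lenses like the RE's have barrel distortion, so a quoted
# FOV can exceed this estimate.
FULL_FRAME_DIAGONAL_MM = 43.27

def diagonal_fov_deg(equiv_focal_mm: float) -> float:
    return math.degrees(2 * math.atan(FULL_FRAME_DIAGONAL_MM / (2 * equiv_focal_mm)))

print(round(diagonal_fov_deg(22), 1))  # Desire EYE front camera: ~89°, close to HTC's 87° figure
print(round(diagonal_fov_deg(28), 1))  # Desire EYE rear camera: ~75°
print(round(diagonal_fov_deg(17), 1))  # RE: ~104° rectilinear, vs the quoted 146°
```

The gap between the rectilinear estimate and the RE's quoted 146 degrees suggests HTC's figure reflects the uncorrected projection of a distorting ultra-wide lens.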
HTC has also announced a line of accessories for the RE, which are pictured above. The extended battery has a novel design, screwing into the 1/4" tripod mount on the bottom of the RE and using a retractable micro USB plug. While the camera has an IP57 rating for dust resistance and immersion in 1 meter of water for 30 minutes, accessories like the extended battery do not. Users who plan on going deeper underwater should invest in the protection pack, which includes a cap that improves water resistance to IP58, allowing immersion in 3 meters of water for up to 2 hours.
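The IP ratings quoted here follow the two-digit IEC 60529 scheme: the first digit covers solids, the second liquids. A small lookup covering only the codes mentioned in this piece; note that IPx8 depth and duration, like HTC's 3 m / 2 hour figure, are manufacturer-specified rather than fixed by the standard:

```python
# Decode the IP codes used above (IP57, IP58) per IEC 60529.
# Partial lookup for illustration, not a full implementation of the standard.
SOLIDS = {
    "5": "dust protected (limited ingress, no harmful deposit)",
    "6": "dust tight",
}
LIQUIDS = {
    "7": "immersion up to 1 m for 30 minutes",
    "8": "continuous immersion beyond 1 m (depth/duration set by the manufacturer)",
}

def decode_ip(code: str) -> str:
    solids_digit, liquids_digit = code[2], code[3]  # e.g. "IP57" -> "5", "7"
    return f"{code}: {SOLIDS[solids_digit]}; {LIQUIDS[liquids_digit]}"

print(decode_ip("IP57"))
print(decode_ip("IP58"))
```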
While the camera can function on its own, it also integrates with Android and iOS devices that support Bluetooth 4.0 LE. The RE app that will be launching with the camera will work as a photo and video manager, and a live viewfinder. Photos can be easily shared with social networks, and automatically backed up to Dropbox or Google Drive.
HTC plans to launch the RE in the United States by October, and will expand to other regions afterward. We expect pricing in the US to be around 199 USD at launch. While it remains to be seen whether the RE is the first of many new action cameras, HTC seems to be exploring new product categories instead of following prevailing industry trends.
The AnandTech Guide to Video Card Overclocking Software
Video card overclocking has become a very popular topic amongst gamers and PC enthusiasts these days. With the release of next generation games around the corner and the growing popularity of resolutions beyond 1080p, overclocking is becoming increasingly important to users looking to squeeze the most performance possible out of their video cards. It’s been more than a decade since video card overclocking was first introduced, and while the core concept remains the same, the software has improved to make it easier and provide additional features and functionality. If you're looking to boost performance on your GPU, we're rounding up the most popular utilities to find out their pros and cons. Read on for the full guide of GPU overclocking software.
MSI GT72 Dominator Pro: Performance Preview
NVIDIA just launched their new GTX 980M/GTX 970M GPUs, and unfortunately we were unable to get a notebook in time for testing… which just changed a few hours ago. With an MSI GT72 Dominator Pro in hand, we're ready to see just how fast the GTX 980M is when it comes to playing games. Here's a hint: the 1080p display may prove to be the limiting factor in quite a few titles.
AMD Radeon R9 290 Series Prices Finally Begin To Fall
With the launch of NVIDIA’s Maxwell-powered GeForce GTX 900 series last month, it was immediately obvious that NVIDIA had been able to deal a swift blow to AMD’s product lineup by surpassing AMD’s performance while significantly undercutting their pricing. At the time we were expecting AMD to quickly respond with the necessary price cuts to keep the R9 290 series competitive with the GTX 900 series, but surprisingly even a week later this had yet to happen.
Now a bit over two and a half weeks after the GTX 900 series launch, we’re finally seeing Radeon R9 290 series pricing fall in response to NVIDIA’s launch. AMD has not announced an official price cut at this time – and admittedly neither AMD nor NVIDIA tend to announce reactive price cuts – so it’s not clear whether this is AMD’s doing, the board partners’, the retailers’, or most likely all three. But regardless, retail video card prices at Newegg and other etailers have seen some substantial drops that help bring back at least some balance between AMD and NVIDIA’s high end video card lineups.
A number of Radeon R9 290 cards can now be found for around $300 after rebate, with a couple more factory overclocked models at $310. With HIS, Sapphire, PowerColor, Asus, and XFX represented, this is a broad selection of vendors with a bit less than half of Newegg’s stock now at or around $300. Meanwhile R9 290X can be found for $399, again with a wide selection of vendors and roughly half of Newegg’s stock at or near that price. The remainder of Newegg’s stock in turn generally consists of heavily overclocked or otherwise premium cards that carried their own price premium before these latest cuts.
Speaking of AMD card prices, it should also be noted that AMD’s Never Settle Forever bundle is still active even after this round of price cuts. AMD and their partners will be continuing to try to influence the value proposition of their products by including free games.
For AMD these price cuts don’t come a moment too soon, and while they are going to help the competitive landscape I’m not convinced this is the last time we’re going to see AMD cut prices. As we discussed in our review of the GTX 970, comparing stock-to-stock, the $329 GTX 970 is every bit as fast as the now $400 R9 290X. If AMD wants to be price/performance competitive with NVIDIA then there’s still an additional $70 price difference between the two cards, a gap further muddied by AMD’s game bundle and NVIDIA’s superior energy efficiency. Strictly speaking $400 may not be low enough for the R9 290X, but no doubt AMD wants to see what sales are like at $400 before cutting prices on their single-GPU flagship any further.
The R9 290 on the other hand is in an interesting spot. At resolutions below 2160p it trails the GTX 970 by around 10%, but then again at $300 it’s also priced about 10% lower. Since it ships at a lower clockspeed than R9 290X a lot of AMD’s partners also goose the core clock on R9 290, which improves performance a bit but isn’t enough to close that 10% gap. What it does mean however is that at least so long as energy efficiency is not a concern, R9 290 is appropriately priced for its performance. However if energy efficiency is a concern, then AMD doesn’t have any kind of counter to GM204 at this time.
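Putting the last two paragraphs' numbers together, a quick price/performance comparison using this article's approximations (GTX 970 and R9 290X at rough performance parity, R9 290 about 10% behind; these figures are illustrative summaries, not benchmark data):

```python
# Price and relative performance as described above: GTX 970 as the 1.00
# baseline, R9 290X roughly at parity, R9 290 about 10% slower.
cards = {
    "GeForce GTX 970": (330, 1.00),
    "Radeon R9 290X": (400, 1.00),
    "Radeon R9 290": (300, 0.90),
}

def perf_per_dollar(price: float, rel_perf: float) -> float:
    return rel_perf / price

for name, (price, perf) in cards.items():
    print(f"{name}: {perf_per_dollar(price, perf) * 1000:.2f} relative perf per $1000")
```

On these numbers the R9 290 lands at effective price/performance parity with the GTX 970, while the R9 290X would need to fall to roughly $330 to match it, consistent with the $70 gap discussed above.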
If anything the one wildcard at this point is the availability of the new GeForce cards. Despite stock more-or-less holding up immediately post launch, we’ve seen both the GTX 980 and GTX 970 go out of stock in the last week. As of the time of this writing it looks like Newegg has received their Tuesday shipment, so there is stock available, but it’s a thin selection of just a few different cards (including a model or two at MSRP). For prospective buyers this means either playing inventory games or grabbing the AMD alternative, and for AMD this is all the more reason not to cut prices too drastically while GeForce availability is still limited. As for NVIDIA it’s been a while since we’ve seen them capacity constrained on the high end, so while it’s solid evidence that they’ve done everything right with the GTX 900 series launch, it does mean that they’re also going to be leaving sales on the table until supply and demand level out.
Fall 2014 GPU Pricing Comparison

AMD                              Price    NVIDIA
Radeon R9 295X2                  $1000
                                 $550     GeForce GTX 980
Radeon R9 290X                   $400
                                 $330     GeForce GTX 970
Radeon R9 290                    $300
Radeon R9 280X / Radeon R9 285   $250
Radeon R9 280                    $200     GeForce GTX 760
More GeForce GTX 980M/970M Notebooks
Continuing with our GTX 980M/GTX 970M coverage, I expect we'll see press releases and website updates from all of the major notebook vendors today or very soon, as all of them add the GeForce GTX 980M and 970M to their configurators. In order of when I received the news, here's a short list of vendors offering GTX 980M/970M notebooks.
Origin PC EON and EVO15-S
Origin tends to offer very high-end configurations with some customization options that you won't find at "lesser" vendors, though like most companies they don't actually manufacture the core chassis. Instead, they use "whitebooks" from ODMs like Clevo and MSI and then custom configure the components. For today's launch, Origin is updating their EON line and EVO15-S laptops; they're also looking to add a 4K display option on the EVO15-S. I'd love to see them offer a HiDPI panel on the larger EONs, as the EVO15-S tops out at the GTX 970M, and if you're trying to push 4K resolutions on a notebook you'll want every ounce of performance you can find. Anyway, there's not a whole lot to say about their updated offerings, other than you can head over to their GeForce 900M page to configure your notebook.
Gallery: Origin PC EON and EVO
Maingear Nomad and Pulse
Maingear is also updating their notebook offerings, which are based on MSI designs. As such, you can expect to see the same selection of notebooks that we covered in our MSI GTX 900M post. The Nomad 17 is the same core chassis as the MSI GT72, and it will be available with either the GTX 980M or 970M. The Nomad 15 matches up with the GT60 while the Pulse 15 and 17 equate with the GS60/GS70. For now Maingear appears to be focusing on the Nomad 17, which starts at $2100.
Eurocom
Eurocom is the first to send us a complete list of Clevo-based designs with GTX 980M/970M support. Their X3, X5, X7, and X8 will all support the new GPUs, and yes, that means you can order a GTX 980M SLI configuration if you really need every ounce of performance you can squeeze out of a desktop replacement. Eurocom also supports upgrading existing notebooks with the new GPUs, though the cost is often so high that you'd be better off selling your current system and simply buying a brand new notebook instead.
CyberPowerPC
CyberPowerPC is also offering customized builds based on the MSI GS60. The new Fangbook Edge is available in two configurations: a base model with a 1080p display, i7-4870HQ CPU, GTX 970M, 1x8GB RAM, and a 1TB HDD starting at $1689, and a 4K Fangbook Edge with the same specs but a 3840x2160 display starting at $1799. I'm actually quite happy that the age of quality notebook displays is finally upon us; $110 extra to go from 1080p to 4K isn't all that bad, as I remember several years back when upgrading from a garbage 1366x768 panel to 1080p could cost more than that. As for storage, CyberPowerPC offers a 240GB M.2 Kingston drive for $131, or you can get up to a 1TB 2.5" Samsung SSD if you're willing to pay for it; the $254 500GB Samsung 840 EVO is still the one to beat, and with the fixed firmware coming out it's probably safe to go that route again.
Gallery: CyberPower PC Fangbook Edge
AVADirect
AVADirect is also selling a full line of Clevo notebooks with either GTX 980M or GTX 970M. The current list consists of the Clevo P150SM-A, P157SM-A, P170SM-A, P177SM-A, and P377SM-A. That last one is an SLI configuration, though that seems more than a bit overkill without a 3K or 4K panel to go with it. The base price for the Clevo P377SM-A with a single GTX 980M is $2212 (with only a hard drive for storage), and adding a second GTX 980M increases the price by $638. If that's the standard price for a GTX 980M, it's actually not too bad; now all we need is the ability to buy one and upgrade existing notebooks, which is easier said than done.
NVIDIA GeForce GTX 980M and GTX 970M: Mobile to the Maxwell
Every year NVIDIA launches quite a few new products; some are better than others, but they're all interesting. This fall, the big news is Maxwell 2.0, aka GM204. Initially launched last month as the GTX 980 and GTX 970, NVIDIA is hopefully changing the way notebook gamers get treated by launching the mobile version of the GM204 just one month later. We already covered all of the new features in the desktop launch, but now we have specifications for the mobile versions. Read on for a preview of GM204 on notebooks as NVIDIA seeks to close the performance gap with desktops.
MSI Gaming Notebooks with GeForce GTX 980M and 970M
As noted in the conclusion of our GTX 980M/GTX 970M launch article, we're going to be covering some of the notebooks that are being announced today in separate Pipeline pieces. MSI has the largest selection of notebooks, with four of their existing products receiving updates to support 980M/970M along with one all-new design, the GT72 Dominator. We'll start with the GT72 Dominator, as it's obviously the most interesting option. There are seven different models of the GT72 being offered, with varying storage, RAM, and GPU configurations. Here's a brief overview of the specs:
MSI GT72 Dominator (Pro) Specifications

CPU: Core i7-4980HQ (2.8-4.0GHz) or Core i7-4710HQ (2.5-3.5GHz)
GPU: NVIDIA GeForce GTX 980M 8GB or NVIDIA GeForce GTX 970M 6GB
RAM: 12GB up to 32GB DDR3L-1600 (four SO-DIMM slots)
SSD: 128GB to 1TB M.2 SATA (2-4 SSDs in RAID 0 for 256GB and up)
HDD: 1TB 7200RPM
Optical: Super Multi (9.5mm) or BD burner (9.5mm)
Display: 17.3" Full HD eDP anti-glare (1920x1080)
Networking: Killer Gaming Network, Killer N1525 combo (2x2 802.11ac + BT 4.0)
I/O Ports: 6 x USB 3.0, flash reader (SDXC/SDHC), HDMI 1.4, 2 x mini-DisplayPort 1.2
Input: SteelSeries keyboard, multi-touch touchpad
Power: 9-cell battery, 230W AC adapter
Extras: Full HD webcam (1080p30), configurable multi-colored backlighting, anti-ghost keys
OS: Windows 8.1 Multi-Language
Dimensions: 16.85" x 11.57" x 1.89" (428mm x 294mm x 48mm)
Weight: 8.4 lbs (3.82kg)
MSRP: $2000-$3900
Fundamentally, there are two variants: the GT72 Dominator Pro comes with a GTX 980M while the GT72 Dominator comes with GTX 970M. All seven models come with some form of solid state storage (from 128GB with a single M.2 SATA SSD up to 1TB with four 256GB M.2 SATA SSDs in RAID 0), and they all include a secondary 1TB 7200 RPM hard drive. Pricing starts at $2000 for the GT72 Dominator with GTX 970M, while the top configuration tips the scales at nearly four grand ($3900) – obviously, the cost of four 256GB SSDs can add up. Finally, the NVIDIA GPUs seem to have twice the standard RAM on the GT72, so the GTX 970M comes with 6GB while the GTX 980M comes with 8GB GDDR5. These are clearly intended as high-end gaming systems, and the minimum price for a GT72 with GTX 980M is $2400, which will get you a Core i7-4710HQ, 2x8GB RAM, a 128GB SSD, and a Blu-ray burner (along with the other common features listed above).
Note that there have been some significant changes to the GT72 compared to the previous generation GT70 platform, though we haven't reviewed the GT72 (which launched last month) yet. MSI has reworked the chassis and motherboard to provide six USB 3.0 and two mini-DisplayPort 1.2 ports, and the new GT72 chassis also comes with dual cooling fans, which should address one of our biggest complaints with the previous GT70 design. Support for up to four M.2 SATA SSDs is also new. Needless to say, we're definitely looking forward to testing the GT72; MSI provided some photos of the GT72 in the gallery below.
Gallery: MSI Gaming Notebooks with GeForce GTX 980M and 970M
MSI GT60/GT70 Dominator Specifications

CPU: Core i7-4710HQ (2.5-3.5GHz)
GPU: NVIDIA GeForce GTX 970M 3GB
RAM: 8GB or 16GB DDR3L-1600 (four SO-DIMM slots)
SSD: 128GB or 2x128GB mSATA (none on base model)
HDD: 1TB 7200RPM
Optical: BD combo, or Super Multi on the base model
Display: GT70: 17.3" Full HD anti-glare (1920x1080); GT60: 15.6" Full HD anti-glare (1920x1080)
Networking: Killer Gaming Network, Intel 7260 (2x2 802.11ac + BT 4.0)
I/O Ports: 3 x USB 3.0, 2 x USB 2.0, flash reader (SDXC/SDHC), HDMI 1.4, 1 x mini-DisplayPort 1.2, 1 x VGA
Input: SteelSeries keyboard, multi-touch touchpad
Power: 9-cell battery, 180W AC adapter
Extras: Full HD webcam (1080p30), keyboard backlighting
OS: Windows 8.1 Multi-Language
Dimensions: GT70: 16.85" x 11.34" x 2.17" (428mm x 288mm x 55mm); GT60: 15.55" x 10.51" x 2.16" (395mm x 267mm x 55mm)
Weight: GT70: 8.6 lbs (3.91kg); GT60: 7.7 lbs (3.50kg)
MSRP: $1600-$2100
Moving on, the GT60 and GT70 Dominator have also been updated with the GTX 970M (but not the 980M). The core differences from the new GT72 are easily summarized. The GT70 provides four SO-DIMM slots but MSI only populates two of them on the new GTX 970M models. The SSDs are mSATA and you can get the GT70 with none, one, or two SSDs (plus a 1TB 7200RPM HDD). In terms of I/O, there are two USB 2.0 ports (one fewer USB port in total), a VGA port in place of one of the mDP ports, and the chassis is a bit thicker and heavier. MSI doesn't mention multi-colored keyboard backlighting, so it looks like they're going with a standard white backlight. Finally, the AC adapter is only 180W for these models and the WiFi is a standard Intel 7260 module. As for the GT60, there's only one model and it matches the base model GT70 in specs, so it's basically a bit lighter and has a smaller display.
MSI GS60/GS70 Specifications
Notebook: GS60 Ghost Pro; GS70 Stealth Pro
CPU: Core i7-4710HQ (2.5-3.5GHz) (both)
GPU: GeForce GTX 970M 3GB/6GB (both)
RAM: 12GB or 16GB DDR3L-1600 (two SO-DIMM slots) (both)
SSD: GS60: 128GB or 2x128GB M.2 SATA; GS70: 128GB to 3x256GB mSATA
HDD: 1TB 7200RPM (both)
Optical: N/A
Display: GS60: 15.6" Full HD eDP WVA (1920x1080) or 15.6" WQHD+ 3K IPS (2880x1620); GS70: 17.3" Full HD eDP Anti-Glare (1920x1080)
Networking: Killer Gaming Network, Killer N1525 Combo (2x2 802.11ac + BT4.0) (both)
I/O Ports: 4x USB 3.0, flash reader (SDXC/SDHC), HDMI 1.4, 2x mini-DisplayPort 1.2 (both)
Input: Steel Series keyboard, click pad touchpad (both)
Power: 6-cell battery, 150W AC adapter (both)
Extras: Full HD webcam (1080p30), multi-colored keyboard backlighting (both)
OS: Windows 8.1 Multi-Language
Dimensions: GS60: 15.35" x 10.47" x 0.78" (390mm x 266mm x 20mm); GS70: 16.47" x 11.29" x 0.85" (418mm x 287mm x 22mm)
Weight: GS60: 4.2 lbs. (1.91kg); GS70: 5.7 lbs. (2.59kg)
MSRP: GS60: $1900-$2300; GS70: $1900-$2600
Wrapping up, the GS60 and GS70 have both been updated with support for the GTX 970M as well. There are six GS70 models coming out and four GS60 models, with the GS60 having two 3K variants. The base model of each comes with 12GB of RAM, a 3GB GTX 970M, and a 128GB SSD; all of the other models have 16GB of RAM, with options for either 3GB or 6GB of VRAM. The GS60 only supports two M.2 SATA SSDs while the GS70 can hold three mSATA SSDs, and the top configuration of the GS70 comes with 768GB of SSD storage. The display options on the GS60 trump those of the GS70, however, as you can get either a wide viewing angle (IPS or similar) 1080p panel or a 3K IPS panel, while it appears the GS70 only has a standard 1080p display – most likely the same panel used in the GT70/GT72.
HP Splits In Half: Consumer & Enterprise Businesses To Separate
After a weekend of rumors spurred on by a Wall Street Journal report, HP has confirmed this morning that the company intends to split in half next year. The process will see each half become its own independent company, allowing for what amounts to HP’s enterprise and consumer divisions to go their separate ways. By doing so, HP is looking to allow each half to focus on one subset of HP’s overall business, allowing for more focused execution and growth while cutting the bonds that HP believes have made them slow to move in the past.
The split will see HP’s core businesses assigned to one of two companies. HP Inc., the closer of the two to an immediate successor of the current HP, will take HP’s PC and printing businesses, along with HP’s other consumer/mobile businesses such as the company’s Chromebooks and tablets. Internally these products are already organized under HP’s Printing and Personal Systems business, so in some sense this is merely moving a business that was its own division into its own company entirely. The split will also see the current EVP of the Printing and Personal Systems business, Dion Weisler, promoted to CEO of the new HP Inc. Finally, HP Inc. will retain the current HP branding.
Meanwhile the rest of HP’s businesses – servers, networking, storage, software, financial services, and other services – will all be split off together to form the new Hewlett-Packard Enterprise. As alluded to by the name, Hewlett-Packard Enterprise will be focused on HP’s enterprise businesses, where divisions such as the company’s networking business are potential rapid growth markets for HP. HP’s current CEO, Meg Whitman, will transition over to CEO of Hewlett-Packard Enterprise.
HP’s separation is in turn largely born of the fact that HP isn’t deriving much of an advantage from keeping all of their businesses under one roof. HP believes that splitting the companies will mean that each one is better focused on its respective market, without the heavy overhead of trying to manage all of these businesses as a single company. In other words, each half will be more flexible and agile than the combined whole. Practically speaking, HP has seemed conflicted between consumer and enterprise for some years now, and while it’s possible to do both things at once, it’s anything but easy. So in the absence of a better reason to remain a single company, HP seems content to let each half go its own way.
What’s interesting is that despite how much bigger Hewlett-Packard Enterprise would seem at first due to its mix of enterprise products, it’s only marginally larger than HP Inc. Based on HP’s most recent revenue, Hewlett-Packard Enterprise would be slightly more profitable than HP Inc. on roughly the same revenue, but from a financial basis at least this isn’t a clear case of ejecting the weaker company. That said, HP does seem more bullish on the growth opportunities for Hewlett-Packard Enterprise than HP Inc. (the latter will pay dividends, for example), which offers some additional rationale for why HP would want to split the company. In any case even split both companies will be quite large, clearing over $50B/year in revenue and putting them in the Fortune 50.
Ultimately HP expects the transaction to be completed by the end of the company’s fiscal year 2015 (Oct. 31, 2015), assuming regulatory approval and no other challenges. This split comes as the latest step in the company’s larger 5-year turnaround plan, which HP is now nearing the end of.
Finally, in light of this split it’s interesting to reflect on the state of HP after such a long period of acquisitions and mergers by the company. Over the years HP has acquired a large number of formerly high profile companies, including Digital Equipment Corporation (DEC), Compaq, VoodooPC, Palm, 3Com, and Electronic Data Systems (EDS) as part of a larger effort to build up what has become the megalith that they are now taking apart. The combined companies are still larger than the individual companies HP has acquired over the years, but between the previous spin-off of Agilent (HP’s former instruments & equipment business) and now the split into HP Inc. and Hewlett-Packard Enterprise, the individual companies are no longer the sort of mixed business conglomerates that HP has been for the last two decades.
MSI Z97 Gaming 5 Motherboard Review: Five is Alive
Sometimes it feels odd to review the cheaper elements of the motherboard market. The more expensive models have more to play with, whereas the sub $160 market for Z97 comes down to the choice of an individual controller or two. Here is where brand loyalty and styling seem to matter more than absolute feature set. To make matters worse for MSI, one of the other manufacturers is also branding their motherboards with ‘Gaming X’, making it harder to forge that nomenclature as a brand. Today we are looking at the MSI Z97 Gaming 5 at $160, which at the time of writing was sold out on Newegg.
ARMv8 Goes Embedded with Applied Micro's HeliX SoCs
We covered the news of the first shipment of 64-bit ARMv8 processors in the HP Moonshot product line earlier this week. At ARM TechCon 2014, Applied Micro (APM) had a very interesting update to their 64-bit ARMv8 product line. They launched two SoC families, HeliX 1 and HeliX 2. Both are based on the X-Gene ARMv8 cores developed for servers, but appropriately scaled down to fit the 8 W - 42 W TDP scenarios of the embedded market. HeliX 1 is fabricated in a 40 nm process, while HeliX 2 uses a 28 nm process. The latter also uses the second generation X-Gene ARMv8 core.
Applied Micro has traditionally been a PowerPC house. In fact, we have evaluated their Catalina networked storage platform in the Thecus N2310 and looked at the previous generation PowerPC SoC in the Western Digital My Book Live. However, in 2010, Applied Micro obtained an architecture license for ARMv8 (the 64-bit ARM architecture). Understanding that PowerPC was in decline, Applied Micro decided to devote all development resources to ARMv8. As part of this deal, all product lines based on the PowerPC architecture are being migrated to ARMv8 under the HeliX family.
APM is hoping to get HeliX into the embedded market, with a focus on the communication and networking, imaging, storage, and industrial computing verticals. They believe ARMv8 is the architecture of the future and had a number of companies (including Cisco, Netgear, Konica Minolta, Wind River and Canonical) voicing support for their strategy.
The two SoC product lines launched by APM yesterday were the APM887208-H1 (based on HeliX 1) and the APM887104-H2 (based on HeliX 2). The SoC block diagrams of both of these SoCs are provided below, along with a table summarizing and comparing the various aspects.
Applied Micro Helix Block Diagram
Applied Micro HeliX Family: APM887208-H1 (HeliX 1); APM887104-H2 (HeliX 2)
Cores: H1: 4 or 8 ARMv8 HeliX 1 at up to 2.4 GHz; H2: 2 or 4 ARMv8 HeliX 2 at up to 2.0 GHz
L1 Cache: 32 KB I / 32 KB D per core (write-through with parity protection) (both)
L2 Cache: 256 KB shared per core pair (with ECC) (both)
L3 Cache: H1: 4 or 8 MB shared; H2: 2 MB shared
DRAM: H1: 2x DDR3 controllers with ECC (72b each); H2: 1x DDR3 controller with ECC (72b)
On-Chip Memory: H1: 1 MB; H2: 256 KB
Memory Bus Width: 256-bit (both)
Low Power Features: H1: N/A; H2: < 250 mW standby
Coprocessors: H1: 4x Cortex-A5 at 500 MHz; H2: N/A
Ethernet: H1: 2x 10G + 4x 1G + 1x 1G management; H2: 1x 10G + 4x 1G
PCIe: H1: 17x PCIe 3.0 lanes (2 x8 + 1 x1, or 1 x8 + 2 x4 + 1 x1, or 4 x4 + 1 x1); H2: 3x PCIe 3.0 (2 x1 or 1 x4)
USB: H1: 2x USB 3.0 host; H2: 2x USB 3.0 host + 1x USB 3.0 host/device
SATA: H1: 6x SATA III (four muxed with the 4x 1G Ethernet); H2: 1x SATA III
Applied Micro HeliX 2 Block Diagram
The HeliX SoCs are sampling right now and slated to go into volume production in 2015. Applied Micro claims that design wins are already in place. From ARM's perspective, one can say that the juggernaut rolls on. With Cavium's Project Thunder and Broadcom's Vulcan targeting the high-end enterprise and datacenter segment, ARM needed an entry in the mid- to high-end embedded space currently dominated by MIPS64 and x86-64. The Applied Micro HeliX family brings ARM forward as a credible competitor for those sockets.
Antec EDGE 550W Power Supply Review
Today we are looking at Antec's latest PSU series, the EDGE, which the company markets as "the pinnacle of power supplies". Bold statements aside, only medium capacity units are available and silence seekers are their main target. We're reviewing the lowest capacity model of the series, with a maximum output of just 550 Watts, which means this is a PSU that could be used by a larger number of users. Let's see how it performs.
ASRock Z97 OC Formula Motherboard Review: Less Lamborghini, More Yellow
ASRock is quietly confident in its OC Formula range. We gave an award to the Z77 version for its aggressive positioning at the $240 price point, and while the Z87 model offered even more, at $330 it missed the sub-$250 market on which cheaper overclocking builds are based. The Z97 OC Formula ditches the Lamborghini on the box and comes back down to earth at $210, although the feature set becomes lighter as a result. The mainstream overclocking motherboard market is always hot around $200, so today we are putting the Z97 OC Formula through its paces.
Benchmarked - Metro: Last Light Redux
Last month 4A Games released updated versions of the two earlier games in the Metro series, Metro 2033 Redux and Metro: Last Light Redux. Both games have been remastered using the latest version of the 4A Engine, with updates for the latest generation of console hardware among other things. Fundamentally, that means less for Metro: Last Light than it does for Metro 2033, but there are still some visual changes, and that potentially means performance changes as well. We've been using Metro: Last Light as one of our gaming performance benchmarks almost since it first came out in May 2013, and it's still one of the most demanding games around. Of course part of that stems from the use of super-sampling anti-aliasing at the highest quality settings, but even without SSAA Metro: Last Light can be a beast.
Something we've wanted to do more in the past is to provide smaller updates looking at the performance of recent game releases. Our GPU reviews do a good job of giving a broad overview of the performance from the latest graphics cards on a smaller subset of games, and it's basically impossible to test every new GPU on every game at the time of launch. But if you're in the market for a new GPU, you probably want to use it for playing games, which means seeing how new games perform on a selection of hardware is useful. To be clear, we're not replacing our GPU reviews, but we hope to augment our other coverage with increased coverage of the recent gaming releases.
It's worth noting that testing gaming performance at the time of launch may also tell an interesting story about the state of drivers from the various GPU companies. AMD and NVIDIA are the two obvious participants, but with Intel continuing to increase the performance of their Processor Graphics solutions it's also important to see how they fare with new releases. In some cases we may see serious performance issues or rendering errors early on, and if/when that happens we may elect to revisit the performance of certain games a month or two after launch to see what has changed. We've encountered instances in the past where drivers tended to target and fix issues with the most commonly benchmarked games, and while things are certainly better these days it's always good to look at empirical data showing how the various companies stack up.
With that out of the way, let's see what has changed with Metro: Last Light Redux, both in terms of graphics as well as performance. Starting with the former, in most areas you'll be hard pressed to see substantial differences. The most noteworthy exception is the use of red lights and smoke in place of white lights/smoke in some areas; this is particularly apparent in the built-in benchmark. There also appears to be more tessellation in some areas, and at the end (when the "train" gets blown up), you can see in Redux that there's more deformation/destruction of the concrete barrier. I've created a split-screen video showing the original Metro: Last Light on the left and Metro: Last Light Redux on the right. The games were both run at 1080p maximum quality settings, with Advanced PhysX disabled. (Note that with video recording I limited the frame rate to 30 FPS, so disregard the performance shown in that clip.)
Other than the aforementioned changes in lighting color for the smoke, it's difficult to say how much the graphics have improved versus simply being different from the initial release. I've benchmarked Metro: Last Light hundreds of times over the past year (perhaps even thousands), but I have to admit that I haven't actually taken the time to play the game that much, so many of the more subtle changes might go unnoticed.
The list of updates notes that there are graphical upgrades, lighting enhancements, improvements to the gameplay and gunplay, and Redux also includes all of the DLC released for the original game. There have been some updates to certain maps/areas as well, all the weapons that were added via DLC are integrated into the game, and there are some minor UI tweaks (e.g. you can check your watch and inventory as in the original Metro 2033). Finally, there are new achievements/trophies along with two new modes – Spartan and Survival – in Redux. Spartan is basically the way the original Last Light worked (more run-and-gun gameplay, more ammo, not as "hard") while Survival mode is more like the original Metro 2033 (less ammo and health, more difficult enemies). From what I can tell, though, having more (or less) ammo in either game doesn't really change things too much.
But what about performance – is Metro: Last Light Redux any faster (or slower) at rendering its updated graphics compared to the original? To answer that, I've got a rather different set of hardware than what Ryan uses for our GPU reviews, as all of the hardware has been purchased at retail over the past year or so. For now I'm going to focus on single GPU performance, and while I do have a moderate collection of both AMD and NVIDIA GPUs, for the time being my hardware is slanted more towards high-end offerings than lower tier parts. On the laptop side, we'd also like to thank MSI for letting us use three of their latest notebooks, the GT70 Dominator Pro with GTX 880M, the GS60 Ghost Pro 3K with GTX 870M, and the GE60 Apache Pro with GTX 860M. Here's the short list of hardware that I've used for testing:
Gaming Benchmarks Test Systems
CPU: Intel Core i7-4770K (4x 3.5-3.9GHz, 8MB L3), overclocked to 4.1GHz
Motherboard: Gigabyte G1.Sniper M5 (Z87)
Memory: 2x8GB Corsair Vengeance Pro DDR3-1866 CL9
GPUs:
Gigabyte Radeon HD 6970
Sapphire Radeon R9 280
Sapphire Radeon R9 280X
Gigabyte Radeon R9 290X
EVGA GeForce GTX 770
EVGA GeForce GTX 780
Zotac GeForce GTX 970
Laptops:
GeForce GTX 880M (MSI GT70 Dominator Pro)
GeForce GTX 870M (MSI GS60 Ghost 3K Pro)
GeForce GTX 860M (MSI GE60 Apache Pro)
Storage: Corsair Neutron GTX 480GB
Power Supply: Rosewill Capstone 1000M
Case: Corsair Obsidian 350D
Operating System: Windows 7 64-bit
The obvious omission here is the new GeForce GTX 980, though we're also missing GTX 780 Ti, R9 290, not to mention all of the mainstream GPUs like the GTX 750/750 Ti, the whole AMD R7 series, etc. The good news is that the laptops at least give us some idea of what to expect from such cards – the GTX 860M for instance is clocked very similarly to the GTX 750 Ti, and GTX 870M is similar to the OEM GTX 760 192-bit. Again, we'll work on improving the selection of cards tested and try to cover a broader range in the future, but for now let's see how performance differs between the two releases of Metro: Last Light.
We've tested at 1080p with maximum quality (Very High Quality) and we also ran a second test at 1080p with High Quality and without SSAA. In both cases we're testing without enabling Advanced PhysX. While PhysX can make a noticeable difference at times (the Batman games being a prime example), I can't say I've noticed anything but lower frame rates from the feature in the Last Light benchmark – it basically drops performance about 10-15% on NVIDIA cards, and minimum frame rates in particular can be very poor. Advanced PhysX also seems to cause issues with some NVIDIA cards (see below). Our settings then are essentially "Ultra" quality and "High" quality; here's what performance looks like for the two releases on our selected hardware:
So this is where things get interesting. At our maximum quality settings, performance is lower almost across the gamut of hardware with Metro: Last Light Redux. The R9 280 and MSI GE60 are the two exceptions, where performance basically stays the same; everywhere else we see anywhere from a 2% to an 11% drop. When we drop the quality settings a notch and disable SSAA on the other hand, Redux performance is only slightly lower (essentially the same) in one instance, and that's the HD 6970; all of the newer GPUs are anywhere from 10% to 19% faster. That could mean that optimizations have been made for all the modern GPUs but they just don't translate as well to SSAA performance.
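For reference, the percentage figures above are simple relative changes in average frame rate between the two releases. A minimal helper shows the calculation; the FPS values below are made up purely for illustration and are not our measured data:

```python
def percent_change(original_fps, redux_fps):
    """Relative FPS change of Redux vs. the original release (positive = faster)."""
    return (redux_fps - original_fps) / original_fps * 100.0

# Hypothetical averages for illustration only:
print(round(percent_change(60.0, 53.4), 1))  # -11.0 (an 11% drop)
print(round(percent_change(80.0, 95.2), 1))  # 19.0 (a 19% gain)
```

The same helper works in either direction, which is why a card can show a drop at Ultra settings and a gain at High settings from the very same patch.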
As far as AMD vs. NVIDIA, similar to what we saw in our recent GTX 970 review NVIDIA's new "budget friendly high-end GPU" basically offers performance on par with AMD's top of the line R9 290X at a much lower price. GTX 970 also tends to be roughly the same level of performance as GTX 780, with the 780 now being cleared out at lower prices. The GTX 770 meanwhile offers roughly the same performance as the R9 280X, though in this case the AMD GPU has the lower price, but of course GTX 770 is being phased out in favor of GTX 970 as well.
One other item worth mentioning is that I noticed my Zotac GTX 970 GPU was a bit flaky with Redux, particularly at even higher settings (e.g. 2560x1440 maximum or 1080p high quality, with Advanced PhysX). I was running at the card's stock settings initially (which have a mild 26MHz bump on the base GPU clock), and I thought perhaps temperatures were getting too hot on some components. It turns out the real culprit is Advanced PhysX, which tends to crash Redux every few minutes on the GTX 970.
I haven't tested with PhysX extensively, but some additional testing of the GTX 780 also showed crashes with PhysX enabled (but it takes about twice as long as the GTX 970 to crash to the desktop, so 10 minutes instead of five minutes). Either Metro: Last Light Redux has some poorly implemented PhysX code, and/or NVIDIA may need to tweak their drivers for Redux to achieve stability at certain settings with Advanced PhysX enabled. This is definitely a fringe case, however, so it's not likely to affect a lot of users either way.
Overall, the Redux release of Metro: Last Light won't be any more – or less – playable on most systems than the original game. Metro 2033 Redux, of course, saw a much greater overhaul in terms of graphics and gameplay, which means its system requirements are higher than the original game's, likely at the same level as Last Light Redux. In other words, if you're looking for the poster child of why gamers might want SLI or CrossFire builds, the Metro Redux games are right up there with other punishing titles like the Crysis series, at least if you want to crank up every quality setting. SSAA is as usual a major hit to performance, so turning it off can boost performance by almost 100% at the cost of having jaggies.
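As a rough back-of-the-envelope model of why disabling SSAA helps so much: if a game is mostly pixel-shading bound (an approximation, since geometry and CPU costs don't scale with resolution), frame time grows roughly linearly with the number of shaded pixels. A quick sketch under that assumption:

```python
def fps_with_ssaa(fps_without_ssaa, pixel_multiplier):
    """Naive fill-rate model: frame time scales linearly with shaded pixels.
    pixel_multiplier is how many times more pixels SSAA shades."""
    return fps_without_ssaa / pixel_multiplier

def uplift_from_disabling(fps_without_ssaa, pixel_multiplier):
    """Percent FPS gain from turning SSAA off, under the same naive model."""
    with_ssaa = fps_with_ssaa(fps_without_ssaa, pixel_multiplier)
    return (fps_without_ssaa - with_ssaa) / with_ssaa * 100.0

# With roughly 2x the pixel load, disabling SSAA doubles the frame rate,
# in line with the "almost 100%" boost noted above:
print(uplift_from_disabling(60.0, 2.0))  # 100.0
```

Real results deviate from this linear model whenever something other than shading is the bottleneck, which is part of why the measured gains cluster near, rather than exactly at, 100%.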
And on a final note, there's a huge onslaught of games coming, and we're hoping to test many of them in a format similar to this. Your feedback is welcome as always, and if you have any requests for games that are already available or coming soon that you'd like to see benchmarked, let us know. Also let us know if you'd like to see additional settings tested; I confined the results reported to 1080p at High and Ultra quality, but we could certainly run other settings. Since these are all single GPU configurations, 2560x1440 with Redux proves to be too much in most cases, unless we drop SSAA; the laptops meanwhile might benefit from 1920x1080 and medium quality settings, though that's a bit too light on the faster desktop GPUs. Anyway, let us know what you'd like to see.
PlayStation Plus October 2014 Free Games Preview
October is now here, and so is the news from Sony about which games will be available this month for subscribers to PlayStation Plus. For the last couple of months, PlayStation Plus members have been fortunate enough to get access to a brand new game, and this month Sony has pulled out all the stops with three brand new games across the respective platforms in the PlayStation Plus lineup. Two of those new games are on the PS4, with the final one on the PS Vita. There is also a bonus game for October. The free games will be available starting on the 7th of October.
PS4

Dust: An Elysian Tail

PS4 owners get access to two games that are brand new to the store. The first is Dust: An Elysian Tail from independent developer Dean Dodrill. This action role-playing game was first released on Xbox Live Arcade in August 2012, and has now made the jump to the PlayStation store. Dust presents itself as a 2D side-scrolling game centered on the main character, Dust, who is united with a sentient sword and its guardian at the beginning of the game. Dust was well received by critics and users alike, with a Metascore of 83 and a User Score of 7.9 for the Xbox 360 version on Metacritic.
“Immerse yourself in a gorgeous hand-painted world on a search for your true identity. As the mysterious warrior, Dust, your action-packed journey will take you from peaceful glades to snowy mountaintops and beyond. At your disposal is the mythical Blade of Ahrah, capable of turning its wielder into an unstoppable force of nature, and the blade’s diminutive guardian, Fidget.”
Spelunky

The second new game available for the PS4 is Spelunky, from independent developer Derek Yu. While new to the PS4 store, the game itself was first released on the PC in 2008, and later came to Xbox Live Arcade, the PS3, and the Vita. Players control an adventurer known as the spelunker. This platformer is set in underground tunnels, and players gather treasure while avoiding enemies and traps. The PS3 version scored an 83 Metascore and 7.7 User Score on Metacritic. Spelunky will be cross-buy with the PS3 and Vita.
“Spelunky is a unique platformer with randomized levels that offer a challenging new experience each time you play. Journey deep underground and explore fantastic places filled with all manner of monsters, traps, and treasure. Go solo or bring up to three friends to join you in cooperative play or frantic deathmatch!”
DriveClub PlayStation Plus Edition

The bonus game is DriveClub PlayStation Plus Edition. This is also a new game to the store, developed by Evolution Studios. DriveClub, as you can surmise from the name, is a racing game, though not a simulator in the vein of Gran Turismo. Instead, players form clubs of up to six players and complete challenges on road courses. The PlayStation Plus edition available for free has access to eleven tracks and ten cars. Players who want to upgrade to the full game can do so for $49.99.
“DRIVE TOGETHER, WIN TOGETHER. Fuel the thrill of high-octane racing by downloading DRIVECLUB PlayStation Plus Edition. Take a test drive of the world’s most social racer: Features unrestricted access both offline and online to the 11 tracks of India & 10 cars for your garage.”
PS3

Batman: Arkham Asylum

Much like the Xbox 360, the PS3’s vastly larger game catalog opens up a lot more content than new-release indie games. The first free game for the PS3 is Batman: Arkham Asylum. This is the first of the Batman Arkham games, released in 2009 by Rocksteady Studios. This action-adventure game has a great story told through a third-person perspective. Arkham Asylum holds the Guinness World Record for “Most Critically Acclaimed Superhero Game Ever”, and of course, who doesn’t want to play as Batman? Arkham Asylum has an amazing 91 Metascore and 8.9 User Score on Metacritic. It normally sells for $19.99.
“Batman: Arkham Asylum exposes players to a unique, dark and atmospheric adventure that takes them to the depths of Arkham Asylum –Gotham’s psychiatric hospital for the criminally insane. Gamers will move in the shadows, instigate fear amongst their enemies and confront The Joker and Gotham City’s most notorious villains who have taken over the asylum. Using a wide range of Batman’s gadgets and abilities, players will become the invisible predator and attempt to foil The Joker’s demented scheme.”
Dungeons & Dragons: Chronicles of Mystara

The second PS3 game is Dungeons & Dragons: Chronicles of Mystara. This action role-playing game was released August 22, 2013, and comes from developer Iron Galaxy Studios. It is a compilation of D&D Tower of Doom and D&D Shadow over Mystara, both classic D&D arcade games from the 1990s. Chronicles of Mystara got an 83 Metascore and 6.9 User Score on Metacritic, and normally sells for $14.99.
“Dungeons & Dragons: Chronicles of Mystara combines two timeless D&D arcade classics -Tower of Doom and Shadow over Mystara- into one definitive package. Battle through a rich fantasy universe with a host of new features, including HD visuals, drop-in/drop-out 4 player online co-op, customizable House Rules, leaderboards, and a trove of extras.”
PS Vita

Pix the Cat

The first PS Vita game is also new to the store: Pix the Cat. Pix the Cat is an arcade game that follows up Pix’n Love – Rush from Pastagames. Save the ducklings in this 2D level-based scoring game, which will also be available on the PS4.
“PIX the CAT is an intense arcade game designed to boost your heart-rate! Rescue forsaken ducklings from the nested levels of the infamous GRID of INFINITY. Perfect your skills to SPEED and COMBO UP until you reach the explosive FEVER TIME!”
Rainbow Moon

The final game for October is Rainbow Moon, developed by SideQuest Studios and originally released in July 2012 on the PS3, then made available on the Vita in December 2013. This tactical role-playing game features turn-based combat and allows the player to explore its world. Rainbow Moon scored a 70 Metascore and 7.8 User Score on Metacritic. It normally sells for $14.99, and is also available on the PS3.
“Rainbow Moon is a beautiful role-playing game filled with exploration, turn-based battles, and character development. Six playable main characters with upgradeable weapons and armor, and more than 20 challenging dungeons await you in a fascinating story that lasts over 40 hours.”
Another month, more games. PlayStation Plus members get a good assortment this month from indie arcade games right up to AAA titles like Batman: Arkham Asylum. All of the games will be available as of October 7th.
ARM Announces “mbed” IoT Device Platform
Following up on the incredible success of smartphones, tablets, and other handheld-size mobile devices, device manufacturers have been toying with ideas on what comes next. A common theme across many of these ideas has been the Internet of Things concept, which sees microcontrollers and Internet connectivity embedded into increasingly small or otherwise unusual devices where network connectivity wasn’t present before. From a technology perspective this is an exercise in seeing what you can do with products and environments where everything is networked, and meanwhile from a market perspective this is about driving the next wave of market growth for chip and device manufacturers.
But while the IoT concept seems simple on the surface – just put a chip in everything – what device makers have found out is that they have yet to discover what a good implementation would look like and what parts they need to build it. For that reason we have seen numerous companies toy with the IoT concept over the last year, launching new hardware components such as Intel’s Edison and the ARM Cortex-M7, or software components like MediaTek’s LinkIt. Which brings us to today’s news from ARM and their latest IoT project.
Being announced today at ARM TechCon 2014, ARM is unveiling their “mbed” (all lower case) IoT Device Platform, which is ARM’s new software platform for IoT devices and the servers feeding them. mbed is a surprisingly blunt project from ARM, who thanks to the success of their platform and their work over the years on their Cortex-M series of CPUs already has most of the hardware in place. Confident in their hardware, ARM is seeking to tackle what they see as the current issue holding IoT back: software.
The mbed platform is a combination of client and server software, consisting of a lightweight OS for client devices – mbed OS – and the matching server software to interact with it, the mbed Device Server. Like ARM’s hardware IP, both mbed OS and mbed Device Server are intended to be building blocks for finished products: the idea is that developers will take the mbed components and build the application logic they need on top of a solid software foundation provided by ARM. With the OS and Device Server providing built-in support for the necessary networking, wireless, and security standards, ARM’s thinking goes, IoT software development will be dramatically simplified. ARM will have taken care of the hard parts, leaving developers free to focus on the application logic itself, reducing development costs and time to market.
For the mbed OS component, the OS is a lightweight, low-power kit OS designed to run on Cortex-M processors. ARM for their part will build in the necessary hardware features and even some common libraries, with a focus on providing building blocks for developers looking to design finished products. At this point ARM’s announcement predates the software by a bit, so mbed OS is still under development, with early access for partners this quarter and a shipping version in 2015.
Meanwhile the mbed Device Server is essentially a software bridge designed to allow mbed OS devices to interact with web services. Unlike the stand-alone OS, Device Server is intended to be integrated into larger (cloud) server setups, with Device Server providing the means to easily interact with and manage mbed OS clients. As ARM likes to note, the Device Server is based around open standards, with the idea being that they’re providing another kit (this time for the server) to give developers a place to start rather than creating a closed ecosystem around the mbed OS and Device Server components. Finally, unlike the OS, the Device Server is already running and available to developers.
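The register/update/read pattern ARM describes can be pictured with a toy sketch. Note that none of these class or method names come from ARM's actual (and at this point unreleased) mbed APIs; this is purely an illustration of the division of labor, with constrained devices pushing readings to a device server, and web services querying the server rather than the devices:

```python
# Toy sketch of the device-registration pattern behind mbed Device Server.
# All names here are hypothetical, not ARM's API.

class DeviceServer:
    """Stands in for mbed Device Server: tracks devices and their resources."""
    def __init__(self):
        self.registry = {}

    def register(self, device_id, resources):
        # A real device server would speak open standards (e.g. CoAP) here.
        self.registry[device_id] = dict(resources)

    def update(self, device_id, resource, value):
        self.registry[device_id][resource] = value

    def read(self, device_id, resource):
        # Web services query the server, never the constrained device directly.
        return self.registry[device_id][resource]


class SensorNode:
    """Stands in for an mbed OS client running on a Cortex-M device."""
    def __init__(self, device_id, server):
        self.device_id = device_id
        self.server = server
        server.register(device_id, {"temperature": None})

    def report(self, reading):
        self.server.update(self.device_id, "temperature", reading)


server = DeviceServer()
node = SensorNode("node-01", server)
node.report(21.5)
print(server.read("node-01", "temperature"))  # 21.5
```

The point of the split is that application developers only write the equivalent of `report()` and the web-service logic; the transport, security, and device-management plumbing is what ARM intends mbed to supply.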
In the short term mbed is all about kickstarting the IoT market, which is a big reason as to why ARM is giving away large chunks of it for free. The mbed OS is entirely free, and the Device Server is free for development. Only production setups running Device Server would need to pay a license fee. ARM wants to get mbed spread as widely as possible, and with their strong position in the hardware market they are more than willing to give away the software if it will spur on IoT hardware sales. Or as they see it, the time it takes to develop good software is currently gating the sales of products incorporating their IoT-focused hardware.
Looking at the bigger picture, while ARM has the right idea overall they are not the only company pursuing this software/toolkit market. Intel’s Edison platform ships with its own OS, and MediaTek’s LinkIt platform also includes its own LinkIt OS for many of the same purposes. However, in all of these cases ARM and other companies can only provide the building blocks; they still must rely on developers to put together the killer app that will jumpstart the market. Largely for this reason the potential success of the IoT market is out of the hands of ARM and other hardware IP providers, but by providing a solid set of building blocks and working with a very large set of partners such as Marvell and Freescale, they are doing what they can to make that success happen.
A New Windows - Windows 10 Announced By Microsoft
It was only two years ago that Windows 8 was unleashed on the world. Microsoft tried to usher in an era of “Touch First” applications with a new look and feel for Windows. To say that Windows 8 was unsuccessful would be an understatement; from both Microsoft’s and users’ perspectives, it was certainly a failure. Two years in, Windows 8 and its 8.1 derivative have struggled to gain market share over Windows 7 and XP, which still command the lion’s share of the desktop OS pie. A new interface, unfamiliar to users, did little to sway their wallets, and other market factors have come into play as well.
Looking back at Windows 8, it was a big change from a company that has traditionally been called too conservative. Gone was the familiar start menu, replaced with a full screen version. Gone was the ability to move and resize applications into “windows”, replaced with full screen apps that took command of your desktop. Gone was a lot of what made Windows, well, Windows. Add in the fact that Windows 8 at launch was only half complete, and it is not surprising that the market did not buy into the new world. There were two disjointed interfaces, and every user had to interact with both no matter what form factor they were on. On a touch based tablet, many of the settings and controls were still found in the old Control Panel applets, and the file explorer lived on the desktop, which was difficult to use with touch. On the other side of the coin, traditional desktop PC users had to learn the new Start Screen, charms, and other controls which were clearly made for “touch first” rather than the mouse and keyboard. Within weeks of the Windows 8 launch, major players in its creation were let go or given new duties.
Since the day one release of Windows 8, Microsoft has been trying to fix many of the issues people have with the new version of their operating system, and Windows 8.1 was a step in the right direction, improving both the touch interface and the desktop. Windows 8.1 Update, announced at BUILD earlier this year, was a truly desktop-centric update, with new keyboard and mouse controls for use in the touch environment and the ability to control Windows Store apps with a title bar. It was a big help, but BUILD also showed off features coming in a later version of Windows, and that version has now arrived.
Windows 10 is as dramatic a shift from Windows 8 as Windows 8 was from Windows 7. Gone is the start screen on the desktop, with the familiar start menu back. Gone are the full screen applications taking over your computer, with those applications now relegated to windows as before. Gone is the touch first interface on top of an operating system primarily used with a keyboard and mouse. However this is not Windows 7.1, nor should it be. Windows 8 certainly had its faults, but not everything about it needed to be thrown out.
Windows 10 starts its journey as the Windows Technical Preview for Enterprises. Microsoft’s core customer base is the enterprise, and it is significant that they are starting the discussion with this market this time around. Beginning tomorrow, anyone can join the Windows Insider Program and download and install the latest version of Windows for themselves. Microsoft has made it clear, though, that this preview is exactly that: a preview, not meant for general availability. Expect some rough edges and some bugs, which should be worked out by the time the OS ships. As for the consumer side of the story, Microsoft is planning to announce more on that front in early 2015, and for developers, BUILD will be coming sometime after that. The actual Windows 10 launch is listed as “later in the year”, with the year being 2015.
There is a lot to go over, and once we get our hands on the preview build we can dig into the new OS and give a full breakdown of what is new. One of the biggest complaints about Windows 8 and 8.1 is that real people do not want a single interface on every device they own; they want a user interface tailored to the usage model. With Windows 10, Microsoft promises to address this.
Before we can talk about Windows 10 though, we need to briefly discuss the Windows Store. Windows Store apps execute in WinRT, the Windows Runtime, which replaces the old Win32 APIs. WinRT has some advantages as a new framework, such as native resolution independence and support for Windows contracts such as Share. At BUILD 2014, Microsoft announced Universal Apps, a key feature of the Windows Store that is not available on any other platform. There is a lot of confusion as to what a Universal App is; it is not a single application that runs unchanged on a phone, PC, tablet, and console. Rather, a Universal App leverages the common WinRT framework available in Windows, Xbox One, and Windows Phone, allowing a developer to share a common code base but use a suitable UI for each system, with all of it available on every platform seamlessly through the Windows Store. It is certainly a lofty idea, and one that has gained a bit of traction in the store. With Windows 10, the concept of a Universal App lets a developer target a phone, Xbox, tablet, and desktop at once. If anything is the killer feature of Windows 10, this could be it. Time will tell of course, and developers need to buy into WinRT for this to be a reality. Today’s announcement is not developer focused, so we expect more news on WinRT API updates later, at the BUILD conference.
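The shared-code-base, tailored-UI idea behind Universal Apps can be sketched in miniature. A real Universal App would be written in C#/XAML (or C++/JavaScript) against WinRT; the Python below, with invented class names, only illustrates the architecture of one shared core with a per-device presentation layer:

```python
# Illustrative sketch of the Universal App split: shared logic, per-device UI.
# All names are hypothetical; this is not WinRT code.

class NoteStore:
    """Shared application logic: identical on phone, tablet, desktop, Xbox."""
    def __init__(self):
        self.notes = []

    def add(self, text):
        self.notes.append(text)

    def count(self):
        return len(self.notes)


class PhoneUI:
    """Per-device presentation: compact, single-column layout."""
    def render(self, store):
        return f"notes: {store.count()} (list view)"


class DesktopUI:
    """Per-device presentation: roomier, multi-pane layout."""
    def render(self, store):
        return f"notes: {store.count()} (two-pane view)"


store = NoteStore()               # one code base...
store.add("buy milk")
store.add("finish review")
print(PhoneUI().render(store))    # ...with a UI suited to each device
print(DesktopUI().render(store))
```

The business logic in `NoteStore` never changes between devices; only the thin UI layer does, which is the code-sharing win Microsoft is pitching to developers.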
The first place to start is going to be Start. On Windows 7, clicking the Start button brought up the Start Menu. Windows 8 dropped the start button altogether, and while 8.1 brought it back, it opened the Start Screen. On tablets the start screen was fine, but on a desktop it could be unwieldy, interrupting your workflow to bring you into a new environment where you found and launched the application you wanted. Windows 10 brings back the Start Menu, but with a twist: rather than just the traditional menu of Windows 7, the familiar start menu can now also be populated with Live Tiles from Windows Store apps. But this is only on the desktop. Tablets will get a different interface, as will the phone. To quote Microsoft: “We’re not talking about one UI to rule them all – we’re talking about one product family, with a tailored experience for each device.”
If Windows 10 is going to be successful, the tailored experience for each device is the key. The new start menu is just the first step towards that, and is especially important for the enterprise and desktop user.
The next interface change was also announced at BUILD: the ability to run Windows Store apps within a window on the desktop. This is a big change for two reasons. First, on a desktop, full screen Windows Store apps are less useful; you generally have multiple things going on at once, and having a single app take over the screen is rarely ideal. Windowed operation instantly makes Windows Store apps more useful. Second, it matters for developer buy-in. Even though Windows 8 did not light the world on fire as far as unit sales go, it is still on hundreds of millions of devices, and the majority of those devices are traditional desktops. Writing an application for the Windows Store practically precluded use by the majority of the user base; by putting these apps on the desktop, Microsoft opens up a much larger potential audience. Microsoft needs the Windows Store to be kick-started, and this is one way they can advance that goal.
Windows Snap also debuted in Windows 8, allowing two Windows Store apps to be snapped open, with one taking about 70% of the screen and the other using 30%. For multitasking it was certainly better than other mobile operating systems from 2012, but it was a long way from Windows 7. Windows 8.1 improved Snap, allowing more than two apps to be open on the screen at any one time and the snap percentage to be changed. Windows 10 now offers another update to Snap: apps can be snapped to all four corners, giving more real estate to each app than before. Snap was a good feature, and this is a further improvement on it.
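The four-corner arrangement can be pictured as simple quadrant arithmetic. This hypothetical sketch (not how Windows actually computes its layouts) divides a screen into four equal regions, one per snapped app, each described as an (x, y, width, height) tuple:

```python
def corner_snap(screen_w, screen_h):
    """Split a screen into four equal quadrants, one per snapped app.
    Hypothetical illustration only; Windows' real layout logic is not public."""
    half_w, half_h = screen_w // 2, screen_h // 2
    return {
        "top_left":     (0,      0,      half_w, half_h),
        "top_right":    (half_w, 0,      half_w, half_h),
        "bottom_left":  (0,      half_h, half_w, half_h),
        "bottom_right": (half_w, half_h, half_w, half_h),
    }

# On a 1080p screen, each snapped app gets a 960x540 region.
print(corner_snap(1920, 1080)["bottom_right"])  # (960, 540, 960, 540)
```

The contrast with Windows 8's fixed 70/30 split is the point: four independent regions give each app a usable rectangle rather than a sliver.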
Another long requested feature is now coming to Windows 10 – multiple desktops. Desktops can be designated for different purposes, and users will be able to easily switch among them. There is a small but vocal group who have been asking for this for a long time, and they have finally been rewarded.
Sticking with the enterprise features, data security is always a big concern. Multifactor authentication based on smart cards or tokens is now built right into the OS. Bitlocker is still around, offering full device encryption, but Windows 10 now offers application and file level data separation, which can enable data protection even if that data leaves the device. Though Microsoft has not gone into much detail as to how that is done, it likely leverages some of their other technologies such as Active Directory Rights Management Services.
Future updates to Windows should be easier for IT workers as well due to a new in-place upgrade option. And to go along with that, businesses will be able to choose whether to jump on the fast update consumer track, or lock down the updates to only deliver critical security patches, or somewhere in the middle. And this approach does not need to be at the enterprise level – different groups of machines can follow different update patterns depending on how critical the infrastructure is.
Windows 10 also supports Mobile Device Management (MDM) tools, as well as the traditional Active Directory and System Center approach to device management. This should be a boon to any small to mid-sized business that does not want to invest in a comprehensive solution.
Finally, the new Windows Store will allow volume license purchasing from within the store. Companies can reclaim licenses and reissue them to new devices. They can also create a custom store for their own computers which can include Windows Store and company-owned apps in the same interface.
Microsoft is trying hard to win back the Enterprise customers who have been turned off by Windows 8. Obviously we will have to wait and see if they are successful, but there is a lot to like in this new release. The “one UI to rule them all” model of Windows 8 has been put out to pasture, and instead replaced with a single platform, with a UI to suit the device it is running on.
Not all was bad about Windows 8, and it is good to see that some of its good ideas have been carried into the new OS, tweaked at the same time to work better for the device they are on. The Universal App is a powerful idea, and one that has yet to make a big splash, but if the WinRT framework can be updated to make it more capable, it would certainly add a lot to Windows 10. Unlike Win32, WinRT apps support high DPI by default, which matters more and more as we move to higher resolution displays on devices of all sizes. The ability to log in to any Windows PC and get your own custom look and feel, including all of your applications and data, is a compelling prospect. Microsoft has all of the tools they need to do this across all devices now, and it is exciting to see a glimpse of what the future may hold.
Once we get the actual install files for Windows 10, we will be able to provide more coverage on this major release of Windows.