Gravitational wave evidence disappears into dust
Earlier this year, researchers using a telescope based at the South Pole called BICEP announced that they had obtained evidence of gravitational waves caused by the Big Bang itself. The results would provide direct evidence that a model of the Universe's origin called inflation had left its mark on the present-day Universe.
But in reporting on the results, our own Matthew Francis suggested that the discovery was not as definitive as it might be, writing "the story of BICEP2, inflation, and primordial gravitational radiation is just beginning." And since then, it became clear that there was a complicating factor—dusty material in our own galaxy—and that the BICEP team's way of controlling for it left a little something to be desired (it involved using processed data obtained from a PDF used in a conference presentation).
Yesterday, the team that put the PDF together in the first place released its own analysis. And they've determined that BICEP was probably staring at dust, rather than the earliest moments of the Universe.
Google will stop supporting climate change science deniers, calls them liars
Google Executive Chairman Eric Schmidt today said it was a “mistake” to support the American Legislative Exchange Council (ALEC), a group that has said human-created climate change could be “beneficial” and opposes environmental regulations. Schmidt said groups trying to cast doubt on climate change science are "just literally lying."
Google’s membership in ALEC has been criticized because of the group’s stance on climate change and its opposition to network neutrality rules and municipal broadband. Earlier this month, Google refused to comment after 50 advocacy groups called on the company to end its affiliation with ALEC.
That changed today when Schmidt appeared on The Diane Rehm Show and was asked by a listener whether Google is still supporting ALEC. The listener described ALEC as “lobbyists in DC that are funding climate change deniers.”
Reports say Apple may bring changes to Beats Music streaming service
Dueling reports from TechCrunch and Recode Monday suggest that Apple is likely to bring changes to Beats Music, the streaming music service owned by Beats Audio. TechCrunch cites five anonymous sources from both Beats and Apple as saying that Apple plans to shut the service down, but Recode cites Apple spokesperson Tom Neumayr as saying TechCrunch's report is "not true," and that while the brand may fade away, Apple will keep it around and "modify it over time."
Apple originally acquired Beats Audio back in May, and the company has remained fairly quiet about the future it sees for the company's various components. Beats Music was mentioned exactly once during Apple's September iPhone and Apple Watch event, and a Beats Music app was notably absent from the Apple Watch interface that debuted there.
TechCrunch writes that Beats Music CEO Ian Rogers was put in charge of iTunes Radio back in August and has been "splitting his time" between the two services. Apple does not currently offer an à la carte streaming service like Beats Music; one possible course of action would be to roll the service's functionality into iTunes to make it more directly competitive with Spotify and Rdio. Recode's story fits with this approach a bit more: the site writes that "Apple won't shutter the streaming service," but may change it, and one of those changes may be to alter its branding.
Google stops malicious advertising campaign that could have reached millions
Google shut down malicious Web attacks coming from a compromised advertising network on Friday. The move follows a security firm's analysis that found the ad platform, Zedo, serving up advertisements that attempted to infect the computers of visitors to major websites.
In an attack that ended early Friday morning, visitors to Last.fm, The Times of Israel, and The Jerusalem Post ran the risk of their computers becoming infected as Zedo redirected visitors' systems to malicious servers. Because the advertisements hosted on Zedo's servers were distributed through Google's Doubleclick, the attack reached millions of potential victims, Jerome Segura, senior security researcher at Malwarebytes Labs, told Ars.
Distributing malware through legitimate advertising networks, a technique known as "malvertising," has become an increasingly popular way to compromise the systems of consumers and workers alike.
PSA: PlayStation TV launching October 14 in US for $99
Sony's latest gaming hardware product, the PlayStation TV, received a release date announcement on Monday, months after its official E3 unveiling this summer. Starting October 14, fans in the US will be able to buy the hardware by itself for $99 or purchase a "bundle" that includes a DualShock 3 controller, an 8GB memory card, and a free copy of The LEGO Movie Videogame for $40 more.
The PlayStation TV, which launched in Japan nearly one year ago as the Vita TV, essentially doubles as a Vita system that plugs into your HDTV. Just like the Vita, the system can play both physical and downloaded Vita games (along with PS1 titles and other fare sold via PlayStation Network). It can also serve as a PlayStation 4 streaming device via the Remote Play feature, and it supports the PlayStation Now streaming service, which serves PlayStation 3 games to your hardware by way of the cloud.
One of PSTV's limitations is its lack of touch, microphone, and motion support for Vita games, since neither the DualShock 3 nor the DualShock 4 perfectly emulates the Vita's features. That means Vita-exclusive standouts like Tearaway are essentially unplayable, and other games' touch support requires some awkward joystick-clicking-and-aiming moves to emulate the Vita's taps. And while Vita games are upscaled to 720p resolution on the PlayStation TV, so are Vita video apps like Netflix and Hulu Plus, which limits them compared to the 1080p set-top competition.
Comcast to FCC: We already face enough competition, so let us buy TWC
Federal Communications Commission Chairman Tom Wheeler has made it clear he thinks there isn’t enough broadband competition in America, but Comcast is trying to convince the FCC that it faces enough competition right now. Already the largest pay-TV and broadband company in the US, Comcast is seeking permission to buy Time Warner Cable.
Comcast and Time Warner Cable don’t compete for customers in any city or town, despite being the nation’s two largest cable companies, which helps explain why US residents have so few viable options for cable and high-speed Internet service. But in response to merger-related questions from the FCC, a Comcast filing points to a broad range of competitors and says it’s easy to switch to a different provider (though a horde of angry customers might disagree).
Comcast said it faces competition from municipal broadband networks, though the telecom industry has pushed state governments to pass laws that restrict municipal broadband growth. Wheeler has said he will try to preempt those state laws, saying they prevent competition.
Weight loss firm demands $1 million from website hosting negative reviews
A Florida company selling an obesity product is suing a consumer website for hosting negative reviews of its dietary product. Roca Labs wants the US courts to award it in "excess" of $1 million in addition to blocking pissedconsumer.com from continuing the practice.
The lawyer for the New York-based online review site told Ars on Monday that the lawsuit [PDF] was "bunk," that its demands amount to a prior restraint of speech, and that the site itself is protected from defamation charges under the Communications Decency Act because it hosts the online review forum for others to use.
"Essentially, what they are saying, is my client is defaming them by allowing these negative reviews to be published. And that my client is engaged in tortious interference with their relationships with their customers, and that my client is practicing unfair and trade-deceptive practices," said attorney Marc Randazza in a telephone interview.
Home Depot’s former security architect had history of techno-sabotage
When Home Depot suffered a breach of transaction data that exposed as many as 52 million credit card transactions earlier this year, the company reportedly suffered from lax computer and network security measures for years. Apparently, the company wasn’t helped much by its selection of a security architect either. Ricky Joe Mitchell was hired by Home Depot in 2012, and in March of 2013, he was promoted to the position of Senior Architect for IT Security at Home Depot, in charge of the entire company’s security architecture. In May of 2014, Mitchell was convicted of sabotaging the network of his former employer.
When Mitchell learned he was going to be fired in June of 2012 from the oil and gas company EnerVest Operating, he “remotely accessed EnerVest’s computer systems and reset the company’s network servers to factory settings, essentially eliminating access to all the company’s data and applications for its eastern United States operations,” a Department of Justice spokesperson wrote in a release on his conviction. “Before his access to EnerVest’s offices could be terminated, Mitchell entered the office after business hours, disconnected critical pieces of…network equipment, and disabled the equipment’s cooling system.” As a result of his actions, the company permanently lost some of its data and spent hundreds of thousands of dollars repairing equipment and recovering historical data. It took a month to bring the company’s office back online, costing the company as much as $1 million in lost business.
And that wasn’t the first time he used technology for revenge. Mitchell’s previous legal troubles resulting from malicious use of his technical skills date back to when he was a high school junior. In 1996, at the age of 17, Mitchell—who then went by the handle “RickDogg” in online forums—planted viruses in his high school’s computer system. He was suspended for three days from Capital High School for planting 108 computer viruses “to disk space… assigned to another student on the Capital High School computer system,” according to a school district memo obtained by the Charleston Gazette. He then posted threats to students whom he blamed for reporting him. Mitchell was expelled from the school and sued to be reinstated. The case eventually went to the West Virginia Supreme Court.
Eyes-on: Oculus’ Crescent Bay prototype is a new high-water mark
Regular readers are probably tired of hearing us say that the latest hardware demonstration from Oculus is a new high-water mark in virtual reality that finally does away with a lot of the problems holding the technology back. To those readers, I apologize in advance: the new Crescent Bay prototype Oculus announced and showed off at its first-ever developer conference in Hollywood this weekend is a new high-water mark in virtual reality that finally does away with a lot of the problems holding the technology back.
I tried on the new device for two 10-minute demo sessions at the conference, each time going through the same set of 10 pre-made demo experiences. As soon as I put it on (or rather had it put on me; we were barely allowed to touch the fragile prototypes for fear of breaking them), I noticed a significant jump in comfort from previous Rift development kits and prototypes. Those old devices have all been akin to ski goggles, with thick elastic bands in the rear pressing the display box tightly around the eyes. It was a design decision that put a lot of pressure on some sensitive facial areas, and it left this user a sweaty, red-faced mess after every use.
The new design replaces tight elastic with a much more comfortable rigid plastic.

The Crescent Bay prototype does away with this issue. Instead of an elastic band, there's now a rigid plastic support that goes over the ears and dips down to join at a thick, triangular rear support, which tucks around the nape of the neck and back of the skull. (The single threaded wire connecting the Rift to the computer now slides down the right side of this plastic support, which is much more comfortable than the over-the-middle-of-the-skull solution on previous dev kits.) This plastic band slides in and out of the main unit quite easily to adjust for differently sized heads, while a small velcro strip comes over the top of the skull for additional support.
Red light camera firm took cops out for meals, then they recommended firm
According to a new report in the Sacramento Bee, Sacramento County sheriff’s deputies and California Highway Patrol officers accepted "at least 250 meals worth $3,800 over a five-year period" paid for by the embattled red light camera (RLC) vendor Redflex.
Then, those law enforcement agencies recommended that the Northern California county renew Redflex’s contract for the county’s RLC system late last year. Five out of the eight members of the law enforcement evaluation team received those free meals, the newspaper reported.
Once informed of the meals, the Sheriff's Office top brass was not happy.
The Samsung Galaxy Alpha is coming to US as an AT&T exclusive
An atypically metallic Samsung phone, the Galaxy Alpha, will be headed to US shores in just a few days. The Alpha is due out September 26, but the bad news is that it will be an AT&T exclusive. It's priced at $199.99 on a two-year contract or $612.99 with no contract.
The Alpha is noteworthy for being the first Samsung phone we've seen in a long time that uses a metal casing. While the back is still plastic, the frame around the device isn't. Samsung has started to experiment with metal more often of late, hoping to ditch the "cheap" feeling that many people get from their smartphones. Besides the Galaxy Alpha, the Note 4 also has a metal frame.
The spec sheet is also out of character for Samsung, which usually makes big phones with top-tier specs. For the Galaxy Alpha, though, it seems like the company read up on iPhone 6 rumors and built a phone to those specs—the Alpha only has a 4.7-inch 1280x720 AMOLED display.
Google nixes G+ requirement for Gmail accounts
The grand unbundling of Google’s G+ social network continues, with Gmail becoming the latest Google service to gain its independence from Google’s campaign of forced integration. As noted in a post on the WordStream Blog, Google has axed the requirement that new Gmail accounts be tied to a G+ social networking account as of "early September."
For some users, mandatory G+ integration has been an unwanted burden placed on Google’s popular services since 2012, when Google began to link G+ account creation to the sign-up process for most of its services (journalists even had to create G+ profiles in order for their names and pictures to appear next to their own stories in Google News search results). The forced integration became so prevalent—and so onerous—that when Google acquired smart thermostat manufacturer Nest, the running joke was that soon a G+ account would be required to change the temperature in your home.
It's not clear whether the move brought benefits to anyone but Google. Bolting G+ onto YouTube, for example, proved to be a colossal mess yet resulted in no measurable improvement to the cesspool of comments beneath most videos.
iPhone 6 first weekend beats last year’s iPhone 5 sales, sets record
For the third September in a row, people around the United States have queued up (physically and virtually) to buy the latest Apple iPhone—and for the third September in a row, Apple has sold a hell of a lot of iDevices. A report this morning from The New York Times indicates that the Cupertino company has delivered more than 10 million shiny new iPhones 6 and 6 Plus, a number that The Times characterizes as being at the high end of analyst expectations.
Last year, with the introduction of the iPhone 5S and 5C, Apple sold somewhere in the area of nine million devices (though there weren’t as many gold devices as some shoppers would have liked). This year, Apple’s decision to offer larger screens has drawn a large amount of consumer interest—and opening weekend sales that reflect this.
Perhaps not surprisingly, the 5.5" iPhone 6 Plus has been the most difficult device to locate, with projected fulfillment of online orders quickly slipping into October. Apple’s official press release addresses the high demand with this quote from CEO Tim Cook: "While our team managed the manufacturing ramp better than ever before, we could have sold many more iPhones with greater supply and we are working hard to fill orders as quickly as possible."
DisplayPort Alternate Mode for USB Type-C Announced - Video, Power, & Data All Over Type-C
Earlier this month the USB Implementers Forum announced the new USB Power Delivery 2.0 specification. Long awaited, the Power Delivery 2.0 specification defines new standards for power delivery to allow Type-C USB ports to supply devices with much greater amounts of power than the previous standard allowed, now up to 5A at 5V, 12V, and 20V, for a maximum power delivery of 100W. However, also buried in that specification was an interesting, if cryptic, announcement regarding USB Alternate Modes, which allow for different (non-USB) signals to be carried over the USB Type-C connector. At the time the specification merely theorized what protocols could be carried over Type-C as an alternate mode, but today we finally know what the first alternate mode will be: DisplayPort.
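The headline 100W figure falls straight out of the numbers above: 5A of current at the highest 20V rail. A quick sketch of the arithmetic (the variable names are our own, not from the specification):

```python
# Illustrative sketch of USB Power Delivery 2.0 maximums as described above.
# The spec allows up to 5 A of current at each of these voltage rails.
VOLTAGES_V = [5, 12, 20]
MAX_CURRENT_A = 5

for v in VOLTAGES_V:
    watts = v * MAX_CURRENT_A
    print(f"{v:>2} V x {MAX_CURRENT_A} A = {watts:>3} W")
# The 20 V rail at 5 A yields the maximum: 100 W.
```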
Today the VESA is announcing that they are publishing the “DisplayPort Alternate Mode on USB Type-C Connector Standard.” Working in conjunction with the USB-IF, the DP Alt Mode standard will allow standard USB Type-C connectors and cables to carry native DisplayPort signals. This is designed to open up a number of possibilities for connecting monitors, computers, docking stations, and other devices with DisplayPort video while also leveraging USB’s other data and power capabilities. With USB 3.1 and Type-C the USB-IF was looking to create a single cable that could carry everything, and now that DisplayPort can be muxed over Type-C, USB is one step closer to that with the ability to carry native video.
The Tech & The Spec

From a technical level the DP Alt Mode specification is actually rather simple. USB Type-C (which immediately implies using/supporting USB 3.1 signaling) uses 4 lanes (pairs) of differential signaling for USB Superspeed data, which are split up in a 2-up/2-down configuration for full duplex communication. Through the Alt Mode specification, DP Alt Mode will then in turn be allowed to take over some of these lanes – one, two, or all four – and run DisplayPort signaling over them in place of USB Superspeed signaling. By doing so a Type-C cable is then able to carry native DisplayPort video alongside its other signals, and from a hardware standpoint this is little different than a native DisplayPort connector/cable pair.
From a hardware perspective this will be a simple mux. USB alternate modes do not encapsulate other protocols (ala Thunderbolt) but instead allocate lanes to those other signals as necessary, with muxes at either end handling the switching to determine what signals are on what lanes and where they need to come from or go. Internally USB handles this matter via the CC sense pins, which are responsible for determining cable orientation. Alongside determining orientation, these pins will also transmit a Standard IDentification (SID), which will be how devices negotiate which signals are supported and which signals to use. After negotiation, the devices at either end can then configure themselves to the appropriate number of lanes and pin orientation.
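The mux's job described above is essentially lane bookkeeping. As a rough mental model (the function and field names here are illustrative, not from the standard), once negotiation over the CC pins settles on a DisplayPort lane count, the remaining high-speed pairs stay on USB Superspeed:

```python
# Illustrative model of DP Alt Mode lane allocation over a Type-C cable.
# Type-C carries 4 high-speed differential pairs; DisplayPort may take
# over 1, 2, or all 4 of them, leaving the rest for USB Superspeed.
TOTAL_LANES = 4

def allocate_lanes(dp_lanes: int) -> dict:
    """Return a lane map for a negotiated DisplayPort lane count."""
    if dp_lanes not in (0, 1, 2, 4):
        raise ValueError("DisplayPort runs on 1, 2, or 4 lanes (or none)")
    return {
        "displayport": dp_lanes,
        "usb_superspeed": TOTAL_LANES - dp_lanes,
        # USB 2.0 and power use separate pins and always coexist with alt mode.
        "usb2_and_power": True,
    }

print(allocate_lanes(2))  # half the pairs carry video, half carry USB 3.1 data
```

With `dp_lanes=4` the cable behaves like a native DisplayPort link (plus USB 2.0 and power); with `dp_lanes=0` it is an ordinary USB 3.1 cable.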
Along with utilizing USB lanes for DP lanes, the DP Alt Mode standard also includes provisions for reconfiguring the Type-C secondary bus (SBU) to carry the DisplayPort AUX channel. This half-duplex channel is normally used by DisplayPort devices to carry additional non-video data such as audio, EDID, HDCP, touchscreen data, MST topology data, and more. Somewhat perversely in this case, the AUX channel has even been used to carry USB data, which dutifully enough would still be supported here for backwards compatibility purposes.
Since the main DisplayPort lanes and AUX channel can be carried over Type-C, when utilized in this fashion Type-C is very close to becoming a superset of DisplayPort. In a full (4 lane) DisplayPort configuration, along with all of the regular DisplayPort features a Type-C cable also carries the standard USB 2.0 interface and USB power, which always coexist alongside alt mode. So even in these configurations Type-C allows dedicated high power and USB 2.0 functionality, something the DisplayPort physical layer itself is not capable of. And of course when using a less-than-full configuration, the remaining two or three lanes on the Type-C cable can be left running USB Superspeed signaling, allowing USB 3.1 data to be carried alongside the narrower DisplayPort signal.
Meanwhile since DP Alt Mode means that Type-C carries native DisplayPort signaling, this enables several different interoperability options with other Type-C devices and legacy DisplayPort devices. On the hardware side Type-C ports can be used for the sink (displays) as well as the source (computers), so one could have a display connected to a source entirely over Type-C. Otherwise simple Type-C to DisplayPort cables can be constructed which from the perspective of a DisplayPort sink would be identical to a real DisplayPort cable, with the cable wired to expose just the DisplayPort signals to the sink. Or since these cables will be bidirectional, a legacy DisplayPort source could be connected to a Type-C sink just as well.
This also means that, since DP Alt Mode is such a complete implementation of DisplayPort, DisplayPort conversion devices will work as well. DisplayPort to VGA, DVI, and even HDMI 2.0 adapters will all work at the end of a Type-C connection, and the VESA will be strongly encouraging cable makers to develop Type-C to HDMI 2.0 cables (and only HDMI 2.0, no 1.4) to make Type-C ports usable with HDMI devices. In fact the only major DisplayPort feature that won’t work over a Type-C connector is Dual-Mode DisplayPort (aka DP++), which is responsible for enabling passive DisplayPort adapters. So while adapters work over Type-C, all of them will need to be active adapters.
From a cabling standpoint DP Alt Mode will have similar allowances and limitations as USB over Type-C, since it inherits the physical layer. DisplayPort 1.3’s HBR3 mode will be supported, but like USB’s Superspeed+ (10Gbps) mode it is officially only specified to work on cables up to 1M in length; at up to 2M, DisplayPort 1.2’s HBR2 mode can be used instead. Meanwhile DP Alt Mode is currently only defined to work on passive USB cables, with the VESA seemingly picking their words carefully on the use of “currently.”
The Ecosystem & The Future

Because of the flexibility offered through DP Alt Mode, the VESA and USB-IF have a wide range of options and ideas for how to make use of this functionality, with these ideas ultimately converging on a USB/DisplayPort ecosystem. With the ability to carry video data over USB, this allows for devices that make use of both in a fashion similar to Thunderbolt or DockPort, but with the greater advantage of the closer cooperation of the USB-IF and the superior Type-C physical layer.
At its most basic level, DP Alt Mode means that device manufacturers would no longer need to put dedicated display ports (whether DisplayPort, VGA, or HDMI) on their devices, and could instead fill out their devices entirely with USB ports for all digital I/O. This would be a massive boon to Ultrabooks and tablets, where the former only has a limited amount of space for ports and the latter frequently only has one port at all. To that end there will even be a forthcoming identification mark (similar to DP++) that will be used to identify Type-C ports that are DP Alt Mode capable, to help consumers identify which ports they can plug their displays into. The MUX concept is rather simple for hardware but I do get the impression that devices with multiple Type-C ports will only enable it on a fraction of their ports, hence the need for a logo for consumers to identify these ports. But we’ll have to see what shipping devices are like.
More broadly, this could be used to enable single-cable connectivity for laptops and tablets, with a single Type-C cable providing power to the laptop/tablet while also carrying input, audio, video, additional USB data, and more. This would be very similar to the Thunderbolt Display concept, except Type-C would be able to be a true single cable solution since it can carry the high-wattage power that Thunderbolt can’t. And since Type-C can carry DisplayPort 1.3 HBR3, this means that even when driving a 4K@60Hz display there will still be 2 lanes of USB Superspeed+ available for any devices attached to the display. More likely though we’ll see this concept first rolled out in dock form, with a single dock device connecting to an external monitor and otherwise serving as the power/data hub for the entire setup.
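The 4K@60Hz claim above is easy to sanity-check with back-of-the-envelope math. HBR3 runs at 8.1Gbps per lane, and DisplayPort's 8b/10b line coding leaves 80% of that for payload; uncompressed 4K at 60Hz and 24 bits per pixel needs roughly 12Gbps for the active pixels alone. A sketch (figures are approximate and ignore blanking overhead, which adds a little more):

```python
# Back-of-the-envelope check: does 4K@60 fit in 2 HBR3 lanes?
LANE_RATE_GBPS = 8.1        # DisplayPort 1.3 HBR3, raw rate per lane
ENCODING_EFFICIENCY = 0.8   # 8b/10b line coding: 8 payload bits per 10 sent
lanes = 2                   # leaving the other 2 Type-C pairs for USB

effective = lanes * LANE_RATE_GBPS * ENCODING_EFFICIENCY  # usable Gbps

# 3840x2160 @ 60 Hz, 24 bits per pixel, active pixels only
payload = 3840 * 2160 * 60 * 24 / 1e9

print(f"2-lane HBR3 usable bandwidth: {effective:.2f} Gbps")
print(f"4K60 24bpp active-pixel rate: {payload:.2f} Gbps")
# The payload fits, so the remaining 2 pairs can stay on USB Superspeed+.
```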
Speaking of which, this does mean that USB via DP Alt Mode will more directly be competing with other standards such as Thunderbolt and DockPort. Thunderbolt development will of course be an ongoing project for Intel, however for DockPort this is basically the end of the road. The standard, originally developed by AMD and TI before being adopted by the VESA, will continue on as-is and will continue to be supported over the DisplayPort physical layer as before. However it’s clear from today’s announcement that DisplayPort over USB has beaten USB over DisplayPort as the preferred multi-signal cabling solution, leaving DockPort with a limited duration on the market.
It’s interesting to note though that part of the reason DP Alt Mode is happening – and why it’s going to surpass DockPort – is because of the Type-C physical layer. In designing the Type-C connector and cabling, the USB-IF has specific intentions of having the Type-C connector live for a decade or more, just like USB Type-A/B before it. That means they’ve done quite a bit of work to future-proof the connector, including plenty of pins with an eye on supporting speeds greater than 10Gbps in the future.
For that reason the possibility is on the table of ditching the DisplayPort physical layer entirely and relying solely on Type-C. Now to be clear this is just an option the technology enables, but for a number of reasons it would be an attractive option for the VESA. As it stands the DisplayPort physical layer tops out at 8.1Gbps per lane for HBR3, while Superspeed+ over Type-C tops out at 10Gbps per lane with the design goal of further bandwidth increases. As the complexity and development costs of higher external buses go up, one could very well see the day where DisplayPort is merely the protocol and signaling standard for monitors while Type-C is the physical layer, especially since DisplayPort and USB Superspeed are so very similar in the first place due to both using 4 lanes of differential signaling. But this is a more distant possibility; for now the DP Alt Mode ecosystem needs to take off for the kinds of mobile devices it’s designed for, and only then would anyone be thinking about replacing the DisplayPort physical layer entirely.
Wrapping things up, the VESA tells us that they are going to hit the ground running on DP Alt Mode and are seeing quite a bit of excitement from manufacturers. The VESA is expecting the first DP Alt Mode capable devices to appear in 2015, which is the same year Type-C ports begin appearing on devices as well. So if everything goes according to schedule, we should see the first DP Alt Mode devices in just over a year.
The all-in-one cable concept has been a long time coming, and after the stumbles of DockPort and Thunderbolt the market does look ripe for DP Alt Mode, so long as the execution is there, manufacturers are willing to use it, and device compatibility lives up to the promises. Getting video over USB is the ultimate Trojan horse: unlike mDP, USB is already everywhere and will continue to be, so this may very well be the X factor needed to see widespread adoption where other standards have struggled.
Corsair Gaming K70 RGB Mechanical Keyboard Review
Today is the dawn of a new era for Corsair, as the company has multiple announcements. Corsair is establishing their own gaming brand, announcing the availability of the new RGB keyboards and mice, and they're also releasing a new software engine for their input devices. We're focusing mostly on the new RGB keyboards; Corsair is dropping the "Vengeance" series name, and the new keyboards simply use the brand name and model. So how does the newly christened Corsair Gaming K70 RGB keyboard fare? This keyboard probably had more hype than any other keyboard in history, so let's find out if it can live up to expectations in our full review.
MediaTek Labs and LinkIt Platform Launch Targeting IoT and Wearables
Companies such as Motorola, Apple, Nest, and Fitbit have been targeting the Internet of Things (IoT) and wearables market with devices for the past several years. However, if the smartphone revolution was any indication, we are merely at the tip of the iceberg for these devices. Even Apple acknowledged as much by naming the processor inside the Apple Watch the “S1”, clearly planning for future revisions.
Today, hoping to capitalize on this next wave of technology proliferation, MediaTek is formally launching their Labs program for IoT and wearables. This is one of many announcements we will see over the next year as companies look to enter this market.
MediaTek Labs' goal is to be a central hub for developers to collaborate on everything from side-projects to big business device production. With Labs, MediaTek provides software and hardware development kits (SDKs and HDKs), technical documentation, example code, and discussion forums. MediaTek was a late entry into the smart phone market in 2009/2010 but has since exploded in popularity largely due to very complete reference designs and aggressive pricing. MediaTek aims to reproduce this success, only earlier, for the IoT and wearables space.
When discussing hardware, it’s important to keep in mind that there are actually several sub-markets. I’ve reproduced a slide and table from MediaTek that does a decent job laying out the differences.
MediaTek's IoT and Wearables Market Segments

| | One Application Use (OAU) | Simple Application Use (SAU) | Rich Application Use (RAU) |
|---|---|---|---|
| Examples | Fitness tracker, health tracker, simple Bluetooth | Smart watch, child/elderly safety | High-end smart watch, smart glasses |
| Hardware | MCU (<100 MHz), Bluetooth, sensor | MCU (100-300 MHz), Bluetooth, sensors, TFT display, GSM/GPRS, GPS, Wi-Fi | AP (>1 GHz, multi-core), Bluetooth, sensors, see-through display, GSM/GPRS, GPS, Wi-Fi |
| OS | None | Mostly RTOS | Mostly Linux |
| Price point | Lowest | Middle | Highest |
| Battery life | Long (>7 days) | Medium (5-7 days) | Short (2-3 days) |
| Characteristics | Limited computing power, focusing on one task (such as sports, health, find device); mostly no display or a very simple LED display | May have multiple functions and can update apps; also needs outdoor/indoor positioning; focus for MediaTek LinkIt and Aster (MT2502) chipset | Multiple apps and functions; sophisticated UI with more powerful graphics and multimedia features |
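The battery-life rows in the table follow directly from average power draw: divide battery energy by average power and you get runtime. As a rough sketch (every figure below is an illustrative assumption of mine, not a MediaTek number):

```python
# Why an OAU device can last >7 days while an RAU device manages 2-3:
# battery life ~= battery energy / average power draw.
# All capacities, voltages, and power figures here are illustrative
# assumptions, not MediaTek or vendor specs.
def battery_life_days(battery_mah, nominal_volts, avg_power_mw):
    energy_mwh = battery_mah * nominal_volts   # energy in milliwatt-hours
    return energy_mwh / avg_power_mw / 24      # hours -> days

# e.g. a small tracker duty-cycling an MCU vs. a smart watch driving an AP:
print(f"MCU tracker: {battery_life_days(150, 3.7, 3):.1f} days")   # ~7.7 days
print(f"AP watch:    {battery_life_days(300, 3.7, 18):.1f} days")  # ~2.6 days
```

Even with double the battery, the AP-class device lands squarely in the table's "short" bracket, because average power dominates.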
One thing I do not like about this table is that it implies these markets are mutually exclusive. While there are indeed hardware and software differences between the sub-markets, with low enough sleep power and smart enough software a single device could contain both a high-performance applications processor (AP) and a low-power microcontroller (MCU). In fact, that’s exactly what Intel’s Edison platform and many smartphones, such as the Moto X, do. Nevertheless, hybrid devices are certainly more complicated, and there is plenty of success to be had focusing on a single task.
For example, the popular Pebble smart watch and Nest thermostat each contain a simple MCU with no high-performance AP. This is exactly what MediaTek is targeting with its first platform release on Labs: LinkIt. LinkIt refers to MediaTek’s new MCU operating system, which is launching alongside a new SoC named Aster (MT2502). Additionally, a hardware development kit from partner Seeed Studio is available through Labs, as well as a software development kit to aid firmware development and help port existing Arduino code.
The core of this kit is of course the new Aster MT2502 SoC. MediaTek feels it is uniquely positioned with an SoC that contains an MCU, Power Management Unit (PMU), Memory, Bluetooth 4.0, and a GSM and GPRS Dual SIM modem (850/900/1800/1900MHz). The total size of the SoC is 5.4x6.2mm. If GPS/GLONASS/BEIDOU or WiFi b/g/n are desired, MediaTek provides compatible external ICs for each.
MediaTek Aster (MT2502) SoC

| Spec | Detail |
|---|---|
| Size | 5.4 mm x 6.2 mm |
| Package | 143-ball TFBGA, 0.4 mm pitch |
| CPU | ARM7EJ-S @ 260 MHz |
| Memory | 4 MB RAM, integrated |
| Storage | 4 MB flash, integrated |
| PAN | Dual Bluetooth 4.0 |
| WAN | GSM and GPRS dual-SIM modem |
| Power | PMU and charger functions; low-power mode and sensor hub function |
| Multimedia | AMR speech codec, HE-AAC music codec, integrated audio amplifier, JPEG encoder/decoder, MJPEG encoder/decoder, MPEG4 encoder/decoder |
| Interfaces | LCD, VGA camera, I2C, SPI, UART, GPIO, SDIO, USB 1.1, keypad, serial flash, JTAG, ADC, DAC, PWM, FM radio |
Developers eager to get their hands dirty can do so as of today for $79. The LinkIt ONE development board is available and shipping from Seeed Studio. This board combines the Aster MT2502A SoC, the MT5931 for Wi-Fi, the MT3332 for GPS, an audio codec, an SD card slot, many Arduino-like I/O interfaces, and Arduino shield compatibility.
It will be a while before we see non-prototype designs featuring LinkIt and Aster hit the market, but if MediaTek has its way that will be only the start. MediaTek plans to release more SDKs, HDKs, and chips through its Labs website and partners over the next few years. As of this writing, MediaTek has already posted a beta Android SDK and emulator targeting the higher-performance IoT and wearable devices. While I am not personally sure which additional smart devices I need in my life right now, that somehow makes me more excited about the future, not less.
iPhone 6 and iPhone 6 Plus: Preliminary Results
While we’re still working on the full review, I want to get out some preliminary results for the iPhone 6. For now, this means some basic performance data and battery life, which include browser benchmarks, game-type benchmarks, and our standard web browsing battery life test. There’s definitely a lot more to talk about for this phone, but this should give an idea of what to expect in the full review. To start, we'll look at the browser benchmarks, which can serve as a relatively useful proxy for CPU performance.
There are a few interesting observations here, as a great deal of the scaling is above what one would expect from the minor frequency bump between the A7 and A8. In SunSpider, we see about a 13% increase in performance that can't be explained by frequency increases alone. For Kraken, this change is around 7.5%, and we see a similar trend across the board for the rest of these tests. This points towards a broadly similar underlying architecture, although it's still too early to tell how much has changed between the A7 and A8 CPU architectures. Next, we'll look at GPU performance in 3DMark and GFXBench, although we're still working on figuring out the exact GPU in A8.
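To make the frequency argument concrete: the A7 and A8 are widely reported to run at roughly 1.3 GHz and 1.4 GHz respectively (treat those clocks as assumptions; Apple does not publish them), so frequency alone buys about 7.7%. A quick sketch of the arithmetic for separating frequency gains from per-clock gains:

```python
# Split an observed benchmark speedup into a frequency component and a
# per-clock (IPC) component. Clock speeds are widely reported values for
# Apple A7 (iPhone 5s) and A8 (iPhone 6) -- assumptions, not official specs.
a7_clock_ghz = 1.3
a8_clock_ghz = 1.4

freq_speedup = a8_clock_ghz / a7_clock_ghz   # ~1.077, i.e. ~7.7% from clock

def ipc_gain(observed_speedup):
    """Residual speedup once the clock bump is factored out."""
    return observed_speedup / freq_speedup

# e.g. a hypothetical 21% overall SunSpider gain leaves ~12.4% that frequency
# alone cannot explain, pointing at architectural improvements:
print(f"frequency alone: {(freq_speedup - 1) * 100:.1f}%")          # 7.7%
print(f"per-clock share of a 21% gain: {(ipc_gain(1.21) - 1) * 100:.1f}%")
```

The 21% input is purely illustrative; the point is that dividing out the clock ratio, rather than subtracting percentages, is the right way to isolate per-clock gains.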
In GPU benchmarks, we generally see a solid lead over the competition for the iPhone 6/A8. It seems quite clear that GPU performance in the iPhone 6 Plus takes a significant hit due to the 2208x1242 resolution at which all content is rendered. This seems necessary, though, as the rendering system for iOS cannot easily adapt to arbitrary resolutions and display sizes. Before we wrap up this article, I definitely need to address battery life. As with all of our battery life tests, we standardize on 200 nits and ensure that the workload in our web browsing test spends a reasonable amount of time in all power states of an SoC.
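The rendering-resolution hit can be quantified with a simple pixel count, since GPU fill load scales roughly with pixels rendered per frame. The 2208x1242 figure is from the discussion above; the 1920x1080 panel of the 6 Plus and the 1334x750 native resolution of the iPhone 6 are assumptions based on Apple's published display specs:

```python
# GPU fill load scales roughly with rendered pixel count. The iPhone 6 Plus
# renders at 2208x1242 and downsamples to its 1920x1080 panel; the iPhone 6
# renders at its native 1334x750. Panel/native resolutions are assumptions
# taken from Apple's published specs, not from the benchmark data itself.
plus_render = 2208 * 1242   # 2,742,336 pixels per frame
plus_panel  = 1920 * 1080   # 2,073,600 pixels on the physical display
six_render  = 1334 * 750    # 1,000,500 pixels per frame

print(f"6 Plus renders {plus_render / six_render:.2f}x the pixels of the 6")  # 2.74x
print(f"...and {plus_render / plus_panel:.2f}x its own panel resolution")     # 1.32x
```

Pushing nearly 2.75x the pixels of the iPhone 6 on the same A8 GPU is more than enough to explain the onscreen benchmark gap between the two phones.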
As one can see, Apple has managed something quite impressive with battery life. Normally an 1810 mAh battery with a 3.82 V nominal voltage would be a poor performer, but the iPhone 6 is a step above just about every Android smartphone on the market. The iPhone 6 Plus also has a strong showing, although it doesn't deliver the outrageous battery life of the Ascend Mate 2. That's it for now, but the full review should be coming in the near future.
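A quick way to put the 1810 mAh figure in context is to convert it to watt-hours (capacity times nominal voltage), which is the number that actually matters for runtime. The iPhone 6 figures are from the text above; the comparison point is an illustrative "large Android flagship" assumption, not any specific device's spec:

```python
# Nominal battery energy = capacity (Ah) x nominal voltage (V).
# The 1810 mAh / 3.82 V iPhone 6 figures are from the article; the
# 3000 mAh / 3.8 V comparison is an illustrative flagship-class assumption.
def watt_hours(capacity_mah, nominal_volts):
    return capacity_mah / 1000 * nominal_volts

iphone6_wh = watt_hours(1810, 3.82)      # ~6.91 Wh
big_flagship_wh = watt_hours(3000, 3.8)  # ~11.4 Wh

print(f"iPhone 6: {iphone6_wh:.2f} Wh vs. hypothetical flagship: {big_flagship_wh:.2f} Wh")
```

Matching or beating phones carrying roughly 65% more energy is what makes the iPhone 6's result notable; the efficiency has to come from the SoC, display, and software rather than the battery.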