Snapdragon-powered Windows 8 PCs a reality for 2012 according to Qualcomm
With Windows 8, Microsoft hopes to blur the lines between PC and tablet. Qualcomm’s chips tend to be not as powerful as some of the competitors, but are able to achieve superior energy efficiency and combine an application processor with a cellular radio, which allows devices to remain always connected. Qualcomm believes those features make its chips a great platform for Windows 8 devices.
Because most previously released software has been written for an x86 architecture, critics are quick to point out that older programs, say from Windows 7 or XP, will not run on a Snapdragon-powered PC. Steve Mollenkopf, Qualcomm’s Chief Operating Officer, believes that moving forward, key applications would be re-written to operate on ARM and many programs run in a cloud environment and are accessible via an Internet browser.
He also makes a case that the iPhone and iPad both run on ARM-based chips and have achieved industry success even though they are unable to run legacy software. "For the apps that you really care about, I don't see it as a significant growth inhibitor in terms of ARM vs. Windows," he said. "I don't think the impact is as significant as what others believe."
So how soon could this be a reality? Qualcomm CEO Paul Jacobs predicts the last quarter of 2012. While Qualcomm faces a number of formidable opponents in this sector, including Nvidia, Texas Instruments, and Intel, Jacobs is confident about entering the PC market. He said, “We're comfortable with our technology portfolio. It's mainly because we've been focused on it longer and have more people working on it than anyone else in the industry.”
source: CNET via BGR
1. remixfa (Posts: 14251; Member since: 19 Dec 2008)
"Qualcomm’s chips tend to be not as powerful as some of the competitors, but are able to achieve superior energy efficiency"
no offence, but snapdragon (s1, s2, s3) chips are the WORST at power efficiency... by a decent margin.
unless by "best", you mean "make best use of the wall charger" :)
2. Penny (Posts: 1340; Member since: 04 Feb 2011)
Not going to comment on their previous chips, because I honestly don't know that much, but the upcoming S4 chips are built on the 28nm process, so that should have a significant positive impact on battery life. Also, the S4 chip will have integrated LTE, which should reduce the device's power consumption even further.
The S4 is supposedly capable of running console and PC class games as well, with better physics and shader performance. Overall, it looks like it might be a worthwhile chip.
5. remixfa (Posts: 14251; Member since: 19 Dec 2008)
compared to the S3, absolutely. When you compare it to the competition, it doesn't actually seem like it's going to fare much better than the next iteration of the dual core Exynos, the 4212.
Samsung hasn't said too much about their upcoming quad core monster, but it would be safe to assume that their power gains from dual to quad should be about the same as Qualcomm's and Nvidia's as they make the same leap.. which would "possibly" leave Samsung even further ahead of the pack than they already are... especially since they are rumored to be going back to PowerVR (one can only hope).
just for example (all hypothetical of course), going off Quadrant scores, if they all got a doubling of power (which is what Nvidia claims the T3 is going to be):
the average tegra2 quadrant runs about 2300, so that means we should expect 4600 when the T3 is properly benched.
omap runs around 2400 so 4800
S3 runs around 2200 ... so 4400..
Exynos runs about 3300 so.. 6600..
IF they all go up at even rates (they won't), then the current power divide is going to get even worse. That's what worries me. They are all taking different approaches to the chips though, so who knows what is going to actually happen. Can't wait to find out though. :)
You have to forgive me, my geekiness takes over some times. :)
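The doubling arithmetic above can be sketched in a few lines of Python. This is purely illustrative: the baseline Quadrant scores and the uniform 2x scaling assumption come straight from the comment, not from any real benchmark.

```python
# Hypothetical projection from the comment above: assume every next-gen
# chip exactly doubles its predecessor's Quadrant score (the 2x figure
# Nvidia claims for Tegra 3, applied uniformly as a simplification).
baseline = {
    "Tegra 2": 2300,
    "OMAP4": 2400,
    "Snapdragon S3": 2200,
    "Exynos 4210": 3300,
}

projected = {chip: score * 2 for chip, score in baseline.items()}

# Rank the projected next-gen scores, highest first.
for chip, score in sorted(projected.items(), key=lambda kv: -kv[1]):
    print(f"{chip:14} {score}")

# If everyone scales evenly, the absolute gap doubles too:
gap_now = baseline["Exynos 4210"] - baseline["Snapdragon S3"]
gap_next = projected["Exynos 4210"] - projected["Snapdragon S3"]
print(f"Exynos lead over S3: {gap_now} now, {gap_next} projected")
```

The point of the last two lines is the commenter's worry in a nutshell: uniform relative scaling preserves ratios but doubles absolute gaps, so the leader pulls further ahead in raw score.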
6. Penny (Posts: 1340; Member since: 04 Feb 2011)
Yeah, I have always heard great things about the Exynos processor, so I wouldn't doubt its superiority in performance if it kept up the trend.
But, at least according to Qualcomm, the initial S4 processors that will be released are going to be the DUAL CORE versions, which they claim will be competitive in performance with the quad core competition. They will release the quad core after that, and if their dual cores are competitive, the quad cores should excel.
Either way, I am not sure if the other manufacturers are doing this or not, but the inclusion of LTE on the same chip really should reduce power consumption, so Qualcomm might have an edge there, if not in performance.
I really don't spend much time or effort learning about mobile processors, so my knowledge is pretty much limited to the above, but I did get this information while watching an interview of Qualcomm's VP of Product Management:
8. remixfa (Posts: 14251; Member since: 19 Dec 2008)
:) im one of those geeky people that likes spec manuals n such. :)
the next gen dual core for the Krait (S4) should be competitive with the other next gen dual cores like the 4212 Exynos. their power improvement with the smaller die and all-in-one LTE solution will definitely help.
Its all speculation at this point until we get actual voltage draws and benchmarks though. :)
11. KingKurogiii (Posts: 5658; Member since: 23 Oct 2011)
forgive me remixfa, the King's a big fan but i just need to change some numbers there. i'd really rather we calculate with the most accurate scores possible here.
S3 - @1.2GHz: 2200-2300 @1.5GHz: 2500-2600
Tegra II - @1.0GHz: 2600-2700
OMAP4 - @1.0GHz: 2700-2800 @1.2GHz: 2900-3000 (haven't seen it yet but that's how it should be)
Exynos 4210 - @1.2GHz: 3500-3700
also look at this -> http://www.youtube.com/watch?annotation_id=annotation_47805&feature=iv&src_vid=_d1U1Hw0cCY&v=I-sUQUJT9wk
i was looking for a good Quadrant test on a Razr, but people don't really test them so many times consecutively like they're supposed to. i know that guy was annoying, but something interesting i noticed at 2:07, when it's doing the 3D test, is that the frame rates are MUCH higher than they are on other Motorola OMAP-powered devices. that's because with those, Motorola used their own drivers instead of TI's proprietary drivers, and they weren't so great, but here it looks like what you'd expect from a PowerVR SGX540 at 304MHz. will this affect the scores? it doesn't appear so here, but who knows where another test could have taken it. i need to get enough time one day to spend in a Verizon store putting one of these babies through its paces.
also here's a Bionic Quadrant test so you can see what i'm talking about, the 3D test starts at 0:42. http://www.youtube.com/watch?v
17. remixfa (Posts: 14251; Member since: 19 Dec 2008)
i was going with the lower numbers for all of them.. but we could use those.. that just makes the situation for Snapdragon even more dire than the numbers I used.
gonna look at the links, i promise. :) Just buzzin through this morning as i sip some coffee n get ready for clinicals. I'll check em out this evening sometime. :)
10. snowgator (Posts: 3336; Member since: 19 Jan 2011)
You know, I come on any chip related story just to see how you and the King-former-thump3r respond to them. I dig the geek stuff, but I can only add really, really important stuff like this:
OH YEAH??? WELL... SNAPDRAGON IS A WAYYYYY COOLER NAME!!!
Ahhhh ... feeling better now that I have added something....
12. KingKurogiii (Posts: 5658; Member since: 23 Oct 2011)
yeah, i remember hearing about it being in the HD2 and the Exynos and thinking "wow, this thing must be badass" xD
14. KingKurogiii (Posts: 5658; Member since: 23 Oct 2011)
not the Exynos i meant the Nexus One. xD
3. darac (Posts: 2156; Member since: 17 Oct 2011)
remixfa, you ever even used a snapdragon phone?
I use one (the S2 on my Xperia Arc.. the most used smartphone chipset in the world btw, the MSM8255) and I'm pretty happy with the battery. Better than the Tegra 2 on a friend's Optimus 2X, and on par with another friend's Galaxy S2.
4. remixfa (Posts: 14251; Member since: 19 Dec 2008)
darac, no offence, but you're really asking the wrong person that :)
Yes, I've owned a few.. and my wife still carries her S2 loaded G2.
most used in the world doesn't mean the best. I'm a chip/tech geek. i drool over this stuff, and thus i tend to champion whatever is best in the category and spurn the worst. And for single core and dual core, Snapdragons are the worst. it's not personal, we all have our quirks. :)
The S4 is supposed to bring a 75% improvement over S3 battery life... which is extremely significant. The S3 was supposed to be a good leap as well, but it wasn't. The biggest thing I am looking forward to when we get to the next generation (other than that fat increase in power) is the very reduced power consumption.
7. DroogV59 (Posts: 37; Member since: 02 Jun 2011)
One thing that several sources all say is that the (Tegra3) speed bump is basically due to Qualcomm’s Krait chip. People who have Kraits in-house tell SemiAccurate that the dual core version mops the floor with a 1.5GHz quad A9, something that a vanilla dual core A15 should not be able to do. The particular A9 variant in question wasn’t named outright, but there is only one quad A9 sampling right now.
9. remixfa (Posts: 14251; Member since: 19 Dec 2008)
T3 and Krait are not the same thing, not the same company, and most definitely not the same design of chip.
Krait doesn't even have a quad core in sampling yet. They haven't even gotten the dual core sampled, which comes 3-6 months in advance of the quad. The only one that has made it beyond paper speculation at this point is the Nvidia quint/quad core.
cores don't matter as much as other things, much like a well-done V6 can destroy a crappy V8. But we have to wait till everyone is off the paper stage and into live benchmarking before we can tell who goes where in the scheme of things, and that won't be until Q1-Q3 2012.
13. ilia1986 (unregistered)
"key applications would be re-written to operate on ARM and many programs run in a cloud environment and are accessible via an Internet browser."
I'm sorry but that's absolute BS. Say I own a Windows 7 Laptop. And I do. Why on earth would I want to purchase a Windows 8 non-Intel PC and have to repurchase every x86 Application I want just to have the same functionality again? And what about the zillions of freeware apps from independent developers? You know - the likes found on Download.com? These aren't going to be recoded again just for the sake of it any time soon!
Windows ≠ Android.
You want me to buy a Windows 8-based PC with an ARM chipset? A Windows 8 non-Intel tablet? Make sure that all the existing apps can run on it too. It's called backwards compatibility - something MS has been doing ever since Windows 95. I don't care how hard it is. I - as a consumer - will NOT buy a non-Intel Windows 8 device unless it's backwards compatible with all the x86 software.
16. remixfa (Posts: 14251; Member since: 19 Dec 2008)
I gotta agree with you there. Most of the allure of the Windows 8 platform is the marriage of the Metro UI and the Windows 7 desktop, so you get the touch slickness with Windows compatibility for programs. Otherwise, it's just another big phone-tablet.