Exynos-laden Galaxy Tab S 8.4 can't take the heat, owner complains
3. Ashoaib (Posts: 3229; Member since: 15 Nov 2013)
hmm... so in the future, mobile phones will come with cooling fans?
I don't mind, as long as it doesn't affect the battery and provides heavy processing power
20. Vinayakn73 (Posts: 191; Member since: 05 Oct 2011)
now I know why Samsung panics about using a metal body. That would turn the Galaxy into a frying pan.
61. madmikepr (Posts: 138; Member since: 09 Aug 2011)
How could this happen, when Samsung claims to run 1,000 torture tests before release?!!
69. iushnt (Posts: 1687; Member since: 06 Feb 2013)
Yeah, it's a lie.. I tried to heat up my GS5 Exynos model by continuously benchmarking and playing high-end games for hours while charging it.. at worst it said "the device is too hot and is shutting down"
2. HugoBarraCyanogenmod (limited) (Posts: 1133; Member since: 06 Jul 2014)
Exynos is so far behind the mighty MediaTek
4. mr.techdude (Posts: 569; Member since: 19 Nov 2012)
Sammy should make an S-Cooler: strap it on the back and it would cool the thing down. Give me credit, Sammy!!!
5. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)
Galacsh!t with Suxynos.... No wonder.
7. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
That's why Apple was desperate to use Exynos SoCs in their iCr@p phones
Apple's iToys, from Toys"R"Us, made especially for dummies and little kids.
17. Iodine (Posts: 1330; Member since: 19 Jun 2014)
When did they last use Exynos in their phones? Are you OK?
21. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
They were. Apple paid Samsung to have Apple's logo stamped on the chip instead of "Exynos".
32. marcski07 (Posts: 598; Member since: 25 Apr 2014)
If so, why doesn't the iPhone suffer like this ugly Samsung tablet, which looks like a stitched pad or a blind man's book?
46. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
Clocked at 1.3GHz vs 2.0GHz.
iOS is a very light OS; Android, a mini PC, needs far more RAM and CPU power to function properly
54. Berzerk000 (Posts: 4275; Member since: 26 Jun 2011)
The A6 is definitely not an Exynos. The A5 was kind of similar to the Exynos 4212, but had a different GPU.
60. Napalm_3nema (Posts: 2183; Member since: 14 Jun 2013)
Exynos in Apple? No. Samsung is still using reference ARM designs, and calling them "Exynos," while Apple uses custom ARM. There might have been some comparison between silicon engineers from Samsung and Apple in the Hummingbird days, but those days are long gone.
9. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
Just like the iPad 3, on which you could cook an omelette.
Even PhoneArena mentioned it in an article a year ago.
Criticising the finest, superb chip. Even though Samsung has no way to alter its CPU cores, unlike Snapdragon, the Exynos kicks Snapdragon's butt. They were the first to produce A15 SoCs. Lmfao, Snapdragon doesn't even have an A15 SoC yet; it's still relying on ARM's three-year-old A9, fooling customers by overclocking it to 2.7GHz.
19. Iodine (Posts: 1330; Member since: 19 Jun 2014)
Yeah, the new iPad got rather warm, up to 40°C, but it used a 170 mm², 45nm SoC and a last-gen Retina display that made more heat. And it didn't melt either.
30. mr.techdude (Posts: 569; Member since: 19 Nov 2012)
Wait, I haven't dug deep into what Snapdragon uses in their chips, like the A9, etc. But is what you're saying really true, that they're using ARM's three-year-old A9 architecture, and Exynos is the clear winner? And what's the difference between the A9 and the A15?
33. Iodine (Posts: 1330; Member since: 19 Jun 2014)
The A15 is just two years old... and about 40% faster, and more power hungry, than the A9.
39. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
The ARM A17 and A57 are still basically an A15, but with 64-bit.
47. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
But when you have 8 pipelines and extend the memory bandwidth to 14MB, it's 100%+ faster
38. Tsepz_GP (Posts: 987; Member since: 12 Apr 2012)
Snapdragon uses an ARM A15-like architecture, but it isn't as powerful per clock as Exynos. Notice how each Exynos variant of a Samsung device runs at a much lower clock speed than its Snapdragon counterpart, yet manages to either equal or beat it in benchmarks? Snapdragon fanboys always dodge this point with some sort of excuse, but it has been the case for a long time. The Galaxy S4 came with the first Exynos 5 Octa, which was crippled in a way; yet despite not working entirely as it should, at 1.6GHz it manages to beat the Snapdragon S600 at 1.9GHz. Same situation with the Snapdragon 800 Note 3 and the Exynos 5 version: the Exynos Note 3 runs at 1.9GHz and the Snapdragon at 2.3GHz in its best bin, yet the Exynos manages similar performance. Same situation with the Galaxy S5.
Did you know the Exynos 5 Octa came with H.265 HEVC support and 720p@120FPS video before the Snapdragon 800 arrived? Samsung has been ahead for a long time. :)
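The clock-vs-IPC tradeoff described above boils down to simple arithmetic: throughput is roughly clock times instructions-per-cycle. A toy sketch (the IPC values below are made-up illustrative numbers, not measured figures for any of these chips):

```python
# Toy IPC-vs-clock comparison. The IPC figures are invented for
# illustration; they are NOT real measurements of Exynos/Snapdragon.

def relative_perf(clock_ghz: float, ipc: float) -> float:
    """Rough single-thread throughput: billions of instructions per second."""
    return clock_ghz * ipc

# A hypothetical higher-IPC core at a lower clock...
wide_core = relative_perf(clock_ghz=1.6, ipc=1.5)      # 2.4
# ...vs a hypothetical lower-IPC core at a higher clock.
fast_clock_core = relative_perf(clock_ghz=1.9, ipc=1.25)  # 2.375

print(wide_core >= fast_clock_core)  # prints True
```

This is why comparing clock speeds alone, as the comment says, tells you very little on its own.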
40. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
These Snapdragon users and fans believe Snapdragon is the very best.
72. iushnt (Posts: 1687; Member since: 06 Feb 2013)
Arte, there are no SD fans, only Sammy haters. If SD were made by Sammy and Exynos by Qualcomm, they would praise Exynos and hate SD.. the number of Sammy haters is too damn high..
14. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
Apple's iPhones are app-launching devices; you can only swipe left to right. An overpriced midrange phone with a POS 1500mAh battery, an outdated, weird resolution, and outdated PPI. Cheap recycled soda-can aluminium, which cost Apple so little that they must have thrown the aluminium in as a goodwill gesture, charity. Even plastic is better than that POS.
22. Iodine (Posts: 1330; Member since: 19 Jun 2014)
Midrange? POS? Cheap? A weird resolution is a negative? Recycled aluminium is a negative? What else? Boy, you are on a bad planet.
27. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
The truth hurts; it's hard to swallow.
This is true and genuine.
31. Iodine (Posts: 1330; Member since: 19 Jun 2014)
Yeah, maybe it's the truth for you, but it kind of doesn't bother me, right?
42. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
Look who started it off by insulting.
71. iushnt (Posts: 1687; Member since: 06 Feb 2013)
When people say the current AMOLEDs are the best screens, you say Sammy paid them. So why don't you say this could be paid for as well? Why such a hater, JakeLee?
13. tedkord (Posts: 11629; Member since: 17 Jun 2009)
Probably performing CPR on someone who was i-Lectrocuted.
15. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)
They are ordered by their "employer" to stay away from articles like this, where it's simply inexcusable / too obvious.
Arte-8800 above is not one of them, just a fat elementary-school kid pretending to be a tech-savvy adult.
18. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
Yeah, you've got nothing to counter-attack me with, so now I'm a school kid, lmao. You're a system developer, so you know more than me? I know more about CPUs than you.
23. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)
I'm a system *OPTIMIZER*.
Do you know what that means?
I know EVERY instruction and EVERY cycle timing, IN ADDITION TO EVERY CONDITION in which the pipeline stalls.
I've been running self-written benchmarks on pretty much EVERY ARM SoC.
You picked the wrong opponent. The worst one, in fact.
Keep singing while I'm chuckling.
29. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
Good to know we can run command scripts on Android.
Apple will never let you do that on their POS pseudo-Unix, half-baked platform.
You are so jealous of Android, even though you are a big-time Apple fanboy.
41. Tsepz_GP (Posts: 987; Member since: 12 Apr 2012)
You've done nothing but troll. If you are what you say you are, you'd lay down some deeper knowledge instead of accusing anyone who goes against your opinion of being a paid employee. I know you "optimizers" don't exactly get out much and have next to no personal skills, so I get that, but at least add something worth reading here.
43. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
That's what he does, Tsepz_GP.
He makes ad hominem attacks.
When he loses, he never admits it. I give credit where it's due.
24. Iodine (Posts: 1330; Member since: 19 Jun 2014)
Why should he even bother to counter attack a tro...lololo.
28. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
And what did you reply with? Nothing.
Well, you're new to the tech world; you wouldn't know jack about anything.
25. Ashoaib (Posts: 3229; Member since: 15 Nov 2013)
Don't you have anything technical to say instead of personal attacks on Arte?
You call Samsung's fans paid employees, while you yourself sometimes sound like a paid employee of Apple, especially when you write such long paragraphs for Apple in some articles.
44. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)
Since you are asking so politely, I'll give you an answer.
For example, he claimed Apple stole Exynos from Sammy, and that Apple's A-series SoCs are rebadged Exynos chips.
That's simply impossible.
There are two types of ARM licenses: either you obtain the architecture license or the ISA license.
Sammy has the architecture license, while both Apple and Qualcomm have the ISA license.
With the architecture license, the vendor combines ARM's reference design with some other IP of their choice, and that's it.
With the ISA license, the vendor designs the whole chip by themselves, keeping only the instruction set specified by ARM for binary compatibility.
And designing a chip's architecture is such an extremely high-tech job that very few are capable of it.
Only Apple and Qualcomm have succeeded at it, while nVidia failed miserably, and Sammy didn't even dare to try.
Apple's and Qualcomm's chips are undeniably FAR superior to ARM's reference design. Both put TONS of money into this.
And recently, even Qualcomm gave up on custom-designing a 64-bit SoC and adopted ARM's reference design. Designing a chip is THAT hard, and what Apple pulled off last year with their 64-bit A7 chip is simply unbelievable. Truly a milestone in the history of SoCs.
And look at the BS this kid is spreading:
- Apple's chips are remarked Exynos
- Exynos is the best in the world, Qualcomm sucks
- I know CPUs better than you
Where should I start laughing?
Am I an Apple fanboy?
I once asked all the AOSP project members: "Is Android a good OS?"
ALL of them answered with a clear "No"; only the reasons varied, depending on each member's specialty, from "overly complex" to "too prone to mistakes."
And from my point of view, an optimizer's, AOSP is unbelievably inefficient for the CPU. The parts written by none other than Google especially suck. It's simply beyond anyone's imagination.
What about iOS in this regard?
Marvelous, breathtaking throughout the whole framework.
The difference in software quality far exceeds the difference between the 5s' body and the crappiest no-name Chinese Android.
Am I an Apple fanboy?
Maybe. As an engineer myself, I'm amazed by nothing so much as Apple's craftsmanship, both in HW and SW.
That's pretty much all.
48. Ashoaib (Posts: 3229; Member since: 15 Nov 2013)
Well, I agree that it's wrong to call Apple's SoCs rebadged Exynos chips... the two are completely different, except that both were manufactured by Samsung (and now by TSMC)... But it is also a fact that, no matter how much you criticise Samsung, they have real manufacturing prowess... Name a component and you'll most probably find Samsung manufacturing it... On the other hand, despite having such vast resources, they have for quite some time been unable to come up with any really appealing design for their phones... They manufacture SoCs for others but aren't quite as successful with their own. They seem like the contract manufacturer Foxconn, which has huge manufacturing facilities but no product of its own.
51. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)
Sammy lacks two things:
And below is what Android fans should know:
- Sammy just pretends to compete against "evil" Apple.
- Sammy isn't Android's champion.
- Sammy is much more interested in killing the other, weaker Android vendors.
- Sammy's current dominance isn't good for Android's future.
If you are a real Android fan, you should avoid Sammy at all costs, for Android's sake.
57. Ashoaib (Posts: 3229; Member since: 15 Nov 2013)
I may buy one in the future because I love the Super AMOLED display, especially after a hands-on with the Tab S 8.4... Super AMOLED is superb, just give it a try... I hope all manufacturers start using AMOLED displays soon
58. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
You're not new here, Ashoaib; this JakeLee hates Android and Samsung.
He's jealous that Android/Linux is four years ahead of the rubbish app-launcher iOS, with its fake, bogus pseudo-multitasking.
59. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)
AMOLED is another unethical lie from Sammy; the PenTile kind especially could be quite harmful to the human eye and brain.
Although it's not immediately noticeable at today's higher PPIs, the PenTile arrangement sends disturbing signals to your eye and brain that have to be compensated for unconsciously, and that tires both.
And you should avoid looking directly at light emitters. That harms your eyesight.
LED = Light Emitting Diode. Enough said.
Further, AMOLED has a much shorter lifespan than LCD.
You've certainly heard of burn-in.
That's the lesser evil. The blue diodes wear out more than twice as fast as the green or red ones.
Color accuracy suffers after just a few months of use: the display steadily gets more and more yellowish because of this.
PA should do a longer-than-long-term review proving this: the same display test with the very same phone used for the initial review. Nothing will be the way it used to be after around six months of use.
All in all, Sammy should NOT have created any product with AMOLED, given all those grave drawbacks.
73. iushnt (Posts: 1687; Member since: 06 Feb 2013)
Now JakeLee's credibility has sunk too low.. too much hatred.. he/she hates anything that Sammy makes and seems like a paid troll from Apple. I have been using an Infuse 4G for three years; now it's a secondary phone.. how much longer a life do I need from AMOLED? I'm sure it will last another three years.
49. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
Clock frequency is pointless to compare by itself. Sure, the A15 might be faster clock-for-clock, but it's also more power hungry, so you won't see it hit 2.0GHz anytime soon. It looks like a Snapdragon 800 can easily clock higher, which more than compensates for its IPC disadvantage against the A15 while keeping power usage in check.
Qualcomm is sitting very comfortably right now. No wonder their market cap is now bigger than Intel's, while they have a fraction of Intel's revenue and profits. The market clearly likes what it sees from them.
I agree about both Qualcomm and Apple.
But looking back, Exynos was outperforming Snapdragon from the start.
Snapdragon uses the A9 to get half the performance of the A15.
But Snapdragon is still relying on the old A9 architecture, while Exynos and Tegra 4 kill both Apple and Snapdragon.
Apple and Snapdragon have done nothing but add extra pipelines and extra memory bandwidth to obtain wider performance.
55. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)
Please, do me and yourself a favor and stop this.
What you are saying is nothing but a collage of crap you read somewhere on the web, ignorantly or purposely distorted. I know that exactly. That much I know.
Ask politely instead if you want to know something.
52. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
Since performance comes at a cost, this SoC seems to be in the 8-10W TDP range when running games. The comparison with the A4-5000, made on the same process, is pretty clear.
Power draw is likely lower in CPU-only tasks, but IMO users will enjoy a fast battery death in games. It's Qualcomm's choice; we'll see whether the market accepts it.
I prefer more balanced SoCs, and this does not seem to be one.
53. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
It will be interesting to see how the Silvermont-based SoCs stand up against the Snapdragon 800 and Tegra 4. I guess they will do quite well, considering that CloverTrail+ seems to beat even the Exynos 5 Octa. CT+ is a dual-core CPU using a five-year-old Atom design...
And this is the most interesting part: "Intel's chip scores highest on the most tests and draws the least current."
56. Arte-8800 (banned) (Posts: 4562; Member since: 13 Mar 2014)
Having used Apple tablets (with their relatively slow CPUs but fast GPUs and hardware-accelerated drawing/scrolling), I believe how fast a device feels is more a software/GPU issue than a CPU one. Similarly, browsing depends a lot on how fast you can download things.
The original ABI press release reads almost literally like an Intel advert. And the chosen benchmark is one of the few where Atom scores well, so it not only seems faster than it really is but also appears more power efficient as a bonus (since the A15s and Kraits have to do more work to get the same score). If you measured power on Geekbench, Atom wouldn't look very efficient at all; the only results Intel will show are the ones where they look best.
I have no idea what AnTuTu actually does, but Atom scores ridiculously high, so much so that either the calculation must be incorrect or some kind of cheating is going on. For example, Atom appears to score more than twice as much as the A15 on the memory test, but Geekbench shows a completely different story: Stream, the traditional memory benchmark, shows that the A15 is about three times faster.
So which is right? Well, give me the AnTuTu source code and, as a compiler/benchmarking expert, I'll explain what is wrong with it and how it could be trivially gamed to show a much higher score (their web page is full of cautions about companies cheating the results). I know for a fact that one cannot do this with Stream, which is why I trust the Geekbench results.
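For readers who haven't seen it, the Stream "triad" mentioned above is essentially one loop over three arrays. Below is a toy Python sketch of the kernel and a crude bandwidth estimate; the real STREAM benchmark is a tuned C program, so the array size and the pure-Python timing here are illustrative only:

```python
import array
import time

def stream_triad(n: int = 100_000, scalar: float = 3.0) -> float:
    """Toy version of the STREAM 'triad' kernel: a[i] = b[i] + scalar * c[i].
    Returns an estimated bandwidth in MB/s. Pure Python is orders of
    magnitude slower than the real C benchmark, so treat the number
    as illustrative, not as a hardware measurement."""
    b = array.array("d", range(n))
    c = array.array("d", range(n))
    a = array.array("d", bytes(8 * n))  # n zero-initialized doubles

    t0 = time.perf_counter()
    for i in range(n):
        a[i] = b[i] + scalar * c[i]
    elapsed = time.perf_counter() - t0

    # Triad touches three arrays of 8-byte doubles per pass.
    return (3 * 8 * n) / elapsed / 1e6

mb_per_s = stream_triad()
```

The point of the kernel's simplicity is exactly what the comment argues: there is no shortcut a vendor can exploit, so the reported bandwidth is hard to game.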
64. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)
OK, I cannot answer all your questions, but I'll do my best within the scope of my knowledge.
Intel is CISC while ARM is RISC.
Although RISC is by no means inferior to CISC, Intel can "cheat" thanks to its CISC nature.
x86's 40-year-old CISC ISA nowadays serves mostly as an interface for binary compatibility. The actual computation happens under the surface in a "RISC" mode, very efficiently, with 100+ *physical* registers. Instructions that appear to be performing costly memory accesses are actually interacting with those hidden physical registers, saving a lot of power in addition to delivering high performance.
On a "real" RISC machine like ARM, it cannot be done that way. It has to live with the 14 *architectural* registers available. While ARM's 32-bit ISA was extremely delightful up to ARMv6, it crippled the enhancements added in ARMv7, like superscalar and out-of-order execution, that are simply necessary for higher IPC.
The Cortex-A15's performance is therefore pretty much the absolute maximum ARM's 32-bit ISA can achieve.
A new ISA was absolutely due for further progress, and it came with the completely redesigned 64-bit ARMv8, designed with superscalar and out-of-order execution in mind.
That's THE difference between Intel and ARM.
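The "hidden physical registers" idea described above is register renaming. A minimal sketch of a rename table follows; the register names, the unbounded free list, and the mapping scheme are simplified illustrations, not any real core's design:

```python
# Minimal register-renaming sketch: each WRITE to an architectural
# register gets a fresh physical register, which removes false
# (write-after-write / write-after-read) dependencies between
# instructions. Purely illustrative; real renamers recycle a fixed
# pool of physical registers.

class Renamer:
    def __init__(self):
        self.rat = {}        # rename alias table: arch reg -> physical reg
        self.next_phys = 0

    def read(self, arch_reg: str) -> str:
        """Source operand: return the current mapping (allocate on first use)."""
        if arch_reg not in self.rat:
            self.rat[arch_reg] = self._fresh()
        return self.rat[arch_reg]

    def write(self, arch_reg: str) -> str:
        """Destination operand: always allocate a fresh physical register."""
        self.rat[arch_reg] = self._fresh()
        return self.rat[arch_reg]

    def _fresh(self) -> str:
        name = f"p{self.next_phys}"
        self.next_phys += 1
        return name

r = Renamer()
# Two back-to-back writes to 'eax' get distinct physical registers,
# so the second instruction need not wait for the first to retire.
first = r.write("eax")
second = r.write("eax")
print(first != second)  # prints True
```

With only 14 architectural names visible in 32-bit ARM code, the compiler spills to memory sooner, which is the comment's point about the ISA capping what out-of-order hardware can recover.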
Almost all the self-proclaimed tech experts around the world were busy dismissing Apple's A7 last year: 64-bit supposedly brings no benefit and is only necessary for 4+GB of memory.
That's only true for x86 to x64. Since they differ only in the "interface" and the addressing range, x64 is only marginally, situationally faster than x86.
For ARM, though, ARMv8 is FAR more efficient than ARMv7, thanks to the new ISA.
The caveat is that, unlike Intel, ARM cannot switch between 32-bit and 64-bit within a process. Which means 32-bit apps need 32-bit frameworks, and 64-bit apps need the same in 64-bit.
And that makes Android's transition extremely hard:
Optimized 32-bit code runs much faster than unoptimized 64-bit code.
Since the 64-bit frameworks are hardly optimized, 64-bit apps won't be any faster than 32-bit ones; more often than not they'll even be slower.
Unlike Apple, which made the WHOLE optimized 64-bit framework available from day one last year, Google cannot do the same, due to Android's open-source dependencies in addition to their own incompetence at optimizing. And yes, I MEAN this. Google's allegedly optimized code is a collection of jokes, so bad that Qualcomm did their own version of most of it. (More on this later.)
Google is lying with its "64-bit ready" claims. Android is FAR from that. All Android can do right now is run machine code in "64-bit mode," which is worth nothing as long as apps perform better in 32-bit mode due to the differences in the degree of optimization.