NVIDIA bringing its desktop graphics to next year's mobile Tegra, watch it render a face in real time
0. phoneArena, posted on 24 Jul 2013, 09:10
The Project Logan test results seem pretty encouraging, as NVIDIA took a prototype to demo at the ongoing SIGGRAPH conference, performing one of the most demanding tasks in graphics: rendering a human face in real time. Previously, this demo, called "Ira," required a powerful desktop machine, but now it is fully functional on Project Logan mobile silicon...
This is a discussion for a news story. To read the whole story, click here
1. SakuroAkino (banned) (Posts: 141; Member since: 12 Jul 2013)
But still, Nokia will only upgrade the camera in 2014
4. jove39 (Posts: 1130; Member since: 18 Oct 2011)
What? This article is about NVIDIA's Project Logan!!!
3. jove39 (Posts: 1130; Member since: 18 Oct 2011)
Amazing graphics... I wonder if one day animated movies will come as instructions to the GPU instead of stored frames :)
Don't get your hopes high for next year... nowadays NVIDIA comes late to the party!!!
7. CanYouSeeTheLight (Posts: 538; Member since: 05 Jul 2012)
Tegra 4 came too late and was announced too soon. I do hope "Logan" will come early next year, not late next year.
10. jove39 (Posts: 1130; Member since: 18 Oct 2011)
Well... in fact, I'm still waiting for a T4 retail device... T4 hasn't moved from announcement to retail to consumers... poor NVIDIA... their profits are going down!
12. zennacko (Posts: 156; Member since: 16 Jun 2013)
That would be somewhat interesting, except for the part where they would have to include "graphics settings" (or: playing your movie on ultra settings with 24x FSAA, tessellation, etc.)
And movies would be susceptible to lag :(
And a "high end" movie would be unplayable on low-end graphics chips/cards (you can't actually play a full-HD movie fullscreen with no stuttering on the first Atom netbooks, can you?)
That aside, it would be great since the file sizes would be smaller; with a good compression algorithm they could be even smaller, since the CPU would only decode the stream while the GPU gets the rendering job done, no codecs needed!
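For a rough sense of the file-size point in the comment above, here is a back-of-the-envelope sketch in Python. It assumes uncompressed 24-bit RGB frames at 1080p, 24 fps, and a 90-minute runtime; all numbers are illustrative, not from the article.

```python
# Back-of-the-envelope: storing a movie as raw frames vs. as render instructions.
WIDTH, HEIGHT = 1920, 1080   # full-HD resolution
BYTES_PER_PIXEL = 3          # uncompressed 24-bit RGB
FPS = 24
MINUTES = 90

frames = FPS * MINUTES * 60
bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL
total_gb = frames * bytes_per_frame / 1e9

print(f"{frames} frames, {bytes_per_frame / 1e6:.1f} MB each")
print(f"raw frame storage: ~{total_gb:.0f} GB")
# A per-frame stream of draw calls and scene state would be orders of
# magnitude smaller: the GPU regenerates the pixels instead of storing
# them, which is how demoscene productions fit whole scenes in kilobytes.
```

Real codecs already close most of this gap by compressing the frames, which is why the trade-off only matters at the extremes.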
13. dynamo.one (Posts: 62; Member since: 14 Feb 2013)
I'm sick of this sh*t... processors are powerful enough... the companies should provide more storage and more battery... I can't believe it's 2013 and some phones' limit is 16GB with no microSD.
14. phonemirer (Posts: 110; Member since: 07 Dec 2012)
Yeah, but that's because Android is still lagging no matter the processing power
17. ztkells (Posts: 7; Member since: 26 Jan 2011)
dynamo.one is right though. Processing power isn't what's needed, and it won't cure the lag either. 4GB of RAM should do the trick, though.
6. Dr.Phil (Posts: 789; Member since: 14 Feb 2011)
The video does say that the reference tablet uses "2-3 watts", so the answer about power consumption is right there.
8. aayupanday (Posts: 190; Member since: 28 Jun 2012)
Shut up and release Tegra 4 first...
Or cancel it and release Logan.
9. livyatan (Posts: 297; Member since: 19 Jun 2013)
This pretty much means that a device like a Shield 2 or a tiny Android console could close in on the Xbox One and PS4 in terms of processing ability. (The memory bandwidth will still have a ways to go, though.)
I expect it will reach over 500 GFLOPS, or even up to 1 TFLOP without thermal constraints, and be done on a 16nm FinFET process.
And if NVIDIA licenses it at a decent price, we could see this even in the next Apple and Samsung phones and tablets.
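A quick sanity check on GFLOPS figures like the ones in the comment above: peak single-precision throughput is conventionally estimated as cores × 2 FLOPs (one fused multiply-add per cycle) × clock. The core count and clock below are hypothetical, chosen only to show the formula, not official Logan specs.

```python
def peak_gflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 throughput, assuming each core retires one FMA (2 FLOPs) per cycle."""
    return cuda_cores * 2 * clock_ghz

# Hypothetical mobile Kepler configuration: one 192-core SMX at 0.95 GHz.
print(peak_gflops(192, 0.95))     # ~365 GFLOPS
# Cores needed to hit 1 TFLOP at that same clock:
print(1000 / (2 * 0.95))          # ~526 cores
```

Real sustained throughput is lower than this peak, since it assumes every core issues an FMA every cycle with no memory stalls.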
11. jove39 (Posts: 1130; Member since: 18 Oct 2011)
We are far from a 16nm process... but it costs nothing to dream about it :)
16. Shatter (Posts: 1652; Member since: 29 May 2013)
No we are not. Intel's Broadwell chips coming next year are 14nm.
The Radeon R9 series is 20nm and NVIDIA's Maxwell is 20nm; you will see a 14nm GPU in 2016.
18. belovedson (Posts: 807; Member since: 30 Nov 2010)
I still don't understand how they plan on cooling it.
NVIDIA's project is awesome though.
Battery manufacturers and the programmers really have to step up to catch up.
It's like they are a generation behind the tech
19. itsdeepak4u2000 (Posts: 1511; Member since: 03 Nov 2012)
NVIDIA is silent so far; I think they will bring something big.