http://software.newsforge.com/article.p ... 3&from=rss
So, Alan and Bob use Mac, right? And you have hardware acceleration for desktop UI. Does it matter?
Both Linux and Windows are embracing hardware acceleration for ordinary desktop work, and OS X is already there.
I can't think of any desktop stuff that isn't fast enough, graphics-wise. My main complaints are network operations that don't fork off a thread so they block the UI (Firefox proxy access on DNS miss, Outlook in various situations) and UI causing disk accesses (Start menu). Linux-wise, it's mostly a paging thing making apps that I haven't touched for a while slow to start (work).
If this gets us shrunken views of minimized apps I'm all in favor of it. But is there anything else to it? I don't think 3D spinning desktops are the answer, by the way.
hardware accelerated desktop
It seems to me that the overhead involved in creating and managing the desktop in memory would negate the tiny gains you could get from offloading the window drawing to hardware. How much processor can you actually save? And how much memory and memory bandwidth is it going to cost?
While I love the F9 behavior on Macs, a functional equivalent could be implemented on Windows without needing hardware acceleration. You just take a snapshot of each window and scale that rather than scaling the window itself. There's a WinXP powertoy that's supposed to do this for Alt+Tab, but I've never tried it.
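The snapshot-and-scale approach is cheap to sketch. Assuming the window's pixels have already been captured (on Win32 that would be something like PrintWindow or BitBlt; that part is platform code and omitted here), the thumbnail is just a resample of the pixel buffer:

```python
# Sketch of "snapshot and scale": instead of asking the app to redraw at
# a smaller size, grab its pixels once and scale the bitmap. The capture
# step is platform-specific and skipped; the "snapshot" here is just a
# row-major 2D list of pixel values.

def scale_snapshot(pixels, new_w, new_h):
    """Nearest-neighbor scale of a row-major 2D pixel buffer."""
    old_h = len(pixels)
    old_w = len(pixels[0])
    return [
        [pixels[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# A 4x4 "window snapshot" shrunk to a 2x2 thumbnail.
snap = [[y * 4 + x for x in range(4)] for y in range(4)]
thumb = scale_snapshot(snap, 2, 2)
```

Nearest-neighbor is the crudest filter, but it makes the point: the app never repaints, so the CPU cost is one resample per thumbnail.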
And now that I think about it, most non-professional OpenGL cards are optimized for single window/full screen work. What happens if I try to run an OpenGL (or God forbid Direct3D) game or app? And how would this work with anti-aliasing? What you want is text antialiased, but not lines. OpenGL is all or nothing, unless they're going to try some kind of crazy multi-pass, multi-layer scheme.
Of course, this whole thing isn't aimed at people like me who run WinXP with the nice clean "classic" windows manager. It's aimed at people who have a desperate need for rounded corners on their windows.
Well, the point is if all your windows are rectangles with window textures on them, you can do your scaling using the GPU's FPUs, not the CPU's.
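A rough CPU-side model of what that offload means: the compositor keeps each window's contents as a texture, and drawing the window at any size is just sampling that texture per destination pixel, which is exactly the per-fragment work a GPU does essentially for free. The buffers and nearest-neighbor sampling below are illustrative, not any real compositor's code:

```python
# What the GPU does per pixel when a window is a textured quad: for each
# destination pixel inside the quad, sample the window's texture at the
# corresponding coordinate. Scaling the window only changes the quad's
# size; the texture (the window's contents) is untouched.

def composite(dest, texture, dst_x, dst_y, dst_w, dst_h):
    tex_h, tex_w = len(texture), len(texture[0])
    for dy in range(dst_h):
        for dx in range(dst_w):
            # Map the destination pixel back to a texel (nearest sample).
            tx = dx * tex_w // dst_w
            ty = dy * tex_h // dst_h
            dest[dst_y + dy][dst_x + dx] = texture[ty][tx]

screen = [[0] * 8 for _ in range(8)]
window = [[1, 2], [3, 4]]              # a 2x2 "window texture"
composite(screen, window, 2, 2, 4, 4)  # drawn at 2x scale
```

On hardware, the inner loop is the rasterizer and texture unit; the CPU only submits four vertices per window.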
I believe Vista's Avalon interface shows much higher write traffic from the GPU than gaming workloads do, which is why you need a bidirectional interface like PCIe. Don't quote me on that, though.
I guess I need to go to the Apple store and push F9 on the systems, because I'm not really sure what Expose does.
But, in large part, I think you're right. There's not a lot of desktop UI out there that requires hardware acceleration. Is it a chicken and egg problem? No one can think of neat UI ideas until it's cheap to implement? My suspicion is no, but what do I know?
> What happens if I try to run an OpenGL (or God forbid Direct3D) game or app?

I'm not sure I understand your point.

> And how would this work with anti-aliasing?

Presumably they don't antialias any desktop stuff.
Disclaimer: The postings on this site are my own and don't necessarily represent Intel's positions, strategies, or opinions.
Dwindlehop wrote:
> Well, the point is if all your windows are rectangles with window textures on them, you can do your scaling using the GPU's FPUs, not the CPU's.

No, because if my desktop is clean and simple, the tiny amount of processor power I could regain with hardware acceleration is negligible. Hardware acceleration is for new, more elaborate frills, not vanilla box drawing.
> I believe Vista's Avalon interface shows much higher write traffic from the GPU than gaming workloads do, which is why you need a bidirectional interface like PCIe. Don't quote me on that, though.

I believe it. The early articles I read about hardware desktops talked about caching screens in video memory to try to cut down the traffic.
> I guess I need to go to the Apple store and push F9 on the systems, because I'm not really sure what Expose does.

It reduces all of your windows in size until every one of them can be tiled across your screen. You can then select the one you want and it is on top when they resume their original size. It's an alternative to Alt-Tab, but prettier. Just about the only GUI frill I've seen in the last four or five years that's actually better than its predecessor.
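The tiling itself is simple geometry. A toy layout pass, assuming uniform grid cells and aspect-preserving shrink (real Expose packs windows more cleverly, but the idea is the same):

```python
import math

def expose_layout(windows, screen_w, screen_h):
    """Tile all windows on screen, each scaled down to fit its grid cell
    (Expose-style). `windows` is a list of (w, h) sizes; returns an
    (x, y, w, h) placement per window."""
    n = len(windows)
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    cell_w, cell_h = screen_w / cols, screen_h / rows
    placed = []
    for i, (w, h) in enumerate(windows):
        # Shrink to fit the cell, preserving aspect ratio; never enlarge.
        scale = min(cell_w / w, cell_h / h, 1.0)
        col, row = i % cols, i // cols
        placed.append((col * cell_w, row * cell_h, w * scale, h * scale))
    return placed

tiles = expose_layout([(800, 600), (1024, 768), (640, 480)], 1280, 1024)
```

With window contents already cached as textures, animating from these placements back to full size is pure GPU work.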
> > What happens if I try to run an OpenGL (or God forbid Direct3D) game or app?
>
> I'm not sure I understand your point.

My point is that in the average system, only the applications are using the hardware APIs, which means that they can make some assumptions. Full screen Direct3D can be written to detect loss of the lock, but very few developers do it. Instead, they either mask the Alt-Tab or they just don't care and let the system die. If the desktop locked Direct3D for full screen drawing (which seems like the only reasonable implementation), all full screen games would fail to establish the lock and, depending on how they were written, would fall back to a windowed mode or crash. OpenGL has similar issues, but most of the OpenGL code I've seen tends to have the appropriate protection built into it.
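The graceful path described above, detecting that the exclusive surface is taken and dropping to windowed rendering, reduces to a few lines. `Display` and its lock are hypothetical stand-ins for the driver/OS surface owner, not a real API:

```python
# Sketch of the fallback logic few games implement: try to take the
# exclusive full-screen surface, and if someone (say, the compositor)
# already holds it, create a windowed swap chain instead of crashing.

class Display:
    def __init__(self, exclusive_owner=None):
        self.exclusive_owner = exclusive_owner

    def acquire_exclusive(self, app):
        if self.exclusive_owner is not None:
            return False          # the desktop compositor holds the lock
        self.exclusive_owner = app
        return True

def start_renderer(display, app):
    if display.acquire_exclusive(app):
        return "fullscreen"
    return "windowed"             # graceful fallback, not a crash

free = Display()
locked = Display(exclusive_owner="compositor")
```

A game that skips the `False` branch is the one that "lets the system die" when the desktop owns the display.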
Hmm, and another point is that now the desktop is going to consume significantly more video memory. That means the performance of most games on most cards will be degraded by the need to swap textures in and out of memory.
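The arithmetic behind that worry is easy to run. Assuming 32-bit RGBA window textures (the sizes and window count below are made up for illustration):

```python
# Back-of-the-envelope cost of keeping every window's contents resident
# in video memory as a 32-bit RGBA texture.

def window_texture_bytes(w, h, bytes_per_pixel=4):
    return w * h * bytes_per_pixel

# Ten full-screen 1280x1024 windows, each with its own copy:
total = sum(window_texture_bytes(1280, 1024) for _ in range(10))
mib = total / (1024 * 1024)   # 50 MiB on a card that may only have 128
```

That's before mipmaps or double buffering, and every byte of it competes with game textures for the same pool.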
> Presumably they don't antialias any desktop stuff.

I meant in terms of ClearType and the open source equivalents. They are basically a limited form of antialiasing. But without understanding how it's implemented now, I guess I can't be sure it wouldn't work.
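A toy version of the subpixel trick, assuming an RGB-striped LCD and ignoring the color-fringe filtering real ClearType applies: a glyph row rendered at 3x horizontal resolution packs directly into the R, G, B stripes of each pixel, which is why it behaves like antialiasing confined to text.

```python
# Each LCD pixel's R, G, B stripes are treated as three horizontal
# sub-samples, so one pixel encodes three columns of glyph coverage.

def subpixel_row(coverage):
    """Pack a row of 3x-resolution coverage values (0.0-1.0) into
    (r, g, b) pixel triples, one pixel per three sub-samples."""
    assert len(coverage) % 3 == 0
    pixels = []
    for i in range(0, len(coverage), 3):
        r, g, b = (round(c * 255) for c in coverage[i:i + 3])
        pixels.append((r, g, b))
    return pixels

# A vertical stem edge falling between two pixels:
row = subpixel_row([0.0, 0.0, 1.0, 1.0, 1.0, 0.0])
```

Because the trick depends on the physical stripe order of the panel, it only makes sense for text, which is the "limited" part: lines and boxes on the desktop are left alone.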