So, I did a quick “audit” this evening of four different machines in my
house, running recent releases of both Ubuntu and Fedora, but with
different generations of video hardware/motherboards, and tried the
“Persistence” control on all of them.
EVERY SINGLE ONE OF THEM failed, provoking an OpenGL exception from
glAccum, which, it turns out, relies on an optional feature, and at least
in this little survey, not a single piece of my hardware supported that
operation.
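A defensive check would avoid the exception entirely. The sketch below is my own hypothetical guard, not existing wxGUI code: in OpenGL, querying GL_ACCUM_RED_BITS returns 0 when the context has no accumulation buffer, which is exactly the unsupported case above. The query callable is injected so the logic can be shown without a live GL context; with PyOpenGL it would be something like `lambda: glGetIntegerv(GL_ACCUM_RED_BITS)` (an assumption, adjust to the binding in use).

```python
# Hypothetical sketch: guard the glAccum-based "Persistence" path on
# accumulation-buffer support instead of letting it raise at draw time.

def accum_buffer_available(query_accum_bits):
    """Return True if the GL context allocated any accumulation-buffer bits.

    `query_accum_bits` is injected so this can run without a GL context;
    in real code it would query GL_ACCUM_RED_BITS (0 => no accum buffer).
    """
    return query_accum_bits() > 0

def draw_persistence(query_accum_bits, draw_accum, draw_plain):
    """Use the accumulation-buffer persistence path only when supported."""
    if accum_buffer_available(query_accum_bits):
        draw_accum()   # glAccum-based storage-scope effect
    else:
        draw_plain()   # graceful fallback instead of an OpenGL exception
```

Something along these lines would let the sink degrade to a plain display on the Intel/ATI stacks discussed below, rather than failing outright.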
How many people actually use “persistence”? (As opposed, I must be
clear, to “Peak Hold”). I suspect that a workable approach is to,
for now, remove that feature entirely, but I don’t have a good feel
for who uses it.
Near as I can tell, the “persistence” feature is intended to give a kind
of storage-scope effect, or high-persistence phosphor effect.
But the “effect” uses a non-mandatory OpenGL feature (the accumulation
buffer) which appears, at least on the garden-variety
video hardware I use, not to be supported. And to be clear, I’m
running modern motherboards on two of my systems, but using
the on-board video, since I’m not a gamer, and really, the simple 2D
effects GNU Radio uses aren’t particularly taxing.
–
Marcus L.
Principal Investigator
Shirleys Bay Radio Astronomy Consortium
Other than the Mesa software-only stack, it does not work on any Intel
or ATI driver-provided stack, but the nVidia blob driver DOES support it.
WXFFT also maxes out my 2-core 3 GHz machine (a lot of people report
lock-ups even on i7’s, so this is a problem); wxfft really needs a C++
rewrite if anything.
I couldn’t get the VESA/MESA stuff working the other night, so went back
to fglrx, which is what the Fedora installer chose for my machine.
I keep my update rates quite modest, and I can support multiple wxGUI
FFT sinks on the same machine. I’m running two different applications
24 x 7 that have both Waterfall and FFT sinks, and my machine is only
lightly loaded. But I keep the update rates down to 5Hz or so.
Well, the qtGUI stuff is being worked on at the moment, and it should
have much better performance, and provided it can give a similar
amount of user-friendly functionality, perhaps at some point we let the
wxGUI stuff die. Unless some brave soul wants to seriously work it over,
make it a better performer, and eradicate the OpenGL edge cases
that it keeps tripping over.
–
Marcus L.
Principal Investigator
Shirleys Bay Radio Astronomy Consortium
On May 24, 2012, at 6:44 PM, Marcus D. Leech wrote:
Near as I can tell, the “persistence” feature is intended to give a kind of
storage-scope effect, or high-persistence phosphor effect.
I achieved a similar effect in the ubertooth-specan-ui Python app, built
on top of Qt’s newer PySide Python library. Perhaps it’ll be a useful
reference for reimplementation in Qt without relying on OpenGL.
The general technique is to draw the graph into an off-screen image.
During each frame update, black is drawn over the existing image, with a
small alpha value, which effectively fades the prior image. The new
graph is rendered over the top at alpha = 1.0, then the image is copied
to the screen. It seems to perform quite well. I presume most video
drivers can push the alpha-blended blitting into the hardware.
The meat of this technique is in RenderArea._draw_graph(). I imagine the
code would map directly to the C++ Qt API.
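As a framework-free model of that fade step (my own minimal sketch, not the actual ubertooth-specan-ui code): compositing black with alpha a over a pixel leaves dst * (1 - a), so the retained image decays exponentially while each fresh trace is stamped at full brightness.

```python
# Minimal model of the fade-and-stamp persistence technique: each frame,
# semi-transparent black is composited over the retained image (old pixels
# decay by a factor of 1 - alpha), then the new trace is drawn at alpha 1.0.

FADE_ALPHA = 0.1  # fraction of the old image erased per frame (assumed value)

def fade(image, alpha=FADE_ALPHA):
    """Composite black-with-alpha over the image: every pixel *= (1 - alpha)."""
    return [[px * (1.0 - alpha) for px in row] for row in image]

def stamp(image, trace):
    """Draw the new trace at full opacity (pixels set to 1.0)."""
    for x, y in trace:
        image[y][x] = 1.0
    return image

# Two frames on a tiny 4x4 grayscale buffer:
frame = [[0.0] * 4 for _ in range(4)]
frame = stamp(frame, [(0, 0)])            # frame 1: trace at (0, 0)
frame = stamp(fade(frame), [(1, 1)])      # frame 2: old trace fades, new one is stamped
```

In Qt terms, the fade corresponds to painting a translucent black rectangle (e.g. alpha around 25 of 255) over the off-screen QImage before drawing the new curve with an opaque pen.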
I hear the author of Kismet has done something similar in his software,
but I don’t know what graphics API he built it on.
On Thu, May 24, 2012 at 10:18 PM, Marcus D. Leech [email protected]
wrote:
sinks on the same machine. I’m running two different applications
24 x 7 that have both Waterfall and FFT sinks, and my machine is only
lightly loaded. But I keep the update rates down to 5Hz or so.
Well, the qtGUI stuff is being worked on at the moment, and it should have
much better performance, and provided it can give a similar
amount of user-friendly stuff, perhaps at some point, we let the wxGUI
stuff die. Unless some brave soul wants to seriously work it over
and make it a better performer, and eradicate the openGL edge cases that it
keeps tripping over.
I’m really hoping that we can get the qtgui working nicely for
everyone soon and that we can replace all functionality of the wxgui
with it. And then, yes, I won’t shed any tears to let wxgui fade
away.
Tom
On Thu, May 24, 2012 at 10:18 PM, Marcus D. Leech [email protected] wrote:
I’m really hoping that we can get the qtgui working nicely for
everyone soon and that we can replace all functionality of the wxgui
with it. And then, yes, I won’t shed any tears to let wxgui fade
away.
Tom
The only heartburn it gives me is thinking about all those flow-graphs
out there that use wxGUI. [And more important personally, all
the flow-graphs I have that use wxGUI].
–
Marcus L.
Principal Investigator
Shirleys Bay Radio Astronomy Consortium
Hey,
I’m really hoping that we can get the qtgui working nicely for
everyone soon and that we can replace all functionality of the wxgui
with it. And then, yes, I won’t shed any tears to let wxgui fade
away.
Tom
While you are at that: the Qt scope cannot be stopped. That’s actually
THE feature that is useful for signal inspection. Other than that, the
Qt sinks should be separate blocks in GRC, too.
Best,
Marius
On Thu, May 24, 2012 at 10:51 PM, Marcus D. Leech [email protected]
wrote:
The only heartburn it gives me is thinking about all those flow-graphs out
there that use wxGUI. [And more important personally, all
the flow-graphs I have that use wxGUI].
Yeah, that’s why I said “fade away.” It’d be a pretty long deprecation
process, I think, on this one.
Tom