Dual cores, clock skew and odd behavior
I'm not ready to put this in the bugs forum yet since I'd like to hear what people think of this theory first. But I've got an idea that Celestia may not like running on both cores of some dual core systems (well, at least mine).
While I get very good frame rates (70+/sec), after running for some time Celestia starts "stuttering". And I don't mean dropped frames or jerky movement like in low frame rates. What I observe is Celestia slowly losing its mind.
For instance:
- Zooming in on a planet gives the effect of the planet generally getting closer, but instead of smoothly and consistently getting larger, it does so in a jerky manner, sometimes getting smaller, sometimes larger, but more often larger, so that *eventually* you zoom in to the level you wish.
- Travelling to another planet yields similar results, where objects you pass by jerkily get closer, then further, then closer. You eventually pass them, but it's a weird ride.
- The time display doesn't smoothly increase. Often, you'll see time move backwards a second, or flicker back and forth between two adjacent seconds.
These effects get more pronounced the longer *the machine runs from the last boot*. In other words, restarting Celestia doesn't help matters at all. Only a reboot sets things back to normal.
I did some searching and didn't find anything helpful, and had settled for rebooting periodically. It then occurred to me that perhaps there's some clock skew between the two cores of my AMD 64 X2 3800. So on a hunch, I used the Task Manager to restrict Celestia's processor affinity to a single core. Lo and behold, an instance of Celestia that was running poorly moments ago suddenly began to run smoothly once again.
While I suppose that some driver issue may be at the heart of this, it also seems to me that some clock skew between cores is entirely possible. Has anyone else seen this, or does anyone have an answer/opinion as to what may be going on? Would Celestia benefit from restricting itself to a single core, or at least supporting an option that made it restrict itself? If that isn't a tenable request, does anyone know of an approach (short of writing a wrapping launcher program that restricts affinity; a sketch of one follows this post) that will automatically restrict affinity? My 3 year old really digs Celestia, but I'm not of a mind to try to show him how to use Task Manager.
Thanks,
-h
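For anyone who'd rather not open Task Manager each time, here is a minimal sketch of the kind of wrapper launcher mentioned above, written against the Win32 API. The install path is an assumption; adjust it for your machine. This is an illustration, not an official Celestia tool.
[code]
// Minimal sketch of a launcher that starts Celestia pinned to one core.
// Assumption: the path below matches your Celestia install.
#include <windows.h>

int main()
{
    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi;
    char cmd[] = "\"C:\\Program Files\\Celestia\\celestia.exe\"";

    // Start the process suspended so we can set affinity before it runs.
    if (CreateProcessA(NULL, cmd, NULL, NULL, FALSE,
                       CREATE_SUSPENDED, NULL, NULL, &si, &pi))
    {
        SetProcessAffinityMask(pi.hProcess, 1); // bit 0 = first core only
        ResumeThread(pi.hThread);
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    return 0;
}
[/code]
A desktop shortcut to the compiled launcher would give a one-double-click, affinity-restricted Celestia, no Task Manager required.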
haxsaw99, this is probably a stupid question, but... do you have any way to check your CPU temperatures?
If yes, have you checked them when using Celestia?
I've noticed that, even without a dual core CPU, when I use Celestia the temperature rises rapidly, much more than during normal operations.
Just my two cents, who knows...
Bye
Andrea
"Something is always better than nothing!"
HP Omen 15-DC1040nl- Intel® Core i7 9750H, 2.6/4.5 GHz- 1TB PCIe NVMe M.2 SSD+ 1TB SATA 6 SSD- 32GB SDRAM DDR4 2666 MHz- Nvidia GeForce GTX 1660 Ti 6 GB-WIN 11 PRO
How long is "for some time"? 3 minutes? 3 hours?
What version of Windows are you running? XP Home? XP Pro? SP2? 32bit O/S? 64 bit?
What version of Celestia are you running?
To first approximation, I don't understand how there could be clock skew between cores: they're synchronized to the same clock.
I haven't seen the effect you describe, but I may not have been running Celestia long enough.
At home I run Celestia on a hyperthreaded P4 (two simulated CPUs).
At work I sometimes run Celestia on a Dell system that has an Intel Core2Duo processor. Both systems run Celestia v1.4.1 or from CVS under 32bit Win XP Pro.
p.s.
As Andrea points out, if your CPUs are overheating, very strange things could happen. Celestia stresses every part of your system much more than most other programs can -- CPU, graphics, memory and disk.
Selden
[quote="ANDREA"]haxsaw99, probably this is a stupid question, but... have you the possibility to check your CPUs temperatures? [/quote]
An interesting idea; my 3 year old is on that machine at the moment, so I'll have to wait a bit to check. Are you suggesting that heat aggravates clock skew, or are you just thinking that heat may be causing abnormal behavior in general? The latter seems unlikely, since lengthy sessions of Half Life 2 don't wind up displaying any odd behaviors, and no one would argue that that bit of software doesn't exercise the system significantly. I have no knowledge one way or the other whether heat can aggravate clock skew; do you?
I suppose a reasonable test would be to boot the system, avoid doing much that would generate significant heat for about as long as it usually takes Celestia to start running poorly, and then run Celestia for the first time and see if it acts flaky.
My son just got off from using the system (and running Celestia). The sensor says the system is running around 42C, which seems reasonable. Would you agree?
I believe that on Windows, Celestia uses 'QueryPerformanceFrequency' and 'QueryPerformanceCounter' to determine the elapsed time between frames. According to Microsoft:
While QueryPerformanceCounter and QueryPerformanceFrequency typically adjust for multiple processors, bugs in the BIOS or drivers may result in these routines returning different values as the thread moves from one processor to another.
Perhaps this might be the problem. Of course, I'm a Mac guy, so I really don't know.
- Hank
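For reference, here is a minimal sketch of QueryPerformanceCounter-based frame timing with the workaround Microsoft documents for drifting counters: pin the calling thread to one core around the counter read. This illustrates the mechanism Hank describes; it is not Celestia's actual code.
[code]
// Sketch of QPC frame timing. SetThreadAffinityMask forces successive
// counter reads onto the same core, sidestepping BIOS/driver bugs that
// let the per-core counters drift apart.
#include <windows.h>

double elapsedSeconds()
{
    static LARGE_INTEGER freq = { 0 };
    if (freq.QuadPart == 0)
        QueryPerformanceFrequency(&freq);

    // Pin this thread to core 0 just for the read, then restore.
    DWORD_PTR oldMask = SetThreadAffinityMask(GetCurrentThread(), 1);
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    SetThreadAffinityMask(GetCurrentThread(), oldMask);

    return (double) now.QuadPart / (double) freq.QuadPart;
}
[/code]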
selden wrote:How long is "for some time"? 3 minutes? 3 hours?
What version of Windows are you running? XP Home? XP Pro? SP2? 32bit O/S? 64 bit?
What version of Celestia are you running?
To first approximation, I don't understand how there could be clock skew between cores: they're synchronized to the same clock.
If the system has been up for a day or so I begin to see the effects I described.
I'm running 32 bit XP Pro with SP2.
The hardware is:
Athlon 64 X2 3800+ (runs around 42C when using Celestia)
1 GB DDR RAM
GeForce 6800 GS card w/256MB ram (runs around 44C when using Celestia)
nForce 4 mobo
My Celestia version is 1.4.1.
Yeah, I'd have expected the clocks to be in sync as well, but that was the only immediate explanation that came to mind. If they are locked, then skew is obviously not the problem. However, setting affinity to a single CPU does fix the issue, which is another item that suggests heat isn't the culprit. Also, the problem continues to manifest if you let the system idle after it starts acting up, which would give it a chance to cool down.
hank wrote:I believe that on Windows, Celestia uses 'QueryPerformanceFrequency' and 'QueryPerformanceCounter' to determine the elapsed time between frames. According to Microsoft: "While QueryPerformanceCounter and QueryPerformanceFrequency typically adjust for multiple processors, bugs in the BIOS or drivers may result in these routines returning different values as the thread moves from one processor to another." Perhaps this might be the problem. Of course, I'm a Mac guy, so I really don't know.
- Hank
This sounds pretty promising. I wondered about BIOS issues, but I'm not in any position to render an opinion. Nice sleuthing.
haxsaw99 wrote:
hank wrote:I believe that on Windows, Celestia uses 'QueryPerformanceFrequency' and 'QueryPerformanceCounter' to determine the elapsed time between frames. According to Microsoft: "While QueryPerformanceCounter and QueryPerformanceFrequency typically adjust for multiple processors, bugs in the BIOS or drivers may result in these routines returning different values as the thread moves from one processor to another." Perhaps this might be the problem. Of course, I'm a Mac guy, so I really don't know.
- Hank
This sounds pretty promising. I wondered about BIOS issues, but I'm not in any position to render an opinion. Nice sleuthing.
Hank is on the right track. I had this exact same problem when I upgraded to a dual core Athlon system a couple years ago. It wasn't just Celestia that was affected--pretty much every game uses QueryPerformanceCounter, and exhibited the same stuttering as Celestia. I fixed the problem by installing the AMD processor driver. It's the second to last item on this page:
http://www.amd.com/us-en/Processors/Tec ... 18,00.html
--Chris
"The latter seems unlikely since lengthy sessions of Half Life 2 don't wind up displaying any odd behaviors, and no one would argue that that bit of software doesn't exercise the system significantly." - haxsaw99
I'm a big DoD fan myself.
I've heard from a number of people who've had problems with HL and HL:Source playing DoD. Things like mad speed and erratic display, very similar to the problems you mention with Celestia. The solution I recall hearing was restricting the game to just one core.
*edited*
I remembered this old post on the topic. It definitely applies.
http://hardware.gotfrag.com/portal/forums/thread/182114/
chris wrote:
Hank is on the right track. I had this exact same problem when I upgraded to a dual core Athlon system a couple years ago. It wasn't just Celestia that was affected--pretty much every game uses QueryPerformanceCounter, and exhibited the same stuttering as Celestia. I fixed the problem by installing the AMD processor driver. It's the second to last item on this page:
http://www.amd.com/us-en/Processors/Tec ... 18,00.html
--Chris
Thanks for this. I installed this last night and so far things are working smoothly. Fingers crossed.
Well, it turns out this driver upgrade has been a mixed blessing. Celestia works great; however, all sounds played by the computer are now screwed up. Whether downloading a video stream, playing a previously recorded MP3, or even playing system sounds, every sound comes out at the right speed but about an octave low, and kind of watery to boot. It looks like I have more research to do if I want to be able to listen to my MP3 collection ever again. Windows is such a delight.
You need the newest lame codec to do away with the watery mp3 sounds... Chances are something kicked out your audio codecs and installing the updated Lame codec for mp3 should fix this... Or you can get ffdshow and give that a whirl... ffdshow is mainly for divx avi files but I believe it also supports mp3...
http://lame.sourceforge.net/download.php
I'm trying to teach the cavemen how to play scrabble; it's uphill work. The only word they know is Uhh and they don't know how to spell it!
Rassilon wrote:You need the newest lame codec to do away with the watery mp3 sounds... Chances are something kicked out your audio codecs and installing the updated Lame codec for mp3 should fix this... Or you can get ffdshow and give that a whirl... ffdshow is mainly for divx avi files but I believe it also supports mp3...
http://lame.sourceforge.net/download.php
That turned out to be the final problem. I eventually went over to my mobo manufacturer's site and ran their update tool, which downloaded new audio drivers. It *finally* cleared it all up.
Something similar happened to me. However, I made a serious mistake: the hyperthreading item in the NVIDIA control panel is about dual GPUs, NOT CPUs. Since I only have ONE GPU, Celestia began to work erratically. The odd thing is that before I did this upgrade, this item hadn't appeared yet. It seems the Nvidia Control Panel mistakes a CPU for a GPU, and that misled me into the error... very interesting...
There is an AMD optimized driver, but I don't know if it really helps to improve performance.
I get a reasonable boost on Mars2, but on nothing else, under Celestia.
Nvidia Monitor is ONLY recognizing ONE CPU. Should I install the AMD optimized driver?
Gotcha! I have the proof, and now I can see that Celestia is NOT optimized for dual or multicore, at least for now. When I activate threaded optimization in the Nvidia Control Panel, I was astonished to see Mars 2 dip to a frame rate between 19-22 fps; when I disable threaded optimization, the frame rate jumps to something between 30-35 fps. It appears that instead of treating the processor as a 3.8 GHz-equivalent dual core, the program treats it like some sort of 5.7 GHz single core. It's very odd. 5.7 GHz is the performance rating on the site "Will you run this game?"
Also, I have a sort of HL2-type game named Sin Episodes: Emergence, and the frame rate goes from 28 fps to 133 fps or even more. Is this the mad behavior?
danielj wrote:There is an AMD optimized driver, but I don't know if it really helps to improve performance.
Without the AMD Dual-core Optimizer, I would run into issues with the cores becoming out of sync with each other. After a few hours, this would be able to be seen in Celestia as shaky/jittery movement in all the 3d graphics. I would definitely recommend installing it.
As for Celestia being dual-core optimized, the evidence I've seen points to yes. As I posted in your other Dual Core thread:
Johaen wrote:I just did a quick test in Celestia, by floating along with the ISS over Earth. When running with both cores, I get around 42-45 FPS. When I use the Set Affinity option in the Task Manager to allow Celestia to only run on one core, the FPS dropped to around 30 fps. After re-enabling the second core, the FPS once again jumped up to 42 or so fps. So yes, it seems to have some dual core optimization.
AMD Athlon X2 4400+; 2GB OCZ Platinum RAM; 320GB SATA HDD; NVidia EVGA GeForce 7900GT KO, PCI-e, 512MB, ForceWare ver. 163.71; Razer Barracuda AC-1 7.1 Gaming Soundcard; Abit AN8 32X motherboard; 600 watt Kingwin Mach1 PSU; Windows XP Media Center SP2;
Perhaps mine is a stupid question, but maybe that game turns the graphics card's vsync off. When vsync is off, the frame rate rises. Another question: where do all the textures loaded into video memory end up after a long run across the universe? Are you sure the GPU RAM clears them out over time, or does it saturate? Is anyone out there watching some sort of resource meter, apart from the fps, during the trip?
Never at rest.
Massimo
Johaen wrote:
danielj wrote:There is an AMD optimized driver, but I don't know if it really helps to improve performance.
Without the AMD Dual-core Optimizer, I would run into issues with the cores becoming out of sync with each other. After a few hours, this would be able to be seen in Celestia as shaky/jittery movement in all the 3d graphics. I would definitely recommend installing it.
As for Celestia being dual-core optimized, the evidence I've seen points to yes. As I posted in your other Dual Core thread:
Celestia has no specific optimizations for dual core systems. However, the NVIDIA graphics drivers are able to take advantage of multiple processor cores, so you'll see a performance boost whenever Celestia is bottlenecked by the driver.
Typically, it's GPU performance that determines the maximum frame rate in Celestia, so I haven't spent much time thinking about dual core optimizations. There are some cases where it could help, such as loading textures and models on extra processor cores. Other CPU-limited situations occur when dealing with very large numbers of solar system objects. But, performance tuning effort is probably better spent on optimizing for a single processor, as there are some fairly simple (or at least simple when compared to spawning threads) things to do to speed up processing of solar system objects.
--Chris
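As an illustration of the background-loading idea Chris mentions, here is a hypothetical sketch that hands texture decoding to a worker thread while the render thread keeps drawing. Texture and loadTextureFromDisk are stand-ins, not Celestia's API.
[code]
// Hypothetical sketch: decode a texture on a worker thread.
// Texture and loadTextureFromDisk are stand-ins, not Celestia's API.
#include <future>
#include <string>

struct Texture { /* pixel data, dimensions, ... */ };

Texture loadTextureFromDisk(const std::string& path)
{
    Texture t;
    (void) path; // stand-in: real code would decode the image file here
    return t;
}

int main()
{
    // Kick off the load on another core; rendering continues meanwhile.
    std::future<Texture> pending =
        std::async(std::launch::async, loadTextureFromDisk, "mars.dds");

    // ... keep rendering frames with the old or low-res texture ...

    // Swap in the new texture once the worker finishes. The OpenGL upload
    // itself must still happen on the thread that owns the GL context.
    Texture hires = pending.get();
    (void) hires;
    return 0;
}
[/code]
Note that the worker only decodes; handing the pixels to OpenGL stays on the render thread, which keeps the GL context single-threaded.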
chris wrote:...But, performance tuning effort is probably better spent on optimizing for a single processor, as there are some fairly simple (or at least simple when compared to spawning threads) things to do to speed up processing of solar system objects....
Chris, I must trust you here, but let's not forget that multicore will very soon become the norm, not only dual but quad cores and more. It would be a pity not to take advantage of such extra power...