Dual-core CPUs

The place to discuss creating, porting and modifying Celestia's source code.
Topic author
guest jo
Posts: 126
Joined: 01.04.2004
With us: 20 years 7 months

Dual-core CPUs

Post #1by guest jo » 08.03.2006, 11:30

Hello,
Does Celestia take any advantage of dual-core CPUs?
Would an AMD 64 4000+ be slower than an AMD X2 3800+?
(Assuming a graphics card that is not the bottleneck.)
_______________________________________
Celestia 1.6.0 @1600x1200x32; GF8800Ultra; Q6600@3,2GHz;WinXPx64;

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #2by selden » 08.03.2006, 11:57

Unfortunately, Celestia is currently single-threaded. It can't take advantage of more than one CPU.

However, having two CPUs means that your system will be able to run another program at full speed with excellent interactive response, even while Celestia is using 100% of one of the CPUs.

The difference in performance between the two models you list wouldn't be noticeable in Celestia itself, only in additional jobs running at the same time.
Selden

Topic author
guest jo
Posts: 126
Joined: 01.04.2004
With us: 20 years 7 months

Post #3by guest jo » 08.03.2006, 15:57

Thank you very much Selden, that answers my question completely.
At the moment I won't upgrade my board and CPU because I've just bought the fast AGP card, a 7800 GS. It works wonderfully at 1600x1200 compared to my old GF4 Ti4200 :D

But I also see that CPU power is a bottleneck in some cases (talking only about Celestia here), and if I buy a dual-core in the future it will be only for Celestia, not for any parallel processes. Those are not so important for me :lol:
_______________________________________
Celestia 1.6.0 @1600x1200x32; GF8800Ultra; Q6600@3,2GHz;WinXPx64;

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #4by selden » 08.03.2006, 16:09

Unfortunately, individual CPU performance seems to be almost stalled at the moment, although somewhat faster CPUs will be developed. Right now you'd only get about a 50% speed improvement over what you have, which may not be worth the amount of money you'd have to invest.

You will get a substantial performance enhancement by increasing your memory to 1GB, however, since your system won't have to page when running with large Addons and surface textures.

Depending on what kind of disk you have, you may be able to double its I/O performance. 15K RPM SCSI disks are available, although for a significant amount of money.
Selden

Topic author
guest jo
Posts: 126
Joined: 01.04.2004
With us: 20 years 7 months

Post #5by guest jo » 08.03.2006, 17:49

You are right. I had exactly the same thoughts about CPU power. My conclusion: no upgrade until performance is at least 100% above my current hardware.
As for RAM, I have 768 MB. I had to return 2 GB of RAM to the shop because it didn't work here (it definitely should work on my board), but maybe I'll try again.
SCSI I never had in mind; that might be interesting (I had only thought about RAID 0, but not with my current board and case).

Thanks again.
_______________________________________
Celestia 1.6.0 @1600x1200x32; GF8800Ultra; Q6600@3,2GHz;WinXPx64;

selden
Developer
Posts: 10192
Joined: 04.09.2002
With us: 22 years 2 months
Location: NY, USA

Post #6by selden » 08.03.2006, 18:01

When upgrading memory, it usually turns out that you have to make sure that all of the memory modules are made by the same manufacturer. Even when they have identical specs, too often memory modules from different companies will not work together.
Selden

Topic author
guest jo
Posts: 126
Joined: 01.04.2004
With us: 20 years 7 months

Post #7by guest jo » 08.03.2006, 21:23

:? I even had what a tube-amplifier freak would call a "matched pair"
of 1 GB modules from one manufacturer, but my board seems to be known as a bit difficult. The only option is to take the whole thing to a dealer and try RAM until it works, but I would rather do it myself...
_______________________________________
Celestia 1.6.0 @1600x1200x32; GF8800Ultra; Q6600@3,2GHz;WinXPx64;

tech2000
Posts: 258
Joined: 14.02.2006
Age: 52
With us: 18 years 9 months
Location: Skepplanda, Sweden

Post #8by tech2000 » 19.03.2006, 19:30

Since dual-core CPUs are the future, will Celestia eventually become multithreaded?

tec
Posts: 51
Joined: 14.03.2006
With us: 18 years 8 months
Location: Huntsville, AL

CPU utilization

Post #9by tec » 23.05.2006, 15:35

I have a dual-core CPU. I watched my CPU utilization stay at 50% while my hand was off the mouse. I turned on the on-screen stats indicator and saw 119 fps. I feel that rendering the scene at 119 fps is overkill. I would like to see someone go into the Celestia code and throttle the rendering loop down to 30 Hz. They could implement a menu option for either a "screaming" mode or a fixed 30 Hz mode. I like having a screaming mode because I can play around with different graphical ideas and see how they affect performance. But I would be frustrated if I did not have the dual-core processor (which many people don't have).

A 30 Hz timer would be easy to implement. I am going to play around with this idea for my next Real-Time version.

Tim

t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 7 months
Location: Hamburg, Germany

Re: CPU utilization

Post #10by t00fri » 23.05.2006, 15:53

tec wrote:I have a dual-core CPU. I watched my CPU utilization stay at 50% while my hand was off the mouse. I turned on the on-screen stats indicator and saw 119 fps. I feel that rendering the scene at 119 fps is overkill. I would like to see someone go into the Celestia code and throttle the rendering loop down to 30 Hz. They could implement a menu option for either a "screaming" mode or a fixed 30 Hz mode. I like having a screaming mode because I can play around with different graphical ideas and see how they affect performance. But I would be frustrated if I did not have the dual-core processor (which many people don't have).

A 30 Hz timer would be easy to implement. I am going to play around with this idea for my next Real-Time version.

Tim


Of course it's overkill, since 119 fps is faster than the refresh rate of your screen (e.g. 85-100 Hz). But I thought by now we ALL knew what to do?

Just switch on "vsync" in your card's driver settings and the frame rate will be synchronized with your screen's refresh, i.e. fps <~ 85-100.
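
(For reference, vsync can also be requested from inside an OpenGL program via the WGL_EXT_swap_control extension. The following is only a rough sketch, assuming your driver exposes that extension and an OpenGL context is already current; it is not code from Celestia itself:)

// Rough sketch, not Celestia code: ask the driver for vsync via the
// WGL_EXT_swap_control extension (only available if the driver exposes it).
// Call this after the OpenGL rendering context has been created and made current.
#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void EnableVSync()
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC) wglGetProcAddress("wglSwapIntervalEXT");

    if (wglSwapIntervalEXT != NULL)
        wglSwapIntervalEXT(1);   // 1 = wait for one vertical retrace per buffer swap
}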

Bye Fridger

Johaen
Posts: 341
Joined: 14.01.2006
With us: 18 years 10 months
Location: IL, USA

Re: CPU utilization

Post #11by Johaen » 23.05.2006, 16:26

t00fri wrote:Of course it's overkill, since 119 fps is faster than the refresh rate of your screen (e.g. 85-100 Hz). But I thought by now we ALL knew what to do?

Just switch on "vsync" in your card's driver settings and the frame rate will be synchronized with your screen's refresh, i.e. fps <~ 85-100.

Bye Fridger

I still haven't gotten this V-Sync thing to work. I described my issues in the threads "Always 60Hz refresh rate in full-screen (solved)" and "celestia takes 100% CPU time :(".

Johaen wrote:Using my ATI Catalyst Control Center, I set my 3D Refresh Rate Override to Same as Desktop. This was the setting that seemed the closest to what you described. Yet, I still get the same results as before: anywhere from 100 to 400 fps, depending on what's going on on the screen, and what my PC is doing in the background. I also tried changing it to a few different settings, such as 60 Hz, 66 Hz, and 70 Hz. None of them seemed to make any difference.

Johaen wrote:I just tested it. Nothing's changed. It still always uses 100% of my CPU, and I still have framerates in the 300+ range. I'm really not sure what else to do. There is no V-Sync option from what I can see in the Driver Options. The 3D Refresh Rate Override seemed to be the one to use, but it doesn't seem to change anything.


I'm not really complaining about it; it's just that V-Sync might not be a viable option.

Topic author
guest jo
Posts: 126
Joined: 01.04.2004
With us: 20 years 7 months

Post #12by guest jo » 23.05.2006, 17:07

Could you make a screenshot of your driver panel where you have enabled vsync?
Maybe that will help me understand why it won't work for you... :P

(Hmm, do you have a CRT or a TFT?)
_______________________________________
Celestia 1.6.0 @1600x1200x32; GF8800Ultra; Q6600@3,2GHz;WinXPx64;

tec
Posts: 51
Joined: 14.03.2006
With us: 18 years 8 months
Location: Huntsville, AL

CPU Utilization

Post #13by tec » 23.05.2006, 18:52

Fridger,

You need to read the dialog again. I am talking about CPU utilization. The graphics card has nothing to do with Celestia taking up 99% of the CPU. Celestia is recalculating what goes down the graphics pipe for the same scene over and over while my hand is off the mouse. This is very inefficient. The Celestia process needs to sleep (wait) if no REDRAW, RESIZE, etc. WM_ messages occur. It also needs to sleep between 30 Hz cycles. That is why I suggested using a timer to check for these window manager messages. Theoretically, if this loop is done correctly, a Celestia user should only see about 15% CPU utilization for the process on average.
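
The general pattern I have in mind is something like this rough sketch of an idle-friendly Win32 main loop (not the actual Celestia code; RenderFrame is just a placeholder for Celestia's draw call):

#include <windows.h>

void RenderFrame(HWND hwnd);   // placeholder for Celestia's draw + SwapBuffers

void RunMainLoop(HWND hwnd)
{
    const DWORD framePeriodMs = 33;                      // ~30 Hz frame budget
    DWORD nextFrame = GetTickCount() + framePeriodMs;
    MSG msg;

    for (;;)
    {
        DWORD now = GetTickCount();
        DWORD timeout = (nextFrame > now) ? (nextFrame - now) : 0;

        // Sleep until a window message arrives or the frame period expires,
        // instead of spinning at 100% CPU.
        MsgWaitForMultipleObjects(0, NULL, FALSE, timeout, QS_ALLINPUT);

        while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
                return;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }

        if (GetTickCount() >= nextFrame)
        {
            RenderFrame(hwnd);
            nextFrame = GetTickCount() + framePeriodMs;
        }
    }
}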

Tim

t00fri
Developer
Posts: 8772
Joined: 29.03.2002
Age: 22
With us: 22 years 7 months
Location: Hamburg, Germany

Post #14by t00fri » 23.05.2006, 19:20

tec,

but we have discussed this topic to exhaustion, I think.

Here is Seldens /correct/ answer
from http://www.celestiaproject.net/forum/viewtopic ... 63&start=0
once more:

Selden wrote:Celestia uses 100% of the CPU if it cannot keep up with the system's screen refresh rate. Many people seem to like to set their display refresh rate at 75fps or faster. LCDs usually require 60fps.

If Celestia can refresh its window in less time than is needed by the refresh rate that you've configured your system to have, then Celestia will use less than 100% of the CPU.

Your options are to reduce your system's screen refresh rate, reduce the screen resolution, upgrade your system, or reduce the amount of recalculation Celestia needs to do. E.g. run Celestia in a smaller window or reduce the number of Addons that are loaded. A lot of CPU time is needed to calculate which stars or galaxies are obscured by foreground objects, so turning off Galaxy rendering can help. Also, backing away from a planet, so fewer background objects are hidden, will reduce this load. Reducing the magnitude limit, so fewer stars are drawn, also helps.

The current version of Celestia will use no CPU time when it's iconized.


Bye Fridger

Rassilon
Posts: 1887
Joined: 29.01.2002
With us: 22 years 9 months
Location: Altair

Post #15by Rassilon » 23.05.2006, 20:29

Actually I think what tec is referring to is in a way correct... Celestia does not return control to Windows in the message handler loop using the Sleep command or some similar procedure, and in turn it utilizes 100% of the CPU while handling its messages... If Celestia would return just a millisecond back to Windows, 'Sleep(1)', then you would no longer see 100% CPU usage in the task manager (or 50% on dual-core CPUs)... This is not in any way considered bad, since Celestia is trying to grab as many cycles as it can for best performance... The Sleep command will cause a bottleneck which, although seemingly minimal, will amount to something when larger masses of vertex data are being processed...

V-Sync only ties the swap rate to the monitor's refresh... I wouldn't think enabling that feature would solve this problem... Most gamers disable it to improve their framerate... Framerate is independent (or should be) of the message handler and its ability to return to Windows so that other processes get access to the CPU... It can, however, affect it... A low framerate is a sign that the application is taxing the CPU, RAM, graphics card GPU or VRAM...
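
As a rough illustration of the Sleep(1) idea (this is not Celestia's actual message pump, and RenderFrame is only a placeholder for the draw call):

#include <windows.h>

void RenderFrame();   // placeholder for Celestia's draw + SwapBuffers call

void RunMessagePump()
{
    MSG msg;
    for (;;)
    {
        // Drain any pending window messages first.
        while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
                return;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }

        RenderFrame();
        Sleep(1);   // give at least one scheduler slice back to Windows each pass
    }
}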
I'm trying to teach the cavemen how to play scrabble; it's uphill work. The only word they know is Uhh and they don't know how to spell it!

Johaen
Posts: 341
Joined: 14.01.2006
With us: 18 years 10 months
Location: IL, USA

Post #16by Johaen » 23.05.2006, 23:42

guest jo wrote:Could you make a screenshot of your driver panel where you have enabled vsync?
Maybe that will help me understand why it won't work for you... :P

(Hmm, do you have a CRT or a TFT?)

I'm using a CRT running at 60 Hz.

Here is a screenshot of the CATALYST Control Center, with the 3D Refresh Rate Override bordered in Red. Changing this setting doesn't seem to have any effect in Celestia.

Now, I've been poking around in the Control Center and I found a "Wait for vertical refresh" setting. I set that to Always On, as seen in this screenshot.

This setting seems to limit it close to my monitor's refresh rate. I get about 56-58 FPS with it turned on. But I'm still using 100% of my CPU. *sigh* :cry:

The current version of Celestia will use no CPU time when it's iconized.


What does iconized mean?

Topic author
guest jo
Posts: 126
Joined: 01.04.2004
With us: 20 years 7 months

Post #17by guest jo » 24.05.2006, 07:42

The first screenshot shows exactly what I suspected:
You did not enable vsync.

What you did in the second screenshot sounds better, but I'm confused by this quality slider...
Nvidia has the same slider for quality and performance, but vsync is not a matter of "quality vs. performance", it is a matter of "on vs. off", if you know what I mean.
Sorry, but I don't have an ATI card; let's wait for someone who knows. That can't be a big deal, guys...

P.S. Your refresh rate override is useful for running Celestia at 75 or 85 Hz if you don't want 60!! But "Same as desktop" won't work. You have to make a setting like I did in the pre-existing long resolution list:
1600x1200x32 ....................... Standard
changed to:
1600x1200x32.........................85Hz
_______________________________________
Celestia 1.6.0 @1600x1200x32; GF8800Ultra; Q6600@3,2GHz;WinXPx64;

tec
Posts: 51
Joined: 14.03.2006
With us: 18 years 8 months
Location: Huntsville, AL

CPU utilization

Post #18by tec » 24.05.2006, 14:24

Hello Again,
I looked at making the drawing loop more efficient. I placed a Sleep call in the window procedure and forced the frame rate to 30 Hz. This brought my CPU utilization down to 1%. I used the Windows Task Manager to verify this. I tested this code on 3 different computers and got the same results. One was a Xeon and the other two were P4s.

When you press the ` key, the frames per second will appear. This will be about 30 Hz, because the following code sets a timer to render at roughly that rate. You can change the DRAW_TIMER_MSEC macro if you want, but 30 Hz should work well for you. I had to set the timer to 40 Hz to get the 30 Hz frame rate I wanted; I am still trying to figure out why.

I am a little concerned about how the Sleep(int) call will behave on Linux. If you try this on Linux, please let us know how it works out. These changes will be in my next release of the RealTime Celestia.

Here are the changes I made to winmain.cpp. The TEC characters are my initials:

Add:

static UINT_PTR MainWindowProcTimer; // TEC - define timer #2 for the rendering
#define DRAW_TIMER_MSEC 0.025 // TEC - draw period in seconds (25 ms, i.e. a 40 Hz timer)

to the top of the file to make it global.

Add:

// TEC - set timer #2 to go off at 30Hz for the rendering functions
MainWindowProcTimer = SetTimer(hwnd, 2, (unsigned int)(DRAW_TIMER_MSEC*1000), NULL);

to the bottom of CreateOpenGLWindow()

Add:

// TEC - create a timer once to tick at the 30 Hz period
static Timer *timer = CreateTimer(); // TEC
// TEC - use a flag to make sure the previous frame is finished before rendering again
static int rendering = 0;

as the first lines in MainWindowProc

Take out the WM_PAINT message case statement. Comment out the entire case and let the default window procedure take care of it.

Add:

case WM_TIMER: // TEC
    if (wParam == 2) // TEC - make sure only timer 2 is handled
    {
        if (bReady && (rendering == 0))
        {
            // TEC - don't let the timer call this block unless I am done in here
            rendering = 1;
            timer->reset(); // reset timer to zero
            appCore->draw();
            SwapBuffers(deviceContext);
            ValidateRect(hWnd, NULL);

            // don't continue until the 30 Hz period has expired
            while (timer->getTime() < DRAW_TIMER_MSEC)
            {
                Sleep(2);
            }
            rendering = 0;
        }
    }
    break;



That's all. Let me know how yours works out.
Tim

