PC upgrade advice

grantx

Senior member
2,331 223
CB,

Before I do as you suggest, I copied the Services contents from the System Configuration Utility. Please see the attachment (1 page). There's got to be something here that I don't need: telephony, anything network-related (I don't run a network), etc.

Grant.
 

Attachments

  • System configuration utility.doc
    45.5 KB · Views: 211

Crap Buddist

Senior member
2,458 289
Hi Grant. When you get to the Startup tab and select Disable All, then reboot, your machine will start up, but it won't load all the junk. You might want to re-enable your antivirus: if you can see it in the startup list after you have disabled everything, just re-enable that one program so it loads at boot.

Again, you can always do the same thing in reverse and select Enable All; all the programs you disabled will still be sitting in the tab, waiting to be re-enabled...
 

Atilla

Legendary member
19,909 3,160
CB,

Before I do as you suggest, I copied the Services contents from the System Configuration Utility. Please see the attachment (1 page). There's got to be something here that I don't need: telephony, anything network-related (I don't run a network), etc.

Grant.

Suspects IMO

ATI HotKey Poller - Suspect, probably uses interrupts for keyboard requests
.NET Runtime Optimization - I only have this once. Do you have 2 diff versions?
NBService for Nero AG - NetBios very chatty MS nonsense

Start by disabling - unchecking them.
 

Crap Buddist

Senior member
2,458 289
The funny thing is that when you check Disable All and reboot, it reboots quicker because it loads less, so less is running. It makes you wonder what those 30 items were doing in the first place? :)
 

grantx

Senior member
2,331 223
DB, Atilla,

I ran an experiment. On the Excel sheet, I selected an area and ran a Find and Replace, changing one cell reference (20001 to 20000). With DB’s Disable All, and again with the subsequent Enable All, the time in both cases for changing 67,280 cells was 45 seconds. CPU usage was c. 50% until completion.
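A quick back-of-envelope check on those figures (the numbers come straight from the post above; the script is just illustrative arithmetic):

```python
# Throughput of the Find and Replace test: 67,280 cells changed in
# 45 seconds, the same with startup items disabled or enabled.
cells = 67_280
seconds = 45
rate = cells / seconds
print(f"{rate:.0f} cells/second")  # roughly 1495 cells/second
```

Since the rate is identical either way, the startup items clearly were not the bottleneck in this test.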

Automatic updates is already disabled. There is a printer connected but it’s turned off.

Atilla,

Just read your latest. Will do as you suggest (after I’ve eaten). And presumably there will be no harm if I maintain DB’s Disable All.

.NET Runtime Optimization: my 2 to your 1. What’s it for anyway?

Seeing as the only 64-bit prog I have (but don't use) is Excel 07, maybe a change to 32-bit Windows also?

Grant.
 

Atilla

Legendary member
19,909 3,160
DB, Atilla,

I ran an experiment. On the Excel sheet, I selected an area and ran a Find and Replace, changing one cell reference (20001 to 20000). With DB’s Disable All, and again with the subsequent Enable All, the time in both cases for changing 67,280 cells was 45 seconds. CPU usage was c. 50% until completion.

Automatic updates is already disabled. There is a printer connected but it’s turned off.

Atilla,

Just read your latest. Will do as you suggest (after I’ve eaten). And presumably there will be no harm if I maintain DB’s Disable All.

.NET Runtime Optimization: my 2 to your 1. What’s it for anyway?
Grant.

I'm not too sure myself. It's Microsoft's new vision for an integrated application platform, I think: the .NET Framework lets developers build apps that all talk to each other with "simplicity" or something... It provides support for XML and various other languages.

Any developers here?

Just curious as to why you have two different versions I guess. Use the most recent one.
 

dcraig1

Experienced member
1,604 243
I know SFA about Excel, but I will post a comment about CPU Level 2 cache sizes and system performance that may be useful.

The benchmarking of CPUs on popular PC hardware sites can be very misleading regarding the importance of cache size/architecture to application performance. In particular applications the effect can be really dramatic, but the inexperience of the reviewers means a myth endures that cache is not all that important. Sometimes that is not true at all.

Applications where cache/memory performance dominates overall system performance are those that make a lot of random accesses over a very large memory space. Such apps generate a lot of cache misses.

It occurs to me that very large spreadsheets MAY at least partially fall into this category.

To illustrate my point: several years ago a company I was contracting to benchmarked a Java app on a Pentium III and a Sun SPARC workstation. In small memory spaces the PIII easily won out. In large memory spaces the SPARC was much better, all because of cache size/memory architecture.

With some of my own Java code recently I've seen the same thing. Large memory space with many random accesses: a small-cache AMD Sempron compared to an Intel C2D overclocked to 3.2 GHz. The Intel is 30 times faster on the same code. This almost unbelievable difference has a lot to do with the far superior cache on the Intel. Of course, better AMD processors with a bigger cache than the Sempron's would fare a lot better in the comparison, but the message is crystal clear.
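The access-pattern effect described above can be sketched in a few lines of Python. This is a rough illustration only, not a rigorous benchmark: CPython's boxed integers blunt the effect compared with C or Java, and the exact ratio depends on the CPU and its cache sizes. The idea is simply that visiting the same data sequentially versus in a random order does identical work but produces very different cache behaviour.

```python
import random
import time

N = 2_000_000
data = list(range(N))

def timed_sum(order):
    """Sum every element of `data`, visiting indices in the given order."""
    start = time.perf_counter()
    total = 0
    for i in order:
        total += data[i]
    return total, time.perf_counter() - start

sequential = list(range(N))   # cache-friendly: neighbouring accesses
shuffled = list(range(N))
random.shuffle(shuffled)      # cache-hostile: scattered accesses

seq_total, seq_time = timed_sum(sequential)
rand_total, rand_time = timed_sum(shuffled)

# Both orders touch exactly the same elements once each, so the work is
# identical; any time difference comes purely from the access pattern.
assert seq_total == rand_total == N * (N - 1) // 2
print(f"sequential: {seq_time:.3f}s  random: {rand_time:.3f}s")
```

On a cache-starved CPU the gap between the two timings widens; rewriting the same loop in a compiled language makes it far more dramatic.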
 

dcraig1

Experienced member
1,604 243
I'm not too sure myself. It's Microsoft's new vision for an integrated application platform, I think: the .NET Framework lets developers build apps that all talk to each other with "simplicity" or something... It provides support for XML and various other languages.

Any developers here?

Just curious as to why you have two different versions I guess. Use the most recent one.

I don't know much about .NET (I try to avoid MS) but I would guess that this optimization is very similar to Sun's Java HotSpot optimization. To put it briefly, HotSpot optimizes compiled Java code on the fly as it detects execution patterns. The argument goes that by doing so, optimizations are possible that cannot be done with statically optimized code (as produced, for example, by a C++ compiler). HotSpot is very effective for a lot of apps. I would think that, in general, you would want .NET runtime optimization turned on if you are running .NET applications. The setting should not affect non-.NET apps.
 

Atilla

Legendary member
19,909 3,160
I know SFA about Excel, but I will post a comment about CPU Level 2 cache sizes and system performance that may be useful.

The benchmarking of CPUs on popular PC hardware sites can be very misleading regarding the importance of cache size/architecture to application performance. In particular applications the effect can be really dramatic, but the inexperience of the reviewers means a myth endures that cache is not all that important. Sometimes that is not true at all.

Applications where cache/memory performance dominates overall system performance are those that make a lot of random accesses over a very large memory space. Such apps generate a lot of cache misses.

It occurs to me that very large spreadsheets MAY at least partially fall into this category.

To illustrate my point: several years ago a company I was contracting to benchmarked a Java app on a Pentium III and a Sun SPARC workstation. In small memory spaces the PIII easily won out. In large memory spaces the SPARC was much better, all because of cache size/memory architecture.

With some of my own Java code recently I've seen the same thing. Large memory space with many random accesses: a small-cache AMD Sempron compared to an Intel C2D overclocked to 3.2 GHz. The Intel is 30 times faster on the same code. This almost unbelievable difference has a lot to do with the far superior cache on the Intel. Of course, better AMD processors with a bigger cache than the Sempron's would fare a lot better in the comparison, but the message is crystal clear.

I concur with your post, dcraig. The same goes for SQL databases and, more specifically for me, disaster recovery: disk-to-tape writes are so much faster with cache. I would strongly recommend a good SCSI card with as much battery-backed write-cache RAM as possible, coupled with RAID 10 for fast disk reads/writes.

Coming back to the spreadsheet, I did notice memory usage was at 1.7 GB with 2 GB available. Obviously the spreadsheet and its formulas are having a big impact on RAM utilisation. Level 2 cache would help. Can't remember if the PC had any?

This is a tough one.

Best to use elimination to see where problem lies: software, spreadsheet or hardware... In that order.

Good luck Grantx
 

grantx

Senior member
2,331 223
Atilla,

Multi-core cache configuration is 2 x 512 KB.

If cache is the key here, then the compatible upgrades are the 3.2 GHz part with 2 x 1 MB cache or the 3.0 GHz Opteron, also with 2 x 1 MB cache.

The question then is, will I see much improvement?

Grant.
 

Atilla

Legendary member
19,909 3,160
Atilla,

Multi-core cache configuration is 2 x 512 KB.

If cache is the key here, then the compatible upgrades are the 3.2 GHz part with 2 x 1 MB cache or the 3.0 GHz Opteron, also with 2 x 1 MB cache.

The question then is, will I see much improvement?

Grant.

Cache RAM always helps imo.

Whether that is the problem or if it will be significant improvement or not I can't say.

Fwiw, an exercise I carried out in the past: adding cache to our SCSI controller on an SQL server provided a 15% improvement, speeding up reports that were taking 5-6 hours to run.

I appreciate we are talking about cache on the CPU, but if formulas and calculations are the bottleneck, I can only assume adding more cache will speed it up. By how much would be worth benchmarking, as no doubt you will.

Tom's Hardware is a pretty good site for PC hardware stuff.

Sorry, not much help with placing your head on the guillotine... :cheesy:

Good luck.
 

A Dashing Blade

Experienced member
1,373 170
This is an example of a cell:
=IF($AZ6<MIN($B$4:$B$20001),"",VLOOKUP($AZ6,$B$4:$AP$20000,22))
In this particular case this is from a table 800 rows x 40 columns (32,000 cells).

Try resizing the references from 20,000 rows down to the minimum required (800?) in all cells. You can do this trivially using the "Replace" functionality.
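For example, with data starting at row 4, 800 rows would end at row 803. Keeping the same one-row offset as the formula quoted above, the cell would then look something like this (803/804 are placeholders for whatever the actual extent turns out to be):
=IF($AZ6<MIN($B$4:$B$804),"",VLOOKUP($AZ6,$B$4:$AP$803,22))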
 

A Dashing Blade

Experienced member
1,373 170
Forgot about this little "feature" of Excel.
Try minimising the XL window, then restoring it. Any change in CPU usage?
 

grantx

Senior member
2,331 223
DB,

No difference.

As a temporary measure I've reduced the amount of data.

In the final analysis, Excel is not an efficient tool for number crunching. Therefore, realistically, one should look at the alternatives (another day). We've all spent enough time on this, and I doubt we'll get the quick fix I naively expected, so I reckon we should kill it. Thank you to everyone who contributed.

Grant.
 
 