Here's a preliminary look at credit with a limited number of results...
On a Xeon X5675 24-core system...
Native PyPy v1.9 Application - Full Node (Cross Platform) v3.09 (mt) running on all 24 cores at a time averages 0.0077 credits per sec per core
Native PyPy v1.9 Application - Full Node (Cross Platform) v3.09 (mt) running on just 2 cores at a time averages 0.0577 credits per sec per core
On an i7-3770K 8-core system...
ACT-R cognitive modeling environment using Clozure Common Lisp (Cross Platform) v2.35 (sse2) (1 core per WU) averages 0.0071 credits per sec per core
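Working the numbers above through (a quick sketch using only the rates reported in this post):

```python
# Reported per-core credit rates from the results above (credits/sec/core)
rate_all_cores = 0.0077  # mt app saturating all 24 cores
rate_two_cores = 0.0577  # mt app limited to just 2 cores

# Per-core payout ratio: the 2-core runs pay roughly 7.5x per core
ratio = rate_two_cores / rate_all_cores

# Whole-machine throughput (credits/sec) for each measured configuration
total_all = 24 * rate_all_cores   # every core busy on mt WUs
total_two = 2 * rate_two_cores    # only 2 cores busy

print(f"per-core ratio: {ratio:.1f}x")
print(f"totals: {total_all:.4f} vs {total_two:.4f} credits/sec")
```

So per core the 2-core runs pay about 7.5x more, though the fully loaded box still produces more total credit per second; the sweet spot depends on how many concurrent 2-core WUs keep that higher per-core rate.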
The mt WUs running on just 2 cores pay very well, though I don't know if that's just temporary until CreditNew figures it out. Still, I'd set up your app_config.xml and grab all the mt WUs you can.
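For reference, capping the mt application at 2 threads looks something like the app_config.xml below, dropped in the project's directory. The app_name here is a placeholder; the real name is in your client_state.xml:

```xml
<app_config>
  <app_version>
    <!-- placeholder name: look up the actual app name in client_state.xml -->
    <app_name>pypy_fullnode</app_name>
    <plan_class>mt</plan_class>
    <!-- tell the client each mt WU uses 2 CPUs -->
    <avg_ncpus>2</avg_ncpus>
    <!-- --nthreads is a common mt command line; not every app honors it -->
    <cmdline>--nthreads 2</cmdline>
  </app_version>
</app_config>
```

Re-read config files (or restart the client) after editing for it to take effect.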
Mindmodeling
- scole of TSBT
- Boinc Major General
- Posts: 5983
- Joined: Mon Feb 03, 2014 2:38 pm
- Location: Goldsboro, (Eastern) North Carolina, USA
#12 Re: Mindmodeling
Need to get home first, then I can try this :) Still, at least I'm at work.
The best form of help from above is a sniper on the rooftop....
#13 Re: Mindmodeling
Work available. Haven't seen any of the multi-thread WUs though.
#14 Re: Mindmodeling
There are some Common Lisp sse2 ones available. Picked up a few.