
Random Questions That Need Answers

Discussion in 'BBS Hangout' started by Lil Pun, Jun 23, 2006.

  1. Lil Pun

    Lil Pun Member

    Joined:
    Oct 6, 1999
    Messages:
    34,143
    Likes Received:
    1,038
    What is that liquid that is always settling on top of my peanut butter in the jar?
     
  2. RC Cola

    RC Cola Member

    Joined:
    Jun 11, 2002
    Messages:
    11,504
    Likes Received:
    1,347
    I'm not sure that is completely accurate either, though. I believe some applications have already gotten close to 2x the performance (~90% scaling or more) going dual-core, or even quad-core (compared to dual-core, of course). I know AnandTech did some server benchmarks that showed this:
    http://www.anandtech.com/IT/showdoc.aspx?i=2291&p=15
    (note that the link includes tables, which make more sense combined with the quote above)

    And in this article, they compare percentage gains among other applications:
    http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2377&p=1
    (note that these gains are actually a bit disappointing and could be much improved I guess, but they already offer performance gains of ~20% or more)

    Now I guess that should be somewhat expected since those applications make decent-to-good use of multi-core and multi-threaded processors. Single-threaded applications wouldn't get much, if any, improvements. However, considering how we might have a 100-core processor in the future, developers will probably need to change those to multi-threaded/multi-core applications (unless we only want to utilize 1/100th of the potential power that the CPU could give us). Here's a good read about this type of stuff:
    http://www.gotw.ca/publications/concurrency-ddj.htm
    Here's also a small excerpt from that article about how there has been multi-threaded coding for a while now:
    (BTW, this article is by Herb Sutter)
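
    To make his point a little more concrete, here's a minimal sketch (modern C++; the function name and the 50/50 split are mine, not Sutter's) of the kind of change he's describing: taking an independent, CPU-bound loop and splitting it across two threads so each core gets half the work:

    #include <cstddef>
    #include <numeric>
    #include <thread>
    #include <vector>

    // Hypothetical example: summing a large array on two cores instead of one.
    long long parallel_sum(const std::vector<int>& data) {
        long long front = 0, back = 0;
        std::size_t mid = data.size() / 2;

        // Each thread works on its own half, so no locking is needed.
        std::thread worker([&] {
            front = std::accumulate(data.begin(), data.begin() + mid, 0LL);
        });
        back = std::accumulate(data.begin() + mid, data.end(), 0LL);
        worker.join();

        return front + back;
    }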

    In terms of gaming, it will probably be a while before developers can efficiently utilize dual/multi-core processors, although, as you said, the CPU probably won't be the limiting factor in most games unless something changes. Perhaps an increase in the use of physics, procedurally generated geometry, better AI and animation, etc., as well as better utilization of the power that graphics cards offer (higher efficiency through unified shaders? Better use of SLI/Crossfire and dual-core GPUs to increase performance by ~50% or more?), will cause CPU processing power to become more important in games.

    I don't believe it is possible to get 2x the performance with a dual-core processor (or 4x for a quad-core, and so on), but I do believe that you can get some worthwhile performance gains (>20%), assuming that developers are able to make the transition to creating multi-threaded applications.

    I don't have a dog or a cat.
     
  3. swilkins

    swilkins Member

    Joined:
    Mar 5, 2003
    Messages:
    7,115
    Likes Received:
    11
    Peanut oil
     
  4. Dubious

    Dubious Member

    Joined:
    Jun 18, 2001
    Messages:
    18,318
    Likes Received:
    5,090
    You park in the driveway because that is where you drive up to your house; if you had more money, you would drive up the driveway and park in the parking court or garage.

    You drive on a parkway because after WWII, when the nation was becoming dependent on automobiles and suburbs, utopian planners sought to turn driving into a pastoral experience, like moving through a park. The name is still parkway, but you can't see the park for the billboards.
     
  5. Rashmon

    Rashmon Member

    Joined:
    Jun 2, 2000
    Messages:
    21,206
    Likes Received:
    18,210
    If crime fighters fight crime and fire fighters fight fire, what do freedom fighters fight?
     
  6. Kyrodis

    Kyrodis Member

    Joined:
    Dec 11, 2002
    Messages:
    1,336
    Likes Received:
    22
    You're kidding, right? When was the last time you had to do table population and truncation on a database in your home? Like I said, for EVERYDAY applications that you plan on using, there is no way you'll see a huge performance increase in the application itself. You WILL, however, see overall performance increases if you multitask a lot of those applications at once.

    Have you done any significant software development before or are you just reading stuff on Anandtech? If you're a professional software developer, then I'll gladly eat crow and shut up. :)

    I'm not a hardcore programmer by any means, but I've done enough software engineering at work to think that most of the applications we use every day will not benefit from new multithreaded coding as much as the tech junkies think it will.

    To be completely serious, what kind of multithreaded coding do you expect in Microsoft Word? Outlook? Internet Explorer? Nothing in there is processor hungry.

    Example (and I'm generalizing a great deal here):
    Old code - When you try to open two Word files at the same time, it does one after the other.
    New code - When you open two Word files at the same time, it does them at the same time. OH WOW! That's like a 0.0005% increase in performance!

    Games, I can totally understand why they'd see larger performance increases simply because there are a lot of computations that need to be done. However (and again I'm speaking from my engineering gut here), I believe most of those computations are used to render video...which really has nothing to do with the CPU. Even if you get to a point where you can do all the non-video calculations at the same time really fast, the video rendering is still your bottleneck on how fast the game will run.

    The place you'll see the most improvement is CPU hungry workstation/server applications. For example...hmm...oh yeah, database population and truncation maybe? ;)

    EDIT: And sorry for hijacking this thread and turning it into a techie nerdfest...
     
    #46 Kyrodis, Jun 23, 2006
    Last edited: Jun 23, 2006
  7. Agent27

    Agent27 Member

    Joined:
    Aug 1, 2003
    Messages:
    355
    Likes Received:
    0
    If Pro is the opposite of Con, then what is the opposite of Progress?
     
  8. RC Cola

    RC Cola Member

    Joined:
    Jun 11, 2002
    Messages:
    11,504
    Likes Received:
    1,347
    That's true. I just thought I'd throw in the server benchmarks since they showed the biggest improvements when using multiple cores. Still, that wasn't the only application to see decent performance gains; video editing software and 3D animation programs also saw gains of at least 10%-20% (and up to 40% or even 89% depending on the situation). Those still probably aren't that common in the home, but they're some of the more demanding applications I can think of that one might find there.

    I just barely started my degree in computer science, so outside of some basic coding, I have basically no experience in developing software at this point in my life. However, I have read quite a bit of information on this topic. Everything I have read seems to point to there being decent performance gains with multi-core processors, assuming the right approach is taken to developing software to take advantage of them. If nothing else, I'd appreciate any online resources you could provide that indicate otherwise.

    In applications like Word, Outlook, Explorer, etc., I agree that there will be little to no performance gain (at least nothing that is noticeable, like you pointed out). Perhaps I misunderstood your original post if all you were trying to prove is that we won't be seeing much of an improvement in these applications with multi-core processors. But we could have a 500 GHz single-core processor that wouldn't really offer much of a performance boost for those applications either; in fact, if we restricted ourselves to those applications, there would be no need to push our processors much past 1.5 GHz (if even that high). Multi-core processors wouldn't help here, much like any other improvement in processor performance wouldn't, since those applications are not CPU bound (mostly I/O bound, I believe, since we can't type or read at a billion words per second). Perhaps I missed something, but I don't recall anyone predicting that we'd see massive performance gains in those types of applications...in fact, I'm not even sure how to measure performance gains in them. What would a ~20% gain in performance allow us to do in Word?

    Now things like video editing, animation software, computer graphics and rendering, gaming, etc. could all use a little more processor power than the power needed to run Microsoft Word.

    It is true that games are often not bottlenecked by the CPU; however, that might not always be the case in the near future. Developers have had multi-core processors to work with for only a couple of years, and with the 360 and PS3 offering powerful multi-core solutions, they will be trying new things to take advantage of the hardware given to them. I think something like physics is still in its infancy (Half-Life 2 physics seem almost laughable to me compared to some of the physics demos I've seen). Calculating the physics for a large number of objects or even liquids/gases would require a large amount of processing power, and would benefit from having a thread to itself (see the sketch below). Ageia, one of the physics leaders, even thought it would be necessary to provide a dedicated physics processing unit, although I imagine it will fail to catch on with consumers (leaving physics calculations to the CPU). I should note that Ageia mentioned that the Cell processor could do the same tasks as their processor; assuming PS3 developers take advantage of this (great physics effects), PC developers might end up pushing a lot of physics via multi-core processors.
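
    Just to illustrate the "thread to itself" idea, here's a minimal sketch in C++ (PhysicsWorld and step() are made-up stand-ins, not Ageia's or anyone's actual API): the simulation spins on its own core while the main thread handles everything else:

    #include <atomic>
    #include <thread>

    // Hypothetical stand-in for a physics engine; a real one would
    // integrate rigid bodies, resolve collisions, etc.
    struct PhysicsWorld { void step(float dt) { (void)dt; /* ... */ } };

    int main() {
        PhysicsWorld world;
        std::atomic<bool> running{true};

        // Dedicated physics thread: steps the simulation at a fixed 60 Hz
        // timestep (a real engine would pace this loop instead of spinning).
        std::thread physics([&] {
            while (running.load()) {
                world.step(1.0f / 60.0f);
            }
        });

        // ... the main thread would run input, AI, and rendering here ...

        running.store(false);
        physics.join();
    }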

    The same could be said for AI (some developers have already given a whole core to AI and gotten great results). I'm not so sure about animation, but I do know about a great technology called Endorphin by Natural Motion. Using a combination of physics, AI, and other algorithms, it is able to create realistic animations that put the Madden football animations to shame. It requires a lot of processor power though, which explains why, even though it was used in a PS2 game (Tekken 5), the animation was nothing like the animation shown in some of their demos and other professional work (movies like Troy and Poseidon). For some, realistic animation is almost as important as, if not more important than, realistic graphics (we need both in order to avoid the "uncanny valley").

    Things like procedurally generated content and decompression (more for consoles with limited RAM and/or disc space) will also benefit from receiving threads of their own. These are all mostly CPU-related tasks that can't really be done on graphics cards; they've seen little use in the past, and they should see massive improvements (better AI/physics/animation) in the near future. A game like Unreal Tournament 2007 will use the latest physics technology (even supporting the physics processor), new AI, and procedurally generated content (I'm not sure about their animation). I'm not sure if that will be enough to make games more CPU-intensive, but it would be interesting to see how single-core processors compare to dual-core processors, especially at framerates of around 60 FPS or so.

    Well, Lil Pun has gotten quite a bit of info on dual-core processors. And I've learned what a dog's breath smells like, as well as the breath of a cat. :)
     
  9. Mr. Brightside

    Joined:
    Mar 27, 2005
    Messages:
    18,964
    Likes Received:
    2,147
    ^^in for mat ion over load. de struct.
     
  10. Kyrodis

    Kyrodis Member

    Joined:
    Dec 11, 2002
    Messages:
    1,336
    Likes Received:
    22
    I'm only going to address gaming technology here, since we both agree that server/workstation type applications are extremely CPU hungry and will see large performance increases. As for online resources...I can't point you to any. All my arguments are based on two engineering degrees (admittedly received a long while ago), personal experience in the workplace, and some common sense.

    And again, I'm not actually saying there won't be performance increases. It's just that you seem completely convinced that these "magical code changes" in the future will somehow solve all problems. Software only runs as fast as the hardware it's on. No amount of code changes will make things run faster than the hardware bottlenecks in place allow.

    Uh...what? If current games are NOT bottlenecked by CPUs, what difference does it make even if we optimize the code to take advantage of multi-core processing? Forget all the online articles please. Stop spewing names of new technologies as if they actually mean anything. Just use common sense and think about how bottlenecks work.

    Example (and I'm just picking random numbers here to illustrate my point):
    -A graphics card can do 80 operations a second.
    -A single CPU can do 100 operations a second.

    The CPU is not the bottleneck here. The game will only run as fast as the graphics card can process stuff. It'll run at 80 operations a second.

    -A graphics card can do 80 operations a second.
    -Two CPUs that can each do 100 operations a second.

    If we find a way to optimize the game code to take advantage of the dual CPUs, we can potentially have 200 non-video operations done every second, right? However, the graphics card can still only do 80 operations a second. No matter how fast we do other types of calculations, we can still only play the game at an 80 ops/second speed. Hence, NO PERFORMANCE INCREASES. Do you understand what I'm saying now?
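
    In code form, the whole bottleneck argument is just a min() over the two sides (a sketch in C++, with the same made-up numbers as above):

    #include <algorithm>
    #include <cstdio>

    // The speed you actually see is capped by the slower side.
    int effective_ops(int gpu_ops, int cpu_ops) {
        return std::min(gpu_ops, cpu_ops);
    }

    int main() {
        std::printf("1 core : %d ops/sec\n", effective_ops(80, 100));  // 80
        std::printf("2 cores: %d ops/sec\n", effective_ops(80, 200));  // still 80
    }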

    Now as it turns out, on current machines...the CPU IS the bottleneck (especially at lower resolutions). So, yes changing game code to take advantage of multi-cpu computers WILL improve performance.

    See, I'm not actually disagreeing with you per se, but you really need to understand that the so-called "code optimization" people are always talking about isn't going to magically make things significantly faster. There are plenty of limiting factors. First of all, only so much of the work can be split into threads that run at once. Sometimes you still need to do things serially. For example, we can't wash and dry a load of clothes at the same time, right? Even though there are two separate machines, you have to do one thing after the other.
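
    There's a standard formula behind the washer/dryer point, known as Amdahl's law: if only a fraction p of the work can be done in parallel across n cores, the best overall speedup is 1 / ((1 - p) + p/n). A quick sketch with made-up numbers:

    #include <cstdio>

    // Amdahl's law: the serial fraction of the work limits the speedup,
    // no matter how many cores you throw at the parallel part.
    double amdahl(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main() {
        // Even with 80% of the work parallelized, 2 cores give ~1.67x, not 2x...
        std::printf("p=0.8, n=2:   %.2fx\n", amdahl(0.8, 2));
        // ...and 100 cores top out around ~4.8x, because the serial 20% dominates.
        std::printf("p=0.8, n=100: %.2fx\n", amdahl(0.8, 100));
    }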

    Second is the future video card bottleneck. Even if there were a way to change code to do 100 non-video calculations at once on a 100-core CPU, will the game run 100 times faster? Yes, but only if the video card can keep up. If it can't, then we'll still only be playing games as fast as the video card can render.
     
    #50 Kyrodis, Jun 24, 2006
    Last edited: Jun 24, 2006
  11. RC Cola

    RC Cola Member

    Joined:
    Jun 11, 2002
    Messages:
    11,504
    Likes Received:
    1,347
    Out of curiosity, what is your take on the non-server/workstation-type applications that are also CPU hungry and have seen large performance increases? AFAIK, there have been a number of applications that have benefited from multiple cores/threads. Maybe not as everyday as Word, but more so than server/workstation-type applications.

    Again, I agree that there won't be any magical code that would make something like Microsoft Word see a performance increase. Anything that is bound by something other than the CPU probably won't see much gain from code optimized for multi-core systems. Of course, as I mentioned earlier, those applications probably won't see any gains from any CPU improvements (something like Word probably stopped seeing gains a long time ago). A 100 GHz processor probably runs Word the same way as a 1.5 GHz processor, just like a 100-core processor would probably run Word the same way as a single-core processor.

    On the other hand, there are several important applications that require large amounts of CPU power. They either already take advantage of multi-core systems or might see improvements if optimized for them.

    Taking existing games and optimizing their code wouldn't really help. No disagreement here.

    I know you were just picking random numbers, but I'm kind of having a difficult time accepting them since, AFAIK, graphics cards can do WAY more operations per second than CPUs...of course, those operations are much simpler. I'm not sure if this changes things, but I'm going to try to illustrate your point by displaying the FPS a game runs at while maxing out a graphics card or CPU. So something like this:
    Graphics Card - maxed out at 80 FPS
    Single-core CPU - maxed out at 100 FPS
    As in your case, the graphics card is the bottleneck. Increasing the CPU performance won't change anything, and obviously changing the graphics card will have an impact on the performance. Then:
    Graphics Card - maxed out at 80 FPS
    Dual-core CPU - One core maxes out at 100 FPS, the other is left idle.

    Again, the same as in your example, optimizing code for the CPU (200 FPS or whatever) won't make the game perform better than 80 FPS. Hopefully this illustrates the point you were trying to make. Sorry about changing your example, but this works better in my head, especially since I see more examples using these types of numbers.

    Again, I don't disagree with you at all so far. But I don't think that future games will just try to optimize code that was written for single-core machines. Rather than making code run faster, they'll make the processor do more things at the same speed. Let's say that in the above example, a game maxes out a dual-core processor at 200 FPS. Would a developer really want that much of a disparity between CPU and GPU performance? Rather than keeping the game maxing out the CPU at 200 FPS, maybe the developer will want to put it back at 100 FPS. So they'll push more physics, improve the AI, improve the animation, etc., until eventually the dual-core CPU maxes out at 100 FPS (both cores working together). So we have this:
    Graphics Card: 80 FPS
    Dual-core CPU: 100 FPS

    Again, the graphics card is the bottleneck. But what would happen if that was a single-core CPU? Maybe something like this:
    Graphics Card: 80 FPS
    Single-core CPU: 50 FPS

    Now the CPU is the bottleneck. These numbers probably don't work out as such (dual-core running single-core code 2x as fast and single-core running dual-core code half as fast), but maybe you get the point I'm trying to make.
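
    Here's a rough sketch of that "do more at the same speed" idea (Object and update_object are hypothetical, purely for illustration): instead of making one frame finish faster, a bigger batch of objects gets sliced across the cores so each frame still fits the same time budget:

    #include <cstddef>
    #include <thread>
    #include <vector>

    struct Object { float x = 0.0f, v = 1.0f; };

    void update_object(Object& o, float dt) { o.x += o.v * dt; }

    // Split one frame's worth of object updates across the given cores.
    void update_all(std::vector<Object>& objs, unsigned cores, float dt) {
        std::vector<std::thread> workers;
        std::size_t chunk = objs.size() / cores;

        for (unsigned c = 0; c < cores; ++c) {
            std::size_t begin = c * chunk;
            std::size_t end = (c + 1 == cores) ? objs.size() : begin + chunk;
            // Each core updates its own non-overlapping slice.
            workers.emplace_back([&objs, begin, end, dt] {
                for (std::size_t i = begin; i < end; ++i)
                    update_object(objs[i], dt);
            });
        }
        for (auto& w : workers) w.join();
    }

    int main() {
        // Twice the cores: same frame budget, roughly twice the objects.
        std::vector<Object> objs(200000);
        update_all(objs, 2, 1.0f / 60.0f);
    }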

    I'm not sure I follow you here, except for the lower resolutions part (lower resolutions would allow the graphics card to push more FPS while keeping the CPU FPS roughly the same).

    I don't think we are disagreeing either. To be honest, I'm not sure what exactly to expect from games that "take advantage of multi-core CPUs." My main reason for posting in this thread was to point out that there were a number of other applications that were already seeing performance boosts from going dual core (>20%). As I've said in the past, games will probably see some of the smallest improvements compared to other applications, although whether that is a worthwhile improvement is up for debate I guess. But until some of those games come out, I won't really know for sure. I'd like to put this away until then (maybe until UT 2007 comes out).

    I believe Herb Sutter mentioned this in the link I gave earlier. I believe his analogy was that you couldn't have 9 women produce a baby in 1 month; however, you could get 9 babies after 9 months. He also talks about which code you can multi-thread and which you can't. This is obviously a big concern in a number of applications, especially games. Game developers are some of the best programmers out there IMO, so I'd like to see what type of solutions they can come up with.
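
    The latency/throughput split in that analogy can be put in numbers (a trivial sketch, but it captures the distinction):

    #include <cstdio>

    // Sutter's 9-women point: adding workers raises throughput (results per
    // unit of time) but never shortens the latency of one serial result.
    int main() {
        const int latency_months = 9;  // fixed, no matter how many workers
        for (int workers = 1; workers <= 9; workers += 4) {
            std::printf("%d worker(s): first result in %d months, "
                        "%d result(s) every %d months\n",
                        workers, latency_months, workers, latency_months);
        }
    }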

    I kind of touched on this earlier, but if there is that much of a gap between CPU and GPU bottlenecks, then developers should have the CPU do more. Using my earlier example, that would be 100 cores capable of 100 FPS each (theoretical max of 10,000 FPS?). That's a lot of power to be missing out on. And AFAIK, we don't have perfect physics, perfect animation, perfect AI, etc., so there are some things to use that power on. I'm sure video cards will be the bottlenecks for a while (unless we see some sort of merger between the two), but hopefully we'll see more than 1% utilization of the CPU as well.
     
  12. CrazyJoeDavola

    Joined:
    Apr 30, 2003
    Messages:
    2,328
    Likes Received:
    3,083
    why is it that when you take your left shoe off, your right one is left?
     
  13. Kyrodis

    Kyrodis Member

    Joined:
    Dec 11, 2002
    Messages:
    1,336
    Likes Received:
    22
    RC Cola, let me see if I get what you're saying...

    You're arguing that future games will take advantage of dual-CPU solutions in a way that lets them perform more complicated physics/AI/etc. computations, which will improve the overall feel of the game.

    You're not saying that the speed at which each pixel is rendered in the game itself will be significantly faster as a direct result of the parallel calculations. Moreover, taking these more complicated parallel computations and attempting to run them on a single-CPU machine is where we'll see a disparity in performance.

    If that's what you're saying, then we're in complete agreement and there's no need to clog this thread with any more tech talk. ;)

    As for my opinions on non-"everyday" applications? Well, engineering CAD/simulation tools and art or video encoding/processing/rendering tools usually use a ton of CPU time. Altering these tools to perform non-dependent threads in parallel would therefore increase the performance of these tools a great deal.
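
    A minimal sketch of that "non-dependent threads in parallel" idea (encode_frame is a made-up stand-in, and it pretends frames are fully independent, which real codecs usually aren't): each unit of work becomes its own task, and they all run concurrently:

    #include <future>
    #include <vector>

    // Hypothetical CPU-bound work item; imagine an expensive encode here.
    int encode_frame(int frame_index) {
        return frame_index;
    }

    int main() {
        std::vector<std::future<int>> jobs;
        for (int f = 0; f < 8; ++f)
            jobs.push_back(std::async(std::launch::async, encode_frame, f));

        for (auto& j : jobs)
            j.get();  // wait for every frame; completion order may vary
    }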
     
  14. SwoLy-D

    SwoLy-D Member

    Joined:
    Jul 20, 2001
    Messages:
    37,618
    Likes Received:
    1,456
    Left is the past participle of the verb "to leave", so that's why it's left. You didn't know that?
    Progress has nothing to do with "pro" and "con", other than their first three letters being the same. The opposite of Progress is "regress".

    Yes, Lil Pun and JuanValdez... I am back... I was just watching a little soccer before I saw this thread. :D
     
  15. RC Cola

    RC Cola Member

    Joined:
    Jun 11, 2002
    Messages:
    11,504
    Likes Received:
    1,347
    Yeah, that's basically what I'm saying. We can leave this thread alone now. :)
     
