Honestly, both together is a good option for right now. But we can't base the decision on current games, since current machines aren't powerful enough. That's why current real-time lighting has such hard edges: your computer can't continuously calculate the reflections and refractions that go on almost forever in real lighting. If it could, your opinions would change.
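To put a rough number on why those bounces "go on forever": in a classic Whitted-style ray tracer, every surface hit can spawn both a reflection ray and a refraction ray, so the work per pixel grows exponentially with bounce depth. A back-of-the-envelope sketch (the function name and cost model are my own illustration, not any engine's code):

```python
# Cost model: each bounce spawns up to `branching` new rays (one
# reflection + one refraction), so ray count grows exponentially.
def rays_traced(depth, branching=2):
    """Total rays for ONE pixel if every bounce spawns `branching` rays."""
    return sum(branching ** level for level in range(depth + 1))

print(rays_traced(1))   # 3 rays: primary + reflection + refraction
print(rays_traced(10))  # 2047 rays, and that's still just one pixel
```

Multiply that by every pixel on screen, sixty times a second, and it's clear why 2005 hardware settles for hard-edged approximations.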
I think the addition of a physics processor would greatly help, since it can be used for anything you want. (Yes, it can. You can even use your GPU to process sound if you write code for it.) So perhaps some companies would utilize it for lighting, others for physics, or a combination. Who knows? Maybe there will be a dual-core processor on graphics cards later, so one core can handle lighting while the other draws.
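The "process sound on your GPU" trick works because a fragment shader is just the same small function run independently on every texel of a texture. A conceptual sketch of that idea on the CPU (the names and data here are mine, purely for illustration):

```python
# GPGPU in miniature: pack arbitrary data (here, sound samples) into a
# 2D "texture", then apply one kernel to every texel independently,
# which is exactly what a fragment shader pass does on real hardware.
def run_fragment_kernel(texture, kernel):
    """Apply `kernel` to every texel, like a single shader pass."""
    return [[kernel(texel) for texel in row] for row in texture]

# "Sound processing on the GPU": halve the volume of a block of samples.
samples = [[0.8, -0.4, 0.2], [1.0, -1.0, 0.5]]
quieter = run_fragment_kernel(samples, lambda s: s * 0.5)
print(quieter)  # [[0.4, -0.2, 0.1], [0.5, -0.5, 0.25]]
```

Any task you can phrase as "same math on every element of a grid" fits this mold, which is why physics and lighting are both candidates.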
The first and foremost reason this stuff still isn't mainstream in games is the computer architectures that are out there and affordable. We need to get rid of this bottlenecked design first; then other things will be easier to do.
Re: pre-rendered or dynamic lighting?
Posted by Crono on Fri Jul 29th at 10:53pm 2005
Blame it on Microsoft, God does.
Re: pre-rendered or dynamic lighting?
Posted by KungFuSquirrel on Sat Jul 30th at 2:43am 2005
quote:
Perhaps that is a result of art direction and not technology
Yep. Shadows are only as dark as you let them go. If you have pure black shadows, it's not the technology, it's you not filling it in (for whatever reason). That applies to most lightmapped engines as well (particularly when you get to stuff developed without CSG editors) - radiosity is not the standard.
KungFuSquirrel
member
751 posts
345 snarkmarks
Registered: Aug 22nd 2001
Location: Austin TX
Occupation: Game Design, LightBox Interactive
www.button-masher.net
Re: pre-rendered or dynamic lighting?
Posted by SaintGreg on Sun Jul 31st at 3:40pm 2005
quote:
current real-time tech just looks "wrong"
You can't honestly tell me that it looks more wrong than HL2's shadows. Let me give you an overview of HL2's shadowing on models:
-Shadows only come from the sun, so pretty much all indoor lighting is completely borked
-You get shadows even when you're standing in the shade of a building, and shadows overlap each other: when you hold two doors over top of each other, the spot where they overlap is darker than the spots where they don't
-Models don't receive shadows, not even from themselves
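The overlapping-doors artifact comes from accumulating every caster's shadow separately instead of treating occlusion as a yes/no question. A toy illustration (my own sketch, not Source engine code), with light and shadow strengths on a 0 to 1 scale:

```python
# Each "door" casts a shadow of strength 0.5 on the same spot.
def naive_overlap(light, shadows):
    # Subtract EVERY overlapping shadow: the overlap gets double-dark.
    return max(0.0, light - sum(shadows))

def single_occlusion(light, shadows):
    # A point is either occluded or not: only the strongest shadow counts.
    return max(0.0, light - max(shadows, default=0.0))

two_doors = [0.5, 0.5]
print(naive_overlap(1.0, two_doors))     # 0.0 -> overlap goes pitch black
print(single_occlusion(1.0, two_doors))  # 0.5 -> same darkness as one door
```

Real engines fix this with stencil counting or shadow maps rather than a per-caster darkening pass, but the arithmetic above is the visible symptom either way.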
As for a physics processor, I think it'd be more lucrative just to perform those simulations on the GPU, considering that games now are just barely pushing the limits of top-of-the-line GPUs. Using the GPU for other work seems more logical than having yet another card residing in my computer. The only problem is that it becomes icky to formulate your task in GPU terms.
To get something to work, sometimes you just have to beat your head against the wall longer; the skin grows back, but the brick doesn't.
Source hates soup!
© Snarkpit.net 2001 - 2023, about us, donate, contact
Snarkpit v6.1.0 created this page in 0.022 seconds.