  TEK Review?
 
Max Palmer Message #84438, posted by Max at 08:54, 15/6/2002
Member
Posts: 66
Hi,

Is there going to be a review of TEK soon?
Are there any reviews out there on the web?

Cheers,

Max

 
Rob Simm Message #84439, posted by RobSimm at 14:19, 16/6/2002, in reply to message #84438
Member
Posts: 6
I'd like to see one too. I haven't bought it yet, and I'd like to know that all the bugs are ironed out before I spend my money.

Rob.

 
Jason Togneri Message #84440, posted by filecore at 06:17, 17/6/2002, in reply to message #84439

Posts: 3868
In that case, take your time. I believe most of the major bugs were sorted by the patch, but some copies (well, mine at least) don't seem to have been fixed by it. I'd wait for some really positive feedback if I were you.

That said, if you DO get positive feedback, then go out and buy it! The bugs won't let me progress far in the game, but it has a lot of potential and the others seem to think the later levels are really good!

Enjoy.

- Jason

 
Simon Challands Message #84441, posted by SimonC at 15:49, 17/6/2002, in reply to message #84440
Elite
Right on, Commander!

Posts: 398
If it still doesn't work, keep bombarding them with bug reports. I did that with Descent II (not an Artex game, I know), and eventually wound up with a pretty near stable game (for my machine, anyway).
 
Jan Klose Message #84442, posted by Jan Klose at 16:32, 18/6/2002, in reply to message #84441
AA refugee
Posts: 42
Wait, please *don't* bombard us with bug reports! We've done all that was possible so far, and for those who are *still* experiencing problems, a new patch will be released within the next few days (at first only to a selected number of people to see if it cures their problems before we release it to the public).

However, if you have new bugs or information that might help track them down, please don't hesitate to send them to us.

Cheers,
Jan

 
Simon Challands Message #84443, posted by SimonC at 20:06, 18/6/2002, in reply to message #84442
Elite
Right on, Commander!

Posts: 398
I didn't mean to encourage people to make a nuisance of themselves. I meant that letting the developer know what's up can get results.
 
Andrew Message #84444, posted by andreww at 10:30, 25/6/2002, in reply to message #84443
AA refugee
Posts: 555
ISTR Jan replied to this yesterday, and I think it's disappeared like my Icon Bar post.
 
Mark Quint Message #84445, posted by ToiletDuck at 10:42, 25/6/2002, in reply to message #84444
Ooh ducky! Quack Quack
Posts: 1016
frown

Do we have to blame Neo2 or Neo3 Mr Rich? shock

 
Jeffrey Lee Message #84446, posted by Phlamethrower at 16:33, 25/6/2002, in reply to message #84445
Hot Hot Hot Hot Hot Hot Hot Hot Hot Hot Hot Hot Hot stuff

Posts: 15100
I blame Neo2 - AA was a bit poorly yesterday.
 
Jonathan Atherton Message #84447, posted by Jon Atherton at 19:20, 3/7/2002, in reply to message #84446
AA refugee
Posts: 40
I've just polished off my review of Tek for Archive. Measuring 2324 words in length, it will most probably be the definitive review of Tek. I've been totally honest and have made criticisms where I feel they are required. I think it benefited me to buy it rather than have a free review copy, as then I can say whether I'm happy with the money I've spent.
 
Tony Haines Message #84448, posted by Loris at 09:48, 4/7/2002, in reply to message #84447
Ha ha, me mine, mwahahahaha
Posts: 1025
I showed it to my brother at the weekend. He commented that there were games like it [i.e. of that standard] on PCs 5 years ago.
I didn't entirely agree with him (and told him so). But I can see what he means.
The funny thing about TEK is that some features are really innovative, and others are a little backward.
Perhaps the first thing a PC user would notice is the graphics. This isn't a fault of the game, since nowadays it seems that most of the oomph in a PC is in the graphics card. Can't be helped really.
But perhaps the thing which hurts the most is the intelligence of the units. Mostly it is the way groups of them interact, and fail to pass each other sensibly. But also the pathfinding seems a little odd, with units often wandering off in the wrong direction, or just getting stuck and giving up. These are hard programming problems so it is understandable, but no less unfortunate.
But on the plus side, some of the great things are:
Atlases - creating units where they are built and then having to move them around just plain makes sense. And it seems to be a good play element too. 10/10 for this.
The construction list is good too - I expect some other RTS games have this as well.

On balance I'm very glad I got this, so it must be good. smile. Any chance of a campaign designer set, Jan?

 
Andrew Message #84449, posted by andreww at 11:13, 4/7/2002, in reply to message #84448
AA refugee
Posts: 555
The graphics look fantastic!
 
Tony Haines Message #84450, posted by Loris at 16:22, 4/7/2002, in reply to message #84449
Ha ha, me mine, mwahahahaha
Posts: 1025
The graphics look fantastic!
But you see they are 'only' 256 colours and not blended.
This really doesn't bother me (except for the invisible little bullets - which is really another point entirely smile) but the point is that this would be considered archaic on the PC... 'Cos they just splat it all to the graphics card and it does it for free.
As I said, I'm not complaining, I can just see why at first glance it looks out-of-date.
And yes, the actual graphics are very nice.

I notice you didn't defend the AI: "But it's my best mate"... "They've got lovely personalities"... wink

 
Jonathan Atherton Message #84451, posted by Jon Atherton at 17:26, 4/7/2002, in reply to message #84450
AA refugee
Posts: 40
Come on Toby, this is one graphics artist who has obviously worked really hard. Don't forget that similar PC titles have several graphics artists and several programmers. To me the graphics are one of the best parts of the game.
 
Max Palmer Message #84452, posted by Max at 20:17, 4/7/2002, in reply to message #84451
Member
Posts: 66
Come on Toby, this is one graphics artist who has obviously worked really hard. Don't forget that similar PC titles have several graphics artists and several programmers. To me the graphics are one of the best parts of the game.

Thanks - it did take me a while and ate up a fair bit of spare time. Working in 256 colours also added a lot more problems from the artistic point of view - and don't talk to me about the lack of alpha blending!

Oh - and all for nothing ;-)

Max

 
Tony Haines Message #84453, posted by Loris at 17:29, 5/7/2002, in reply to message #84452
Ha ha, me mine, mwahahahaha
Posts: 1025
Assuming you guys were talking to me...

I don't think I've successfully transmitted my points.
I specifically wasn't saying that the graphics were poor. You can find in my last two messages phrases like:

<re. noticing the graphics>

...This isn't a fault of the game. ...

...'only' 256 colours...

[notice the quotes - this implies that it is more than enough]
And yes, the actual graphics are very nice.
(Don't think this is damning with faint praise; I just didn't want to repeat the word fantastic - perhaps I should have said 'great'.)

What I meant was that regardless of the quality of the bitmaps (which is very high), the first impression (of a PC user) was of low colour resolution and no alpha-blending.
These are engine constraints - and as I implied, this is a compromise given the system's abilities.
[Virtually] all PC games nowadays assume access to very powerful graphics cards, which can draw textured, alpha-blended etc. polygons really well.
This means that they relieve the processor of the burden of drawing graphics, and alpha-blending can be done 'for free', whereas on the RiscPC it would slow the plotting down immensely. So the programmers of TEK presumably made the sensible decision to provide a higher frame-rate without high colour and alpha-blending. Nothing wrong with that, and no aspersions cast on the graphics artist (the reverse in fact - Max, I do appreciate your effort and don't hold you responsible for drawing a 1-pixel black bullet smile).
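To make that cost concrete, here is a minimal sketch in C of what a software alpha-blend does per pixel (my illustration, not TEK's actual code; 32-bit pixels are assumed for clarity, and an 8-bit palette mode would need a further palette lookup on top):

/* Minimal sketch of a software alpha-blend - not TEK's code.
 * Every pixel costs six multiplies and three divides, work that
 * a GPU's blending hardware performs 'for free'. */
#include <stdint.h>
#include <stddef.h>

void blend_span(uint32_t *dst, const uint32_t *src,
                size_t n, unsigned alpha /* 0..255 */)
{
    for (size_t i = 0; i < n; i++) {
        uint32_t s = src[i], d = dst[i];
        uint32_t r = ((s >> 16 & 0xFF) * alpha + (d >> 16 & 0xFF) * (255 - alpha)) / 255;
        uint32_t g = ((s >> 8 & 0xFF) * alpha + (d >> 8 & 0xFF) * (255 - alpha)) / 255;
        uint32_t b = ((s & 0xFF) * alpha + (d & 0xFF) * (255 - alpha)) / 255;
        dst[i] = (r << 16) | (g << 8) | b;
    }
}

Run that over every pixel of every blended sprite, every frame, and the frame-rate cost on a CPU is obvious.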

just a slight tangent here..
I'm personally not entirely swayed by graphics cards. While they do allow PCs to produce good-looking games, they do now all look a bit samey. Ideally I think there would be just a lot more power in the CPU (or multiple processing units), so that the programmer could use whatever plotting style was most appropriate, or use the processing for something else. I hope this is the way things will eventually go, so that games can have unique looks again.

Finally, to be brutally honest, I think you fellas could all read between the lines a little less and the lines themselves a little more.

and still no one defends the AI ;-)

 
Lee Johnston Message #84454, posted by johnstlr at 08:56, 6/7/2002, in reply to message #84453
Member
Posts: 193
just a slight tangent here..
I'm personally not entirely swayed by graphics cards. While they do allow PCs to produce good-looking games, they do now all look a bit samey. Ideally I think there would be just a lot more power in the CPU (or multiple processing units), so that the programmer could use whatever plotting style was most appropriate, or use the processing for something else. I hope this is the way things will eventually go, so that games can have unique looks again.

Continuing slightly off topic I think it's worth bearing in mind that a lot of this came about with the first generation of cards that could do transformation and lighting. At this point every game running on the same card would give similar effects because they were constrained by the parameters used by the card to construct the pipeline.

With the advent of vertex and pixel shaders in DirectX 8, developers now have pretty much full control of the rendering process, so, if they make use of it, this phenomenon should slowly disappear.

It's also worth remembering that many PC games are written using a relatively small set of game engines, because it's often too costly to create both an engine and an in-depth game.

 
Jan Klose Message #84455, posted by Jan Klose at 15:48, 6/7/2002, in reply to message #84454
AA refugee
Posts: 42
"and still noone defends the AI ;-)"

...because the AI surely has some shortcomings. But in my personal opinion they are mainly found in the "low level" part, i.e. how cleverly the units move around obstacles or how good they look when attacking a hostile unit. The units are not generally "dumb" (they help allies that are close by and under attack, and they find their way around almost any obstacle in almost any situation - unless they block each other, but they also try to move out of each other's way), but of course they could do things better in certain situations.
By the way, we are close to releasing "patch 3" which will probably also improve the AI a bit. smile

 
Andrew Message #84456, posted by andreww at 19:44, 6/7/2002, in reply to message #84455
AA refugee
Posts: 555

But you see they are 'only' 256 colours and not blended.
This really doesn't bother me (except for the invisible little bullets - which is really another point entirely smile) but the point is that this would be considered archaic on the PC... 'Cos they just splat it all to the graphics card and it does it for free.
As I said, I'm not complaining, I can just see why at first glance it looks out-of-date.
And yes, the actual graphics are very nice.

I didn't defend the Dalai Lama, but that doesn't mean I don't like him wink
Just addressing one point, that's all.
I didn't find anything wrong with the gfx at all, even the explosions...

 
Andrew Message #84457, posted by andreww at 19:46, 6/7/2002, in reply to message #84456
AA refugee
Posts: 555
"and still noone defends the AI ;-)"
...because the AI surely has some shortcomings. But in my personal opinion they are mainly found in the "low level" part, i.e. how clever do the units move around obstacles or how good do they look when attacking a hostile unit. The units are not generally "dumb" (they help allies that are close by and are attacked, the find their way around almost any obstacle in almost any situation - unless they block each other, but they also try to move out of each other's way), but of course they could do things better in certain situations.
By the way, we are close to releasing "patch 3" which will probably also improve the AI a bit. smile
I wish you could extend the range at which units help other units in trouble, Jan. It looks quite strange to have an enemy blasting my base and a floater close by doing nothing!
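Presumably something like the following decides who responds (a hypothetical C sketch - the names and the HELP_RADIUS value are my inventions, not Artex's code). If so, extending the range would be a one-constant change:

/* Hypothetical assist check - not TEK's actual code. Each unit
 * scans for allies that are under attack within HELP_RADIUS;
 * raising that single constant would make the floater join in. */
#include <math.h>

#define HELP_RADIUS 96.0f   /* illustrative value */

typedef struct { float x, y; int side; int under_attack; } Unit;

int should_assist(const Unit *me, const Unit *ally)
{
    float dx = ally->x - me->x, dy = ally->y - me->y;
    return ally->side == me->side
        && ally->under_attack
        && sqrtf(dx * dx + dy * dy) <= HELP_RADIUS;
}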
 
Tony Haines Message #84458, posted by Loris at 08:50, 8/7/2002, in reply to message #84457
Ha ha, me mine, mwahahahaha
Posts: 1025
Continuing slightly off topic I think it's worth bearing in mind that a lot of this came about with the first generation of cards that could do transformation and lighting. At this point every game running on the same card would give similar effects because they were constrained by the parameters used by the card to construct the pipeline.

With the advent of vertex and pixel shaders in DirectX 8, developers now have pretty much full control of the rendering process, so, if they make use of it, this phenomenon should slowly disappear.

This is all true... but it really only proves my point. To some extent new cards (with programmable features) fix the problem, but they will still not be perfectly adaptable - still based around drawing polygons, for example. OK, so some cards may allow certain types of curved surface etc. - but not other surfaces, or non-surfaces.
Link: - cycle of reincarnation
Since I've been swayed by the RISC argument I just champion the CPU. And I favour diversity in games, so I'm up for the most adaptable option.
and just think of all that processing power :-)

<snip>

 
Tony Haines Message #84459, posted by Loris at 09:07, 8/7/2002, in reply to message #84458
Ha ha, me mine, mwahahahaha
Posts: 1025
"and still noone defends the AI ;-)"
...because the AI surely has some shortcomings. But in my personal opinion they are mainly found in the "low level" part, i.e. how clever do the units move around obstacles or how good do they look when attacking a hostile unit. The units are not generally "dumb" (they help allies that are close by and are attacked, the find their way around almost any obstacle in almost any situation - unless they block each other, but they also try to move out of each other's way), but of course they could do things better in certain situations.
Jan, I hope you realise that I was not entirely serious.
I think you are right about the problem being low-level (i.e. pathfinding). This is a tricky problem, I can see (especially the blocking problem). I recently read an article somewhere on the web (possibly by Chris Crawford) where he suggested solving the problem by changing the parameters. (He stopped units getting stuck in bow-shaped lakes by removing the bow-shaped lakes.)
The 'unit-thrashing' (where a group of units go nowhere because they are all bouncing off each other, or an attacked unit cannot escape from an enemy unit nearby) could perhaps be solved in some similar manner. Perhaps you could allow units to pass much closer to each other, while maintaining a strong preference for some minimal distance apart?
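That suggestion can be sketched as a soft separation force (illustrative C only, not based on TEK's source): a small hard collision radius, plus a repulsion that ramps up as two units close inside a larger preferred distance.

/* Illustrative 'pass close, but prefer some distance' steering -
 * not TEK's code. Units are only hard-blocked inside HARD_RADIUS;
 * between that and PREFERRED_DIST they feel a graded push apart,
 * so tight groups can still slide past one another. */
#include <math.h>

#define HARD_RADIUS     4.0f   /* may not overlap closer than this */
#define PREFERRED_DIST 16.0f   /* repulsion fades to zero here */

typedef struct { float x, y; } Vec2;

Vec2 separation_force(Vec2 me, Vec2 other)
{
    Vec2 f = {0.0f, 0.0f};
    float dx = me.x - other.x, dy = me.y - other.y;
    float d = sqrtf(dx * dx + dy * dy);
    if (d > 1e-6f && d < PREFERRED_DIST) {
        /* strength 1.0 at HARD_RADIUS, fading to 0.0 at PREFERRED_DIST */
        float s = (PREFERRED_DIST - d) / (PREFERRED_DIST - HARD_RADIUS);
        if (s > 1.0f) s = 1.0f;
        f.x = (dx / d) * s;
        f.y = (dy / d) * s;
    }
    return f;
}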
The other thing is that while units may find their way around things, they often choose a somewhat non-optimal route smile. This is often frustrating when the units blunder into enemy fire, or don't arrive in time to rescue something. Is this fixable?

By the way, we are close to releasing "patch 3" which will probably also improve the AI a bit. smile
...There was a patch 2? smile
 
Lee Johnston Message #84460, posted by johnstlr at 13:39, 8/7/2002, in reply to message #84459
Member
Posts: 193
This is all true... but it really only proves my point. To some extent new cards (with programmable features) fix the problem, but they will still not be perfectly adaptable - still based around drawing polygons, for example. OK, so some cards may allow certain types of curved surface etc. - but not other surfaces, or non-surfaces.
Link: - cycle of reincarnation

It's an interesting argument, but it seems to miss a few points:

1) One of the main benefits of off-loading to the graphics card is that it executes in parallel. In general CPUs execute code serially, and there probably isn't a processor available that can both run the game and render the graphics.

2) Graphics processors (as with most specialist chips, i.e. FPAs and other DSPs) perform relatively restricted tasks. While it's true that FPAs tend to be placed on the same silicon as the main CPU, I'd bet that they're still developed as a separate block. The new GPUs may be programmable, but they're only programmable within the specification of something like DirectX 8. I'd be surprised if you could run a general-purpose app on one without considerable effort.


Since I've been swayed by the RISC argument I just champion the CPU. And I favour diversity in games, so I'm up for the most adaptable option.
and just think of all that processing power :-)

<snip>

But the RISC viewpoint is that you have a processor do one thing only, and do it very well, which is what the ARM does. The argument you linked to favours the Pentium approach with everything on one chip.

If anything, the RISC viewpoint should be championing the opposite of the link you gave: lots of small, dedicated processors that do relatively restricted tasks, run fast and consume little power. Ideally the processor cores should be embedded into devices to control dedicated processors which, if you think about it, is precisely what ARM does.

 
Tony Haines Message #84461, posted by Loris at 14:36, 8/7/2002, in reply to message #84460
Ha ha, me mine, mwahahahaha
Posts: 1025
Heh, maybe we should take this elsewhere?
Oh well, it is game-related

<snip my text>

It's an interesting argument, but it seems to miss a few points:

1) One of the main benefits of off-loading to the graphics card is that it executes in parallel. In general CPUs execute code serially, and there probably isn't a processor available that can both run the game and render the graphics.

To take your last point first, this is not a problem if the chipset is powerful enough. You seem to have missed my reference to multiple processing units running in parallel.
I may be mistaken, but for the same amount of silicon etc. dedicated to the GPU on a PC, one could get quite a few ARM chips.
So one could dedicate, say, one ARM chip to render each 1/16th of the screen, and have them all working in parallel - with one or more ARM chips doing the other game code.

So I'm basically suggesting replacing
CPU > GPU > screen
with
PU(s) > PU(s) > screen
If the processing units are generalised, and fairly symmetrical, then one could choose the optimum pathway for the task in hand.
Need more graphical grunt? Use a larger proportion of your chips for that. Want more processing power? Take them back.
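As a rough illustration of that PU(s) > PU(s) idea in C (my sketch only - POSIX threads stand in for the separate ARM chips): each worker owns one horizontal strip of the framebuffer and never touches the others, so no locking is needed.

/* Sketch of 'one processing unit per 1/16th of the screen'.
 * Threads stand in for separate ARM chips; each renders only
 * its own strip of the framebuffer. Illustration only. */
#include <pthread.h>
#include <stdint.h>

#define WIDTH   320
#define HEIGHT  256
#define N_UNITS 16

static uint8_t framebuffer[HEIGHT][WIDTH];      /* 256-colour screen */

typedef struct { int y0, y1; } Strip;

static void *render_strip(void *arg)
{
    Strip *s = arg;
    for (int y = s->y0; y < s->y1; y++)
        for (int x = 0; x < WIDTH; x++)
            framebuffer[y][x] = (uint8_t)(x ^ y);  /* stand-in for real drawing */
    return NULL;
}

void render_frame(void)
{
    pthread_t tid[N_UNITS];
    Strip strip[N_UNITS];
    for (int i = 0; i < N_UNITS; i++) {
        strip[i].y0 = i * (HEIGHT / N_UNITS);
        strip[i].y1 = (i + 1) * (HEIGHT / N_UNITS);
        pthread_create(&tid[i], NULL, render_strip, &strip[i]);
    }
    for (int i = 0; i < N_UNITS; i++)
        pthread_join(tid[i], NULL);
}

Reallocating 'grunt' then just means changing how many workers draw strips and how many run game code.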

2) Graphics processors (as with most specialist chips, i.e. FPAs and other DSPs) perform relatively restricted tasks. While it's true that FPAs tend to be placed on the same silicon as the main CPU, I'd bet that they're still developed as a separate block. The new GPUs may be programmable, but they're only programmable within the specification of something like DirectX 8. I'd be surprised if you could run a general-purpose app on one without considerable effort.
Um, this is sort of my point. If you are not using them to draw 10^6 textured polys, they are wasted. Want to do more processing? Well, tough, because you can't. If you had a lot of general chips in there instead (or one more powerful one), then the programmer would be free to allocate resources as necessary.


Since I've been swayed by the RISC argument I just champion the CPU. And I favour diversity in games, so I'm up for the most adaptable option.
and just think of all that processing power :-)
But the RISC viewpoint is that you have a processor do one thing only, and do it very well, which is what the ARM does. The argument you linked to favours the Pentium approach with everything on one chip.
It is?
I'd have said it was to have the minimum number of maximally useful instructions, which can therefore be hardware-optimised to go as fast as possible.
The CISC approach being to have an instruction for anything which might conceivably come in useful someday.

I'm surprised you say that about the ARM - it is Turing complete!

If anything, the RISC viewpoint should be championing the opposite of the link you gave: lots of small, dedicated processors that do relatively restricted tasks, run fast and consume little power. Ideally the processor cores should be embedded into devices to control dedicated processors which, if you think about it, is precisely what ARM does.

Hmm, well I disagree with your diagnosis..
So you want bitmaps? Take this hardware sprite controller.
Want to do NURBS? You need a NURBS chip.
Want transparency? Here is a blending module.
Like voxels? This'll do it.
Oh, you want a specialised version of voxels?
You'll need that as well.
...

OK so I'm being facetious, but the point is that there are many interesting ways of managing the visual display (and other outputs).
If you want every game to use (as currently) polys manipulated in the same sorts of ways, you'd be right that a specialised GPU would be the way to go (and it would be part of the RISC philosophy in that case too). But if, like me, you want to see games doing different things in different ways, then I suggest that the generalised processing route is best.

 
Andrew Message #84462, posted by andreww at 15:41, 8/7/2002, in reply to message #84461
AA refugee
Posts: 555
If anything, the RISC viewpoint should be championing the opposite of the link you gave: lots of small, dedicated processors that do relatively restricted tasks, run fast and consume little power. Ideally the processor cores should be embedded into devices to control dedicated processors which, if you think about it, is precisely what ARM does.

Shame the Hydra from Simtec was abandoned. A great, great shame, in fact.

This would enable multi-threading in games and the OS to some degree wouldn't it?

[Edited by andreww at 16:42, 8/7/2002]

 
Lee Johnston Message #84463, posted by johnstlr at 17:28, 8/7/2002, in reply to message #84462
Member
Posts: 193
Heh, maybe we should take this elsewhere?

Maybe wink


To take your last point first, this is not a problem if the chipset is powerful enough. You seem to have missed my reference to multiple processing units running in parallel.

I certainly did, and I apologise for missing that point smile


I may be mistaken, but for the same amount of silicon etc. dedicated to the GPU on a PC, one could get quite a few ARM chips.
So one could dedicate, say, one ARM chip to render each 1/16th of the screen, and have them all working in parallel - with one or more ARM chips doing the other game code.

True, although ultimate throughput would be limited by access contention on the memory itself.

Actually IIRC the PowerVR chipset does what you're suggesting, just in a dedicated GPU.


So I'm basically suggesting replacing
CPU > GPU > screen
with
PU(s) > PU(s) > screen
If the processing units are generalised, and fairly symmetrical, then one could choose the optimum pathway for the task in hand.
Need more graphical grunt? Use a larger proportion of your chips for that. Want more processing power? Take them back.

So you get the best of both worlds?


It is?
I'd have said it was to have the minimum number of maximally useful instructions, which can therefore be hardware-optimised to go as fast as possible.
The CISC approach being to have an instruction for anything which might conceivably come in useful someday.

I'm surprised you say that about the ARM - it is Turing complete!

Methinks I've got my wires crossed.


OK so I'm being facetious,

Actually I think you're raising a fair point about the way the computing industry has progressed.


If you want every game to use (as currently) polys manipulated in the same sorts of ways, you'd be right that a specialised GPU would be the way to go (and it would be part of the RISC philosophy in that case too). But if, like me, you want to see games doing different things in different ways, then I suggest that the generalised processing route is best.

I guess what it comes down to is that most people seem to want the kinds of games current GPUs are good for (or rather, developers write them precisely because they're the kinds of games the GPUs are good for).

I can't argue with your points - I'll think before opening my mouth next time wink

 
Lee Johnston Message #84464, posted by johnstlr at 17:31, 8/7/2002, in reply to message #84463
Member
Posts: 193
Shame the Hydra from Simtec was abandoned. A great, great shame, in fact.

This would enable multi-threading in games and the OS to some degree wouldn't it?

Hardware doesn't automatically allow multithreading - the OS has to be able to manage the processor resources available properly. Hydra would not have allowed multithreading in RISC OS without proper support from RISC OS itself. IIRC Simtec created an API for programming Hydra directly. They also created a multithreading module that was never completed (or at least the version I've got wasn't).

 
Andrew Message #84465, posted by andreww at 22:26, 8/7/2002, in reply to message #84464
AA refugee
Posts: 555
I thought you'd say that smile
Yes, I thought they'd released some kind of software which would allow the user to take advantage of the multi-threading.
Even for just games, it would be interesting to see what could have been done.
 
Tony Haines Message #84466, posted by Loris at 13:22, 9/7/2002, in reply to message #84465
Ha ha, me mine, mwahahahaha
Posts: 1025
This is quite a long posting. Sorry.

To take your last point first, this is not a problem if the chipset is powerful enough. You seem to have missed my reference to multiple processing units running in parallel.

I certainly did, and I apologise for missing that point smile

To be fair I didn't labour the point, and in many ways it would be preferable to have a single processing unit; however, it would need to be awfully fast.


I may be mistaken, but for the same amount of silicon etc. dedicated to the GPU on a PC, one could get quite a few ARM chips.
So one could dedicate, say, one ARM chip to render each 1/16th of the screen, and have them all working in parallel - with one or more ARM chips doing the other game code.

True, although ultimate throughput would be limited by access contention on the memory itself.

Memory access seems to be a bottleneck on many systems. I'm somewhat constrained by having very little knowledge of hardware issues, but I've been thinking about it. If you give each processor its own area of memory, it can use that without clashing with the others most of the time. There could also be shared areas of memory which any chip could use.
If you have multiple processors, apparently a major problem is getting them to communicate efficiently. Many supercomputers have lots of processors all wired up to each other, or have an internal messaging system. I'd hope neither of these would be necessary - I guess the idea is to try to get the chips working independently as far as possible. I think my cousin referred to this as trivial parallelism. Maybe this would be no bad thing - it might mean that programmers wouldn't worry too much about communication, and could optimise graphics etc. more in isolation.
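That partitioning might look like this (illustration only): each processor gets a private arena it can use without any synchronisation, and only the one shared region pays a coordination cost.

/* Sketch of per-processor private memory plus one shared region.
 * Private arenas need no locks ('trivial parallelism'); only the
 * shared area has to be coordinated. Illustration only. */
#include <pthread.h>

#define N_PROC       4
#define PRIVATE_SIZE (64 * 1024)
#define SHARED_SIZE  (16 * 1024)

static unsigned char private_mem[N_PROC][PRIVATE_SIZE]; /* one arena per chip */
static unsigned char shared_mem[SHARED_SIZE];
static pthread_mutex_t shared_lock = PTHREAD_MUTEX_INITIALIZER;

void worker_step(int id)
{
    private_mem[id][0]++;               /* free access: nobody else touches it */

    pthread_mutex_lock(&shared_lock);   /* shared region: must coordinate */
    shared_mem[0]++;
    pthread_mutex_unlock(&shared_lock);
}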

Actually IIRC the PowerVR chipset does what you're suggesting, just in a dedicated GPU.

<grin> I knew this - it is what made me think it was feasible.
It occurred to me that it could be a bit more flexible though - at the cost of adding a dedicated display chip (which would probably be necessary anyway). This could be programmed to request areas of memory in whatever pattern from the different processor units, and feed them to the screen as necessary. This might help multi-player modes, for instance.

...
If the processing units are generalised, and fairly symmetrical, then one could choose the optimum pathway for the task in hand.
...

So you get the best of both worlds?

Well, I suppose it is more that you can choose which world is most appropriate, but yes, that is the intention.

...

OK so I'm being facetious,

Actually I think you're raising a fair point about the way the computing industry has progressed.

I think the thing is that with electronics you can get more bang for your buck if you make something less generalised. So we've had a platform-game era (bitmap plotting), and now an FPS era (polygon plotting). So I'm actually rather hopeful that as processing power approaches the level sufficient to render 'anything' in real time, the processing pathway will become less 'dedicated' - i.e. the processor will take over.

Heh, maybe that would be a good way to replace the x86 architecture - use graphics cards as a Trojan horse.
First, create a graphics card which is basically a clump of ARM chips. This must be either or both:
a) better than other graphics cards at poly plotting.
b) cheaper than other graphics cards.
Then write some games which do things impossible on dedicated graphics cards.
Keep on developing the ARM card to give it more and more power.
Then, have games which just use the x86 to bootstrap and do menial tasks. Release a program which obviates the need to learn or use x86, and promote games which just use the ARMs.
Produce cheap computers/games machines without any legacy stuff, using cheap second-hand and old x86 chips.
Congratulations, the x86 is now a digital appendix, and can safely be removed.

I can dream, can't I?


I guess what it comes down to is that most people seem to want the kinds of games current GPUs are good for (or rather, developers write them precisely because they're the kinds of games the GPUs are good for).
Maybe both are true to some extent.
I remember in the early years of the Archimedes a lot of interesting games came out:
Zarch, Cataclysm, Spheres of Chaos...
and I remember thinking that now the power existed to make really innovative games - and that the future would only improve. But now on the PC it seems that first-person shooters and real-time strategy (and racing and sports games, which don't count smile) are the only genres of any significance. And they tend to aim for realism only.
I also remember, a couple of years ago or so, talking about the 'look' of Jet Set Radio with a housemate. He said that he thought it looked just awful. So it seems that for some people 'graphical realism' is everything.

I can't argue with your points - I'll think before opening my mouth next time wink
Not at all, I've enjoyed this conversation.
 
Lee Johnston Message #84467, posted by johnstlr at 15:43, 9/7/2002, in reply to message #84466
Member
Posts: 193
Memory access seems to be a bottleneck on many systems. I'm somewhat constrained by having very little knowledge of hardware issues, but I've been thinking about it. If you give each processor its own area of memory, it can use that without clashing with the others most of the time. There could also be shared areas of memory which any chip could use.

Then you'd be looking at optimising code so that it ran from each processor's local cache as much as possible.

Another possibility I've heard about is that there isn't a shared cache as such: processors claim parts of the address range, and when another processor wants to access a part of the address range it doesn't own, it has to be updated from the processor that does.
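A toy model of that ownership scheme (my sketch, loosely in the spirit of directory-based cache coherence - real protocols are far subtler):

/* Toy model of address-range ownership. Each block has one owner;
 * a processor touching a block it doesn't own must first pull the
 * owner's copy across. Sketch only - real coherence protocols
 * distinguish shared/exclusive states and much more. */
#include <string.h>

#define N_PROC     4
#define BLOCK_SIZE 64
#define N_BLOCKS   256

static unsigned char mem[N_PROC][N_BLOCKS][BLOCK_SIZE]; /* per-CPU copies */
static int owner[N_BLOCKS];                             /* who holds the valid copy */

unsigned char read_byte(int cpu, int block, int offset)
{
    if (owner[block] != cpu) {
        /* update our copy from the owning processor, then claim the block */
        memcpy(mem[cpu][block], mem[owner[block]][block], BLOCK_SIZE);
        owner[block] = cpu;
    }
    return mem[cpu][block][offset];
}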



If you have multiple processors, apparently a major problem is getting them to communicate efficiently. Many supercomputers have lots of processors all wired up to each other, or have an internal messaging system. I'd hope neither of these would be necessary - I guess the idea is to try to get the chips working independently as far as possible. I think my cousin referred to this as trivial parallelism. Maybe this would be no bad thing - it might mean that programmers wouldn't worry too much about communication, and could optimise graphics etc. more in isolation.

I remember doing a whole course on this at uni. Unfortunately I don't remember too much. There are many architectures in use, from straight buses that are snooped by the processors, which are fairly cheap but have limited scalability, up to complete crossbars where every processor has a direct link to every other processor. This is mega expensive. Another topology is a hypercube, because it makes it easy to route messages between processors.
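The hypercube's routing convenience is easy to show (a standard textbook scheme, sketched in C, not tied to any particular machine): node IDs are d-bit numbers, neighbours differ in exactly one bit, and each hop simply fixes the lowest differing bit.

/* Dimension-order routing on a hypercube. XOR of the two node IDs
 * gives the bits still to fix; flipping one per hop reaches the
 * destination in at most d hops, e.g. 000 -> 001 -> 101. */
int next_hop(int current, int dest)
{
    int diff = current ^ dest;       /* bits still to fix */
    if (diff == 0)
        return current;              /* already there */
    return current ^ (diff & -diff); /* flip lowest differing bit */
}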

The problem isn't that far removed from event-based architectures in software, either in a single process or across multiple processors (perhaps connected by networks). It just costs more in hardware.


<grin> I knew this - it is what made me think it was feasible.
It occurred to me that it could be a bit more flexible though - at the cost of adding a dedicated display chip (which would probably be necessary anyway). This could be programmed to request areas of memory in whatever pattern from the different processor units, and feed them to the screen as necessary. This might help multi-player modes, for instance.

Certainly on split-screen games you could be looking at dedicating a processor to a single screen "split", although I'm not sure if this is your point.

*snip*


Heh, maybe that would be a good way to replace the x86 architecture - use graphics cards as a Trojan horse.
First, create a graphics card which is basically a clump of ARM chips. This must be either or both:
a) better than other graphics cards at poly plotting.
b) cheaper than other graphics cards.
Then write some games which do things impossible on dedicated graphics cards.
Keep on developing the ARM card to give it more and more power.
Then, have games which just use the x86 to bootstrap and do menial tasks. Release a program which obviates the need to learn or use x86, and promote games which just use the ARMs.
Produce cheap computers/games machines without any legacy stuff, using cheap second-hand and old x86 chips.
Congratulations, the x86 is now a digital appendix, and can safely be removed.

I can dream, can't I?

Maybe we could ask Simtec to resurrect the "20GHz RiscStation" project that was on the cover of AU a couple of years back. wink


Maybe both are true to some extent.
I remember in the early years of the Archimedes a lot of interesting games came out:
Zarch, Cataclysm, Spheres of Chaos...
and I remember thinking that now the power existed to make really innovative games - and that the future would only improve. But now on the PC it seems that first-person shooters and real-time strategy (and racing and sports games, which don't count smile) are the only genres of any significance. And they tend to aim for realism only.

Part of the problem is that games cost millions of pounds to produce. No one is going to write a risky, innovative new game concept if they can get their money from "improving the realism" of their last driving game.


I also remember, a couple of years ago or so, talking about the 'look' of Jet Set Radio with a housemate. He said that he thought it looked just awful. So it seems that for some people 'graphical realism' is everything.

...and playability means nothing, which is sad but also very true in many cases.

 