How do devices like the Game Boy Advance achieve their frame rate?



























I've been designing my own handheld gaming device based around an AVR microcontroller and a small OLED display.



I started off with a 128x64 pixel monochrome display and can comfortably draw to it at over 60 frames per second.



I recently reworked it to use a 128x128 pixel RGB OLED without really thinking too much, only to find I could achieve just about 4 FPS. After some thought and careful refactoring I can get that up to ~12 FPS if I don't care too much about doing anything else!
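Doing the arithmetic shows the scale of the problem. Assuming 16 bits per pixel on the colour screen, a full redraw at 60 FPS needs

    128 × 128 pixels × 16 bits × 60 frames/s ≈ 15.7 Mbit/s

whereas AVR hardware SPI tops out at half the CPU clock (10 Mbit/s at 20 MHz), and the monochrome screen only needed 128 × 64 × 1 bit × 60 ≈ 0.5 Mbit/s.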



My question is - how did a device like the GBA (Game Boy Advance) achieve a frame rate of nearly 60 FPS? I thought about having a separate 'graphics processor', but realised I would still be bottlenecked transferring the display data to that.



I also wondered about using the vestigial 8-bit parallel interface most of these screens tend to have, which might net me an 8x speed-up, except that modern MCUs don't tend to have hardware parallel interfaces like they do for serial, and bit-banging will likely eat up a lot of the speed gain.



What other options exist?



I'm currently using an ATmega1284P connected to an SSD1306 OLED controller via USART-SPI. That's the monochrome version.



The colour screen is an SSD1351, which wasn't originally connected to hardware SPI. I wasn't convinced that would make enough difference; it's just too slow overall.



I know I can get faster MCUs, but I want to know what other options I could explore - the GBA processor is much slower than my 1284!










Tags: avr, oled, graphics, atmega1284p

asked Dec 17 '18 at 19:01 by MalphasWats (edited Dec 20 '18 at 21:28 by chicks)

  • "I would still be bottlenecked transferring the display data to that." DSI has four lanes each up to 1.2 Gbit/s. I leave the rest of the calculations to you. – Oldfart, Dec 17 '18 at 19:16

  • Just like any graphics in any video game device, there's memory that would handle graphics. According to this website, there's an address location for graphics, sound, etc. Instructions would be stored there. Assuming there isn't a lot of data that would create conflicts with performance time, it would run those instructions to load the graphical data with ease. – KingDuken, Dec 17 '18 at 19:23

  • Buy the display without the controller on it and make your own controller. – old_timer, Dec 17 '18 at 21:25

  • @immibis: Almost surely some awful I2C- or SPI-based controller. Hobbyist stuff is full of overpriced slow stuff like that when you can get a friggin' 400+ dpi iPhone screen for $20 because of economies of scale. – R.., Dec 18 '18 at 5:00

  • @R.. I just want to point out that the reason for these hobbyist controllers is so that they can interface to almost any processor, since you make it sound like they're useless. You wouldn't be able to interface to an iPhone screen easily, if at all. It probably connects to a dedicated and maybe custom graphics processor. – immibis, Dec 18 '18 at 9:58


















5 Answers




















Other answers cover your question pretty well at an abstract level (hardware), but having actual experience with the GBA in particular, I figured a more detailed explanation may be worthwhile.



The GBA had many drawing modes and settings which could be used to control how the graphics processor interpreted the video RAM, but one thing was inescapable: the frame rate. The graphics processor was drawing to the screen in a nearly constant loop (more on this below). This is likely the most relevant bit for your question.



It would draw one line at a time, taking a very short break between each. After drawing the last line of the frame, it would take a break roughly equal to the time it takes to draw 30 lines, then start again. The timing of each line and the timing of each frame were predetermined and set in stone. In a lot of ways the graphics processor was really the master of the system, and you needed to write your games around its behavior, because it would continue doing what it did whether you were ready or not.



Roughly 75-80% of the time it was actively pushing to the screen. What frame rates could you accomplish if you were doing the same?



That 80% of the time was also all the CPU had to process user input, calculate game state, and load sprites/tiles into areas of VRAM that were currently off screen (or at least not included in the current line being drawn).



The 20% between frames was all the CPU had to tweak video settings or RAM in ways that would impact the whole next frame.



At the end of each line, the graphics processor would send a line sync interrupt to the CPU. This interrupt could be used to tweak settings on a few sprites or a few background layers (this is how you can get an effect like a conical spotlight: by changing the size and location of one of the rectangular masks between each line drawn; as far as the hardware is concerned, all those regions are rectangular). You have to be careful to keep these updates small and finish before the graphics processor starts drawing the next line, or you can get ugly results. Any time spent processing these interrupts also cut into that 80% of the CPU's processing time...



For games that got the most out of this system, neither the CPU nor the graphics processor ever took a real break; each was chasing the other around the loop, updating what the other wasn't currently looking at.
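To make that structure concrete, here is a minimal sketch of such a frame loop in GBA-style C. The scanline register and its address are from the publicly documented GBA memory map; the two update functions are hypothetical placeholders for the work described above.

    #include <stdint.h>

    #define REG_VCOUNT (*(volatile uint16_t *)0x04000006) /* current scanline */

    static void update_game_state(void)     { /* input, physics, AI, VRAM loads */ }
    static void commit_frame_settings(void) { /* whole-frame register/RAM changes */ }

    static void wait_for_vblank(void) {
        while (REG_VCOUNT >= 160) {} /* lines 160-227: already in VBlank, wait it out */
        while (REG_VCOUNT <  160) {} /* lines 0-159: the screen is being drawn */
    }

    int main(void) {
        for (;;) {
            update_game_state();     /* the ~80%: work while the screen draws  */
            wait_for_vblank();
            commit_frame_settings(); /* the ~20%: changes that affect the next frame */
        }
    }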






answered Dec 18 '18 at 1:19 by Mr.Mindor (edited Dec 20 '18 at 6:16 by Peter Mortensen)

  • Welcome and well put. – Mindwin, Dec 18 '18 at 15:44

  • Some "newer" systems like the Nintendo DS got around the fixed framerate limitation by adding the VCOUNT register to delay the next frame for a configurable amount of time (usually to help multiplayer games synchronize). – forest, Dec 20 '18 at 7:20

































The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.



The video processor can then work pixel by pixel to determine which sprite to draw at that point.
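As an illustration of why this is so cheap for the CPU: moving a sprite amounts to rewriting a couple of half-words in object attribute memory (OAM) rather than repainting pixels. The field layout and base address below follow the publicly documented GBA OAM format; move_sprite() is just an illustrative helper.

    #include <stdint.h>

    typedef struct {
        uint16_t attr0;  /* bits 0-7: y position, plus mode/shape bits */
        uint16_t attr1;  /* bits 0-8: x position, plus size/flip bits  */
        uint16_t attr2;  /* tile index and palette selection           */
        uint16_t fill;   /* interleaved affine data, unused here       */
    } oam_entry;

    #define OAM ((volatile oam_entry *)0x07000000) /* 128 entries on the GBA */

    void move_sprite(int id, int x, int y) {
        /* Two 16-bit writes per frame instead of redrawing the sprite: */
        OAM[id].attr0 = (uint16_t)((OAM[id].attr0 & ~0x00FFu) | (y & 0x00FF));
        OAM[id].attr1 = (uint16_t)((OAM[id].attr1 & ~0x01FFu) | (x & 0x01FF));
    }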



However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



(1) Notable exception: Amiga






answered Dec 17 '18 at 19:29 by pjc50

  • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way. – TimWescott, Dec 17 '18 at 20:54

  • @TimWescott the GBA did have multiple drawing modes, and I don't have experience with most, so this may not be universally true, but I don't think any of those modes had direct access to the ROMs (on cartridge): typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory, and the graphics processor worked on it from there. – Mr.Mindor, Dec 17 '18 at 23:44

  • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that. – TimWescott, Dec 17 '18 at 23:51

  • @TimWescott: I think the same was true of the NES, though the ROM in question was located within the Game Paks. – supercat, Dec 18 '18 at 16:57

































"My question is - how did a device like the GBA achieve a frame rate of nearly 60fps?"



To answer just the question, they did it with a graphics processor. I'm pretty sure the Game Boy used sprite graphics. At a top level, that means that the graphics processor gets loaded with things like an image of a background, an image of Mario, an image of Princess Peach, etc. Then the main processor issues commands like "show the background offset by this much in x and y, overlay Mario image #3 at this x, y position", etc. So the main processor is absolutely positively not concerned with drawing each pixel, and the graphics processor is absolutely positively not concerned with computing the state of the game. Each is optimized for what it needs to do, and the result is a pretty good video game without using a lot of computation power.
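As a sketch of what such a "command" amounts to in practice: on the GBA, scrolling the whole background is two register writes, after which the video hardware shifts the layer with no pixel copying by the CPU. The addresses below are the GBA's documented BG0 scroll registers; the function itself is illustrative.

    #include <stdint.h>

    #define REG_BG0HOFS (*(volatile uint16_t *)0x04000010) /* BG0 x offset */
    #define REG_BG0VOFS (*(volatile uint16_t *)0x04000012) /* BG0 y offset */

    void scroll_background(unsigned camera_x, unsigned camera_y) {
        REG_BG0HOFS = (uint16_t)camera_x; /* hardware shifts the whole layer */
        REG_BG0VOFS = (uint16_t)camera_y; /* the CPU never touches a pixel   */
    }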






answered Dec 17 '18 at 19:25 by TimWescott

  • Calling it a "graphics processor" exaggerates what it does, suggesting it's some sort of CPU of its own. It's just a video controller, which is basically a complicated kind of sequencer. As it counts up horizontal and vertical pixels, it fetches tile and/or sprite data, puts them in shift registers, and combines the output of the shift registers into an output pixel. It's not capable of running a program like an actual "GPU" graphics processor. – Ross Ridge, Dec 18 '18 at 8:28

































The GBA had a pretty slow processor. The ARM7 is very nice; they just ran it slow and gave it next to no resources.



There is a reason why a lot of Nintendo games at that point and before were side-scrollers. HARDWARE. It is all done in hardware. You had multiple layers of tiles plus one or more sprites and the hardware did all the work to extract pixels from those tables and drive the display.



You built the tile set up front and then had a smallish memory that was the tile map. Want the lower-left tile to be tile 7? You put a 7 in that memory location. Want the next tile over to be tile 19? You put a 19 there, and so on for each layer you have enabled. For a sprite, you simply set the x/y address. You can also do scaling and rotation by setting some registers, and the hardware takes care of the rest.
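A sketch of that idea: the tile map is just a small array of tile indices in VRAM, one 16-bit entry per 8x8-pixel cell. The address below is a typical GBA screen-block location, and the cell positions are illustrative.

    #include <stdint.h>

    #define TILE_MAP ((volatile uint16_t *)0x06000800) /* 32x32-entry map in VRAM */

    void set_tiles(void) {
        TILE_MAP[31 * 32 + 0] = 7;  /* bottom-left cell shows tile #7    */
        TILE_MAP[31 * 32 + 1] = 19; /* the next cell over shows tile #19 */
        /* The video hardware expands each entry to 64 pixels every frame,
           with no further CPU work. */
    }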



Mode 7, if I remember right, was a pixel mode, but that was like a traditional video card where you put in bytes that set the color of each pixel and the hardware takes care of the video refresh. I think you could ping-pong, or at least flip buffers when you had a new frame ready, but I don't remember exactly. Again, the processor was fairly underclocked for that day and age and didn't have too many fast resources. So while some games were mode 7, a lot were tile-based side-scrollers...



If you want a high frame rate, you need to design for it. You can't just take any old display you find and talk to it via SPI or I²C or something like that. Put at least one framebuffer in front of it, ideally two, and have row and column control over that display if possible.
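A minimal sketch of that ping-pong (double-buffer) arrangement, assuming a hypothetical controller design: draw_scene() stands in for your drawing code and scanout_from() for your controller's "display this buffer" control; on a real design the buffers would live in the controller's RAM, not the MCU's.

    #include <stdint.h>

    static uint16_t fb[2][128 * 128]; /* two 16-bit buffers (ping/pong) */
    static int back = 0;              /* which buffer the CPU draws into */

    static void draw_scene(uint16_t *buf)   { (void)buf; /* your rendering here   */ }
    static void scanout_from(uint16_t *buf) { (void)buf; /* point controller here */ }

    void render_frame(void) {
        draw_scene(fb[back]);   /* CPU touches only the off-screen buffer */
        scanout_from(fb[back]); /* controller now scans out the new frame */
        back ^= 1;              /* swap roles; no tearing, no waiting     */
    }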



A number of the displays I suspect you are buying have a controller on them that you are actually talking to. If you want GBA/console-type performance, you create/implement the controller yourself. Or you buy/build with a GPU/video chip/logic blob and use HDMI or another common interface into a stock monitor.



Just because a bicycle has tires and a chain and gears doesn't mean it can go as fast as a motorcycle. You need to design the system to meet your performance needs, end to end. You can put that bicycle wheel on that motorcycle, but it won't perform as desired; all of the components have to be part of the overall design.



Asteroids worked this way too; it only needed one 6502. The vector graphics were done with separate logic; the 6502 sent a tiny string of data to the vector graphics controller, which used a ROM and that data to do the x/y plotting of the beam and the z (on/off)... Some stand-up cabinets had separate processors to handle audio and video, separate from the processor computing the game. Of course, today the video is handled by some hundreds, if not thousands, of processors that are separate from the main processor...






answered by old_timer

  • I swear I remember mode7 being shoehorned by marketing as a response to Sega's "hyper mode" or something... maybe "Super FX"? en.wikipedia.org/wiki/Mode_7 – Caleb Jay, Dec 20 '18 at 19:39

  • coranac.com/tonc/text/bitmaps.htm#sec-modes I may have remembered it wrong; I am thinking of maybe mode 5, or one of the bitmap modes. There are some tile modes with sprites and a bitmap/framebuffer mode or modes; maybe there is a 7. I didn't know about the one you linked, but that is good to know. – old_timer, Dec 20 '18 at 21:05

  • Hmm, reading more on mode 7, it's not just a mode. Anyway, the GBA has tile modes and bitmap modes, which are slower, as you have to be responsible for every pixel, whereas in the tile modes one byte in the tile map produces many pixels. They also leveraged the size (width) of the busses and the speed of the memory, and a ROM pipeline cache thing to help get stuff (instructions) out of the ROM a bit faster. But from day one you were struggling to get software to run at a decent rate, and thankfully the logic took care of most of the video work. – old_timer, Dec 20 '18 at 21:08

  • If you look at these displays that you are buying that have these parallel 8-bit or 4-bit or SPI or I2C interfaces, those are in your way for performance. You want the raw display without those controllers; then you can control how the display is managed, build a framebuffer or two so you can ping/pong, and have a fast interface from your CPU to the framebuffer, assuming you start with a fast enough display in the first place. – old_timer, Dec 20 '18 at 21:10


































"how did a device like the GBA achieve a frame rate of nearly 60fps?"




Hardware.



It's got graphics memory, which may or may not share the same bus as program/data memory... but the important bit is that it has a graphics processor which reads the memory 60 times per second and sends the data to the LCD using an optimized interface which is designed to do this efficiently.



You can do the same with any modern microcontroller equipped with an "LCD interface" peripheral, for example the LPC4330, although this might be way overkill. Of course, you will need a compatible LCD panel.



With modern fast microcontrollers (i.e., ARM, not AVR) and such a tiny screen, you probably won't need sprites or a blitter to accelerate graphics operations. With an 8-bit AVR it might be slow.



But no matter the CPU, bit-banging the interface to the display is going to suck.



I believe the Atari 2600 used CPU bit-banging to send the picture to the TV. That's a little bit obsolete.






  • Even the 2600 had hardware sprites, although a very limited number (two players and two bullets, I think). – pjc50, Dec 17 '18 at 19:30

  • @pjc50, the Atari 2600 sort of had hardware sprites. Like every other part of the graphics subsystem, they were one-dimensional objects. If the programmer wanted something other than a set of vertical lines, the program needed to update the sprites after each row was drawn to the screen. – Mark, Dec 18 '18 at 0:03

  • @Mark: The 2600 definitely had hardware sprites. The hardware only controlled horizontal positioning, but the sprites on the 2600 made it possible to produce games that were far more colorful than any of its competitors. – supercat, Dec 18 '18 at 22:31











Your Answer





StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["\$", "\$"]]);
});
});
}, "mathjax-editing");

StackExchange.ifUsing("editor", function () {
return StackExchange.using("schematics", function () {
StackExchange.schematics.init();
});
}, "cicuitlab");

StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "135"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});

function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: false,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: null,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});


}
});














draft saved

draft discarded


















StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2felectronics.stackexchange.com%2fquestions%2f412696%2fhow-do-devices-like-the-game-boy-advance-achieve-their-frame-rate%23new-answer', 'question_page');
}
);

Post as a guest















Required, but never shown

























5 Answers
5






active

oldest

votes








5 Answers
5






active

oldest

votes









active

oldest

votes






active

oldest

votes









65














Other answers cover your question pretty well at an abstract level (hardware), but having actual experience with the GBA in particular I figured a more detailed explanation may be worth while.



The GBA had many drawing modes and settings which could be used to control how the graphics processor interpreted the video RAM, but one thing was inescapable: the frame rate. The graphic processor was drawing to the screen in a nearly (more on this below) constant loop. (This is likely the most relevant bit for your question.)



It would draw one line at a time taking a very short break between each. After drawing the last line for the frame it would take a break roughly equal to the time it takes to draw 30 lines. Then start again. The timing of each line, and the timing of each frame were all predetermined and set in stone. In a lot of ways the graphics processor was really the master of that system and you needed to write your games around its behavior, because it would continue doing what it did whether you were ready or not.



Roughly 75-80% of the time it was actively pushing to the screen. What frame rates could you accomplish if you were doing the same?



That 80% of the time was also what the CPU had to process user input, calculate game state, and load sprites/tiles to areas of VRAM that were currently off screen (or at least not included in the current line being drawn).



The 20% between frames, was all the CPU had to tweak video settings or RAM that would impact the whole next frame.



At the end of each line, the graphics processor would send a line sync interrupt to the CPU. This interrupt could be used to tweak settings on a few sprites, or a few background layers (this is how you can get an effect like a conical spotlight, by changing the size and location of one of the rectangular masks between each line drawn. As far as the hardware is concerned all those regions are rectangular.). You have to be careful to keep these updates small and finish before the graphic processor starts drawing the next line or you can get ugly results. Any time spent processing these interrupts also cut into that 80% of the CPU's processing time...



For games that got the most out of this system, neither the CPU nor the graphic processor ever took a real break; each were chasing the other around the loop updating what the other wasn't currently looking at.






share|improve this answer





















  • 5





    Welcome and well put.

    – Mindwin
    Dec 18 '18 at 15:44






  • 2





    Some "newer" systems like the Nintendo DS got around the fixed framerate limitation by adding the VCOUNT register to delay the next frame for a configurable amount of time (usually to help multiplayer games synchronize).

    – forest
    Dec 20 '18 at 7:20
















65














Other answers cover your question pretty well at an abstract level (hardware), but having actual experience with the GBA in particular I figured a more detailed explanation may be worth while.



The GBA had many drawing modes and settings which could be used to control how the graphics processor interpreted the video RAM, but one thing was inescapable: the frame rate. The graphic processor was drawing to the screen in a nearly (more on this below) constant loop. (This is likely the most relevant bit for your question.)



It would draw one line at a time taking a very short break between each. After drawing the last line for the frame it would take a break roughly equal to the time it takes to draw 30 lines. Then start again. The timing of each line, and the timing of each frame were all predetermined and set in stone. In a lot of ways the graphics processor was really the master of that system and you needed to write your games around its behavior, because it would continue doing what it did whether you were ready or not.



Roughly 75-80% of the time it was actively pushing to the screen. What frame rates could you accomplish if you were doing the same?



That 80% of the time was also what the CPU had to process user input, calculate game state, and load sprites/tiles to areas of VRAM that were currently off screen (or at least not included in the current line being drawn).



The 20% between frames, was all the CPU had to tweak video settings or RAM that would impact the whole next frame.



At the end of each line, the graphics processor would send a line sync interrupt to the CPU. This interrupt could be used to tweak settings on a few sprites, or a few background layers (this is how you can get an effect like a conical spotlight, by changing the size and location of one of the rectangular masks between each line drawn. As far as the hardware is concerned all those regions are rectangular.). You have to be careful to keep these updates small and finish before the graphic processor starts drawing the next line or you can get ugly results. Any time spent processing these interrupts also cut into that 80% of the CPU's processing time...



For games that got the most out of this system, neither the CPU nor the graphic processor ever took a real break; each were chasing the other around the loop updating what the other wasn't currently looking at.






share|improve this answer





















  • 5





    Welcome and well put.

    – Mindwin
    Dec 18 '18 at 15:44






  • 2





    Some "newer" systems like the Nintendo DS got around the fixed framerate limitation by adding the VCOUNT register to delay the next frame for a configurable amount of time (usually to help multiplayer games synchronize).

    – forest
    Dec 20 '18 at 7:20














65












65








65







Other answers cover your question pretty well at an abstract level (hardware), but having actual experience with the GBA in particular I figured a more detailed explanation may be worth while.



The GBA had many drawing modes and settings which could be used to control how the graphics processor interpreted the video RAM, but one thing was inescapable: the frame rate. The graphic processor was drawing to the screen in a nearly (more on this below) constant loop. (This is likely the most relevant bit for your question.)



It would draw one line at a time taking a very short break between each. After drawing the last line for the frame it would take a break roughly equal to the time it takes to draw 30 lines. Then start again. The timing of each line, and the timing of each frame were all predetermined and set in stone. In a lot of ways the graphics processor was really the master of that system and you needed to write your games around its behavior, because it would continue doing what it did whether you were ready or not.



Roughly 75-80% of the time it was actively pushing to the screen. What frame rates could you accomplish if you were doing the same?



That 80% of the time was also what the CPU had to process user input, calculate game state, and load sprites/tiles to areas of VRAM that were currently off screen (or at least not included in the current line being drawn).



The 20% between frames, was all the CPU had to tweak video settings or RAM that would impact the whole next frame.



At the end of each line, the graphics processor would send a line sync interrupt to the CPU. This interrupt could be used to tweak settings on a few sprites, or a few background layers (this is how you can get an effect like a conical spotlight, by changing the size and location of one of the rectangular masks between each line drawn. As far as the hardware is concerned all those regions are rectangular.). You have to be careful to keep these updates small and finish before the graphic processor starts drawing the next line or you can get ugly results. Any time spent processing these interrupts also cut into that 80% of the CPU's processing time...



For games that got the most out of this system, neither the CPU nor the graphic processor ever took a real break; each were chasing the other around the loop updating what the other wasn't currently looking at.






share|improve this answer















Other answers cover your question pretty well at an abstract level (hardware), but having actual experience with the GBA in particular I figured a more detailed explanation may be worth while.



The GBA had many drawing modes and settings which could be used to control how the graphics processor interpreted the video RAM, but one thing was inescapable: the frame rate. The graphic processor was drawing to the screen in a nearly (more on this below) constant loop. (This is likely the most relevant bit for your question.)



It would draw one line at a time taking a very short break between each. After drawing the last line for the frame it would take a break roughly equal to the time it takes to draw 30 lines. Then start again. The timing of each line, and the timing of each frame were all predetermined and set in stone. In a lot of ways the graphics processor was really the master of that system and you needed to write your games around its behavior, because it would continue doing what it did whether you were ready or not.



Roughly 75-80% of the time it was actively pushing to the screen. What frame rates could you accomplish if you were doing the same?



That 80% of the time was also what the CPU had to process user input, calculate game state, and load sprites/tiles to areas of VRAM that were currently off screen (or at least not included in the current line being drawn).



The 20% between frames, was all the CPU had to tweak video settings or RAM that would impact the whole next frame.



At the end of each line, the graphics processor would send a line sync interrupt to the CPU. This interrupt could be used to tweak settings on a few sprites, or a few background layers (this is how you can get an effect like a conical spotlight, by changing the size and location of one of the rectangular masks between each line drawn. As far as the hardware is concerned all those regions are rectangular.). You have to be careful to keep these updates small and finish before the graphic processor starts drawing the next line or you can get ugly results. Any time spent processing these interrupts also cut into that 80% of the CPU's processing time...



For games that got the most out of this system, neither the CPU nor the graphic processor ever took a real break; each were chasing the other around the loop updating what the other wasn't currently looking at.







share|improve this answer














share|improve this answer



share|improve this answer








edited Dec 20 '18 at 6:16









Peter Mortensen

1,60031422




1,60031422










answered Dec 18 '18 at 1:19









Mr.MindorMr.Mindor

66656




66656








  • 5





    Welcome and well put.

    – Mindwin
    Dec 18 '18 at 15:44






  • 2





    Some "newer" systems like the Nintendo DS got around the fixed framerate limitation by adding the VCOUNT register to delay the next frame for a configurable amount of time (usually to help multiplayer games synchronize).

    – forest
    Dec 20 '18 at 7:20














  • 5





    Welcome and well put.

    – Mindwin
    Dec 18 '18 at 15:44






  • 2





    Some "newer" systems like the Nintendo DS got around the fixed framerate limitation by adding the VCOUNT register to delay the next frame for a configurable amount of time (usually to help multiplayer games synchronize).

    – forest
    Dec 20 '18 at 7:20








5




5





Welcome and well put.

– Mindwin
Dec 18 '18 at 15:44





Welcome and well put.

– Mindwin
Dec 18 '18 at 15:44




2




2





Some "newer" systems like the Nintendo DS got around the fixed framerate limitation by adding the VCOUNT register to delay the next frame for a configurable amount of time (usually to help multiplayer games synchronize).

– forest
Dec 20 '18 at 7:20





Some "newer" systems like the Nintendo DS got around the fixed framerate limitation by adding the VCOUNT register to delay the next frame for a configurable amount of time (usually to help multiplayer games synchronize).

– forest
Dec 20 '18 at 7:20













21














The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.



The video processor can then work pixel by pixel to determine which sprite to draw at that point.



However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



(1) Notable exception: Amiga






share|improve this answer
























  • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.

    – TimWescott
    Dec 17 '18 at 20:54













  • @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.

    – Mr.Mindor
    Dec 17 '18 at 23:44











  • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.

    – TimWescott
    Dec 17 '18 at 23:51











  • @TimWescott: I think the same was true of the NES, though the ROM in question was located within the Game Paks.

    – supercat
    Dec 18 '18 at 16:57
















21














The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.



The video processor can then work pixel by pixel to determine which sprite to draw at that point.



However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



(1) Notable exception: Amiga






share|improve this answer
























  • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.

    – TimWescott
    Dec 17 '18 at 20:54













  • @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.

    – Mr.Mindor
    Dec 17 '18 at 23:44











  • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.

    – TimWescott
    Dec 17 '18 at 23:51











  • @TimWescott: I think the same was true of the NES, though the ROM in question was located within the Game Paks.

    – supercat
    Dec 18 '18 at 16:57














21












21








21







The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.



The video processor can then work pixel by pixel to determine which sprite to draw at that point.



However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



(1) Notable exception: Amiga






share|improve this answer













The key feature of all the games consoles that distinguished them from early PCs and virtually all home computers(1) was hardware sprites.



The linked GBA programming guide shows how they work from the main processor point of view. Bitmaps representing player, background, enemies etc are loaded into one area of memory. Another area of memory specifies the location of the sprites. So instead of having to re-write all of video RAM every frame, which takes a lot of instructions, the processor just has to update the location of the sprites.



The video processor can then work pixel by pixel to determine which sprite to draw at that point.



However, this requires dual-port RAM shared between the two, and I think in the GBA the video processor is on the same chip as the main ARM and secondary Z80 processor.



(1) Notable exception: Amiga







share|improve this answer












share|improve this answer



share|improve this answer










answered Dec 17 '18 at 19:29









pjc50pjc50

33.4k33983




33.4k33983













  • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.

    – TimWescott
    Dec 17 '18 at 20:54













  • @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.

    – Mr.Mindor
    Dec 17 '18 at 23:44











  • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.

    – TimWescott
    Dec 17 '18 at 23:51











  • @TimWescott: I think the same was true of the NES, though the ROM in question was located within the Game Paks.

    – supercat
    Dec 18 '18 at 16:57



















  • Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.

    – TimWescott
    Dec 17 '18 at 20:54













  • @TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.

    – Mr.Mindor
    Dec 17 '18 at 23:44











  • @Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.

    – TimWescott
    Dec 17 '18 at 23:51











  • @TimWescott: I think the same was true of the NES, though the ROM in question was located within the Game Paks.

    – supercat
    Dec 18 '18 at 16:57

















Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.

– TimWescott
Dec 17 '18 at 20:54







Only a nit -- the really early arcade games had the sprites in a ROM associated with the graphics processor, not a dual-port RAM. I have no clue if that was also the case with the early consoles, although it certainly could have been done that way.

– TimWescott
Dec 17 '18 at 20:54















@TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.

– Mr.Mindor
Dec 17 '18 at 23:44





@TimWescott the GBA did have multiple drawing modes and I don't have experience with most so this may not be universally true but, I don't think any of those modes had direct access to the ROMs(on cartridge): Typically all the tile/sprite/palette data had to be transferred from the ROM to the video memory and the graphics processor worked on it from there.

– Mr.Mindor
Dec 17 '18 at 23:44













@Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.

– TimWescott
Dec 17 '18 at 23:51





@Mr.Mindor Sorry if I wasn't clear -- I'm not pretending to knowledge about how the GB or GBA did it. I was just commenting on the really early Nintendo arcade games back in the late 70's and early 80's, that had all of us wondering how in h*** they did that.

– TimWescott
Dec 17 '18 at 23:51













@TimWescott: I think the same was true of the NES, though the ROM in question was located within the Game Paks.

– supercat
Dec 18 '18 at 16:57





@TimWescott: I think the same was true of the NES, though the ROM in question was located within the Game Paks.

– supercat
Dec 18 '18 at 16:57











20














"My question is - how did a device like the GBA achieve a frame rate of nearly 60fps?"



To answer just the question, they did it with a graphics processer. I'm pretty sure the Game Boy used sprite graphics. At a top level, that means that the graphics processor gets loaded things like an image of a background, and an image of Mario, and an image of Princess Peach, etc. Then the main processor issues commands like "show the background offset by this much in x and y, overlay Mario image #3 at this x, y position", etc. So the main processor is absolutely positively not concerned with drawing each pixel, and the graphics processor is absolutely positively not concerned with computing the state of the game. Each is optimized for what it needs to do, and the result is a pretty good video game without using a lot of computation power.






share|improve this answer



















  • 7





    Calling it a "graphics processor" exaggerates what it does, suggesting it's some sort of CPU of it's own. It's just a video controller, which is basically a complicated kind of sequencer. As it counts up horizontal and vertical pixels, it fetches title and/or sprite data, put them in shift registers, and combines the output of the shift registers into an output pixel. It's not capable of running a program like an actual "GPU" graphics processor.

    – Ross Ridge
    Dec 18 '18 at 8:28

14

The GBA had a pretty slow processor. The ARM7TDMI core itself is very nice; Nintendo just clocked it low (about 16.8 MHz) and gave it next to no resources.



There is a reason why a lot of Nintendo games at that point and before were side-scrollers. HARDWARE. It is all done in hardware. You had multiple layers of tiles plus one or more sprites and the hardware did all the work to extract pixels from those tables and drive the display.



You built the tile set up front, and then there was a smallish memory that held the tile map. Want the lower-left tile to be tile 7? You put a 7 in that memory location. Want the next tile over to be tile 19? You put a 19 in the next map entry, and so on for each layer you have enabled. For a sprite, you simply set its x/y position. You can also do scaling and rotation by setting a few registers, and the hardware takes care of the rest.
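
As a concrete sketch of how cheap those updates are (GBA-style addresses; the map-entry format here is reduced to a bare tile index, whereas real entries also carry flip and palette bits):

    #include <stdint.h>

    /* GBA-style layout: VRAM starts at 0x06000000 and is divided into
     * 2 KB "screenblocks"; a 32x32 map of 16-bit entries fills one. */
    #define SCREENBLOCK(n) ((volatile uint16_t *)(0x06000000u + (n) * 0x800u))

    static void set_tile(int block, int x, int y, uint16_t tile)
    {
        /* One 16-bit write re-labels an entire 8x8-pixel cell. */
        SCREENBLOCK(block)[y * 32 + x] = tile;
    }

    /* "Want the lower left tile to be tile 7?" With a 160-line screen the
     * bottom visible row is row 19, so: set_tile(8, 0, 19, 7); */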



Mode 7, if I remember right, was a pixel mode, like a traditional video card where you put in bytes that set the color of each pixel and the hardware takes care of the video refresh. I think you could ping-pong the buffers, or at least flip them once you had a new frame ready, but I don't remember exactly. Again, the processor was fairly underclocked for that day and age and didn't have many fast resources. So while some games were "mode 7", a lot were tile-based side-scrollers...



If you want a high-frame-rate solution, you need to design for it. You can't just take any old display you find and talk to it via SPI or I²C or something like that. Put at least one framebuffer in front of it, ideally two, and get row and column control over that display if possible.
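
The two-framebuffer ("ping-pong") idea looks like this in practice. The sketch below uses the GBA's mode-4 page-flip addresses as a stand-in, but the pattern applies to any double-buffered design: draw into the hidden page, then swap with a single register write during the vertical blank.

    #include <stdint.h>

    #define REG_DISPCNT (*(volatile uint16_t *)0x04000000)
    #define DCNT_PAGE   0x0010   /* bit 4: which page the hardware scans out */
    #define PAGE0       ((volatile uint16_t *)0x06000000)
    #define PAGE1       ((volatile uint16_t *)0x0600A000)

    /* Draw to whichever page is currently hidden... */
    static volatile uint16_t *backbuffer(void)
    {
        return (REG_DISPCNT & DCNT_PAGE) ? PAGE0 : PAGE1;
    }

    /* ...then flip during vblank: the hidden page becomes visible and the
     * old visible page becomes the new drawing target, so the CPU never
     * races the display refresh and the viewer never sees tearing. */
    static void page_flip(void)
    {
        REG_DISPCNT ^= DCNT_PAGE;
    }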



A number of the displays I suspect you are buying have a controller on them, and that controller is what you are actually talking to. If you want GBA/console-type performance, you create/implement the controller yourself. Or you buy/build something with a GPU/video chip/logic blob and use HDMI or another common interface into a stock monitor.



Just because a bicycle has tires and a chain and gears doesn't mean it can go as fast as a motorcycle. You need to design the system to meet your performance needs, end to end. You can put that bicycle wheel on that motorcycle, but it won't perform as desired; all of the components have to be part of the overall design.



Asteroids worked this way too; it needed only one 6502. The vector graphics were done with separate logic: the 6502 sent a small stream of data to the vector graphics controller, which used a ROM and that data to do the X/Y plotting of the beam and the Z (beam on/off)... Some stand-up arcade machines had separate processors to handle audio and video apart from the processor computing the game. Of course, today the video is handled by some hundreds, if not thousands, of processors that are separate from the main processor...






share|improve this answer

edited Dec 20 '18 at 11:00 by Peter Mortensen
answered Dec 17 '18 at 21:24

old_timer

  • I swear I remember mode7 being shoehorned by marketing as a response to Sega's "hyper mode" or something... maybe "Super FX?" en.wikipedia.org/wiki/Mode_7

    – Caleb Jay
    Dec 20 '18 at 19:39











  • coranac.com/tonc/text/bitmaps.htm#sec-modes I may have remembered it wrong; I am thinking of maybe mode 5, or one of the bitmap modes. There are some tile modes with sprites, and a bitmap/framebuffer mode or modes. Maybe there is a 7. I didn't know about the one you linked, but that is good to know.

    – old_timer
    Dec 20 '18 at 21:05











  • Hmm, reading more on mode 7, it's not just a mode. Anyway, the GBA has tile modes and bitmap modes; the bitmap modes are slower, since you are responsible for every pixel, whereas in the tile modes one byte in the tile map produces many pixels. They also leveraged the width of the buses and the speed of the memory, and a ROM prefetch/pipeline thing, to help get stuff (instructions) out of the ROM a bit faster. But from day one you were struggling to get software to run at a decent rate, and thankfully the logic took care of most of the video work.

    – old_timer
    Dec 20 '18 at 21:08











  • If you look at the displays you are buying, with their parallel 8-bit or 4-bit or SPI or I2C interfaces, those controllers are in your way for performance. You want the raw display without them; then you can control how the display is managed, build a framebuffer or two so you can ping/pong, and put a fast interface between your CPU and the framebuffer. Assuming you start with a fast enough display in the first place.

    – old_timer
    Dec 20 '18 at 21:10

7

"how did a device like the GBA achieve a frame rate of nearly 60fps?"




Hardware.



It's got graphics memory, which may or may not share the same bus as program/data memory... but the important bit is that it has a graphics processor which reads that memory 60 times per second and sends the data to the LCD over an interface designed to do this efficiently.



You can do the same with any modern microcontroller equipped with an "LCD interface" peripheral, for example the LPC4330, although this might be way overkill. Of course, you will need a compatible LCD panel.



With modern fast microcontrollers (i.e., ARM, not AVR) and such a tiny screen, you probably won't need sprites or a blitter to accelerate graphics operations. With an 8-bit AVR it might be slow.



But no matter the CPU, bit-banging the interface to the display is going to suck.
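
To put rough numbers on it (mine, not the answer's, assuming 16 bits per pixel): 128 × 128 pixels × 16 bits × 60 Hz ≈ 15.7 Mbit/s of raw pixel traffic. An ATmega1284P tops out at 20 MHz, and its SPI (or USART in SPI mode) peaks at half the clock, i.e. 10 Mbit/s, so even with zero drawing or protocol overhead the serial link caps out near 38 full frames per second; any real rendering work on the same core pushes that well below, toward the rates seen in the question.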



I believe the Atari 2600 used CPU bit-banging to send the picture to the TV. That's a little bit obsolete.






share|improve this answer

answered Dec 17 '18 at 19:29

peufeu

  • Even the 2600 had hardware sprites, although a very limited number (two players and two bullets I think)

    – pjc50
    Dec 17 '18 at 19:30






  • 2





    @pjc50, the Atari 2600 sort of had hardware sprites. Like every other part of the graphics subsystem, they were one-dimensional objects. If the programmer wanted something other than a set of vertical lines, the program needed to update the sprites after each row was drawn to the screen.

    – Mark
    Dec 18 '18 at 0:03






  • 1





    @Mark: The 2600 definitely had hardware sprites. The hardware only controlled horizontal positioning, but the sprites made it possible to produce games that were far more colorful than any of its competitors'.

    – supercat
    Dec 18 '18 at 22:31