How did old MS-DOS games utilize various graphic cards?

Nowadays every graphics card has a driver in the operating system that implements some (typically standard) API such as OpenGL, so programmers use standardized API calls to tell the graphics card how and what they want to render. (Actually, even that is already a bit hard-core; most programmers really use game engines that do this for them.)

In the days of old computers, how was this done? Did every programmer of every game implement all the various APIs that old graphics cards supported? Or did the old game studios of the MS-DOS era have their own "game engines" that provided some abstraction over these graphics cards?

I remember there were many different card vendors, and I remember old games asking me which one I had - so I suppose these games contained code/drivers for all of those cards?

Tags: graphics, programming, video, gaming

asked Jun 3 at 22:55 by Petr; edited Jun 5 at 16:36 by rchard2scout

Comments:

  • Yep. There was no DirectX or OpenGL, so what we did was implement the algorithms on our own. I still remember doing BSP and z-buffering myself: you go and read the algorithm spec, understand how it works and what it does, and you code it. More hardcore - you don't have a spec and you come up with your own algorithm... And then you build a full map of a frame and write it directly to the card's RAM. – Alma Do, Jun 4 at 12:14

  • Wow... this brings back memories. Most graphics cards could fall back to the CGA standard. Hercules, Tandy and the PCjr, and EGA were other common choices. Simple games that didn't have configuration files would ask you for your graphics card (and later your sound card) when you started the game. Hardware detection was nonstandard or nonexistent. The VESA standards came along to try to standardize the graphics modes, but there still wasn't much support for APIs in DOS days. Even in the early Windows and OS/2 days, the APIs left a lot to be desired in standardization. – GuitarPicker, Jun 4 at 12:58

  • Here's a recent take on this old problem: youtu.be/szhv6fwx7GY. It's about the making of "Planet X3", a newly released old-school DOS game. – Brian H, Jun 4 at 21:02

  • There is a GDC talk on YouTube about palette cycling, which was a clever and very efficient way of faking animation. Helpfully, someone implemented a web version which demonstrates some of the mind-blowing results. – TemporalWolf, Jun 4 at 22:25

  • Bear in mind, you didn't support individual graphics cards so much as you supported a graphics standard -- like CGA, EGA, VGA, Tandy, etc. -- which was often based on a particular standard-setting card, like the IBM Color Graphics Adapter or the chipset in a Tandy 1000. The only applications that cared about anything more than that were ones that targeted certain chipsets, like S3 or Tseng or Mach. But except for the OEMs that made their own cards, even those chips were used in many different graphics cards. Compatibility was a hardware problem back then. – SirNickity, Jun 4 at 23:35

6 Answers

Answer 1 (66 votes):

"Did every programmer of every game implement all the various APIs that old graphics cards supported?"

Yes - but it went even deeper than that. Early graphics cards had virtually no callable code associated with them at all; the concept of "drivers" had not quite become a reality yet. There was the concept of a Video BIOS - extensions to the INT 10h BIOS video services - which was effectively limited to initialization and switching video modes.

Instead, graphics cards, at least in DOS land, all had memory-mapped display RAM, and extensive documentation was available about exactly how setting various bits in display RAM would affect the pixels that appeared on the screen. There were no drawing APIs to call; if you wanted something to appear on the screen (whether a pixel, a character, a line, a circle, a sprite, etc.), you wrote the code to move the bytes into the right places in display RAM yourself. Entire books were written about how to write efficient code to draw graphics.
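
For instance, here is a minimal sketch of what "moving bytes into display RAM" looked like in color text mode, where video memory starts at segment B800h and each cell is a character byte followed by an attribute byte. (This is an illustration, not code from the answer; it assumes a Borland-style real-mode C compiler, whose dos.h provides the MK_FP far-pointer macro.)

    #include <dos.h>

    /* Color text-mode video memory lives at segment B800h.
       Each cell is two bytes: the character, then its attribute. */
    void put_char_at(int row, int col, char ch, unsigned char attr)
    {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xB800, 0);
        unsigned int offset = (row * 80 + col) * 2;   /* 80 columns per row */
        vram[offset]     = ch;     /* character byte */
        vram[offset + 1] = attr;   /* attribute, e.g. 0x1F = white on blue */
    }

    int main(void)
    {
        put_char_at(0, 0, 'A', 0x1F);   /* the 'A' appears instantly; no API involved */
        return 0;
    }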



There were some systems, like the Borland Graphics Interface, that abstracted graphics drawing primitives into an API with different drivers, which one could call to draw things on different graphics cards. However, these were typically too slow for building action-type games.



An action game would typically be optimized for a particular graphics display mode on a particular card. For example, a popular display mode was VGA 640x480 with 16 colors. This would be listed in the software requirements, and you needed the right hardware to run the game. If you bought a VGA game but only had an EGA card, the game would not work at all. As you said, some games would ask what you had, and you had to know the right answer - otherwise the game would not work.






– Greg Hewgill, answered Jun 3 at 23:16 (edited Jun 6 at 21:39)

Comments:

  • I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac-Man - but well short of what you'd need for full-screen scrolling or 3D ray-casting games. – Matthew Barber, Jun 3 at 23:37

  • Strictly speaking there was an API, BIOS service 10h, but you wouldn't use it for (most) games! But there certainly was callable code associated with graphics cards (or, in the case of MDA and CGA, provided by the system BIOS). – Stephen Kitt, Jun 4 at 6:10

  • Worth noting that sound cards were in the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while too - word processors needed hardware-level code to support specific printers directly; there were no drivers or APIs, no abstraction, just bare metal. – J..., Jun 4 at 15:24

  • @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper. – John Dvorak, Jun 4 at 20:12

  • @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font... – Wumpus Q. Wumbley, Jun 4 at 20:33

Answer 2 (25 votes):

Early on, you had to explicitly code your game for each graphics card you wanted to support: Hercules, CGA, Tandy, EGA, VGA. You had to know how to put the card into graphics mode and you had to know the memory layout, palette, and so on. You had to figure out how to avoid flicker and how to prevent tearing. You had to write your own line drawing and fill routines, and if you wanted 3-D, you had to know how to project it onto a 2-D screen, how to remove hidden lines and so on.
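
As a concrete illustration of "write your own line drawing routines", here is a hedged sketch of the classic Bresenham line algorithm plotting into a VGA mode 13h framebuffer. The put_pixel helper and the Borland-style MK_FP macro are assumptions for the example, not anything from the answer itself:

    #include <dos.h>
    #include <stdlib.h>

    /* Plot one pixel in VGA mode 13h: 320x200, one byte per pixel at A000h. */
    static void put_pixel(int x, int y, unsigned char color)
    {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
        vram[(unsigned)y * 320 + x] = color;   /* unsigned: offset can exceed 32767 */
    }

    /* Bresenham's line algorithm: integer-only arithmetic, which mattered
       a great deal on FPU-less CPUs like the 8088 or 286. */
    void draw_line(int x0, int y0, int x1, int y1, unsigned char color)
    {
        int dx = abs(x1 - x0), sx = (x0 < x1) ? 1 : -1;
        int dy = -abs(y1 - y0), sy = (y0 < y1) ? 1 : -1;
        int err = dx + dy, e2;

        for (;;) {
            put_pixel(x0, y0, color);
            if (x0 == x1 && y0 == y1)
                break;
            e2 = 2 * err;
            if (e2 >= dy) { err += dy; x0 += sx; }   /* step in x */
            if (e2 <= dx) { err += dx; y0 += sy; }   /* step in y */
        }
    }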



Later, when graphics cards started gaining accelerated functionality, SGI created the IrisGL API, which later became OpenGL, to simplify CAD (computer-aided drafting/design) software development for those graphics cards by providing a standard API that video hardware manufacturers could support and developers could design their software against. OpenGL provided access to the underlying hardware features, and if a feature did not exist on the hardware, OpenGL provided a software implementation.



The same problem existed in games development. Originally, graphics card manufacturers would work with game studios to create a version (release) of a game that would be accelerated by their specific graphics card. Early 3D games like MechWarrior 2 had different releases for the 3dfx Voodoo, S3 ViRGE, STB Velocity, and so on. It was a bit of a mess. At around the same time, Microsoft created the DirectX library for Windows, which was similar to OpenGL. Manufacturers would support OpenGL and/or DirectX, and game studios that were brave enough to abandon DOS for Windows as a platform could then program against one or both libraries instead of creating another release for each individual graphics card they wanted to support. Any relatively minor differences between graphics cards could be handled at runtime in the same release.






Comments:

  • Compatibility with a particular graphics card by third-party vendors was often never a 100% proposition either, particularly if programs decided to go spelunking into a card's I/O registers or use undocumented video modes (see this RC question). Some of my early games refused to work at all in EGA modes on my blazing-fast 12 MHz 286 with an ATI EGA card. – ErikF, Jun 4 at 4:24

  • Also, before 3D cards there were so-called "Windows accelerator" cards that made efforts to improve 2D drawing speed, especially when Windows 3.x became mainstream. – mnem, Jun 4 at 6:38

  • "SGI created OpenGL to simplify game programming" is somewhat misleading. OpenGL was created as an API for professional 3D graphics, such as CAD. Only later was it adopted for video game development. – IMil, Jun 4 at 14:38

  • Also, OpenGL came some years before Direct3D, from the even older predecessor IrisGL. – BlackJack, Jun 4 at 15:26

  • @ErikF Similarly, I had an authentic CGA card that supported modes that later EGA and VGA cards with supposed backwards compatibility did not support. – snips-n-snails, Jun 4 at 18:11

Answer 3 (16 votes):

In DOS you had direct access to the hardware, so you grabbed a good source of information about the card you wanted to support and got down to coding your routines.

A book often cited as a good source was "Programmer's Guide to the EGA, VGA, and Super VGA Cards" by Richard F. Ferraro; I never had the luck to own or read it, but it was fondly remembered by those who did.

Another invaluable source of information was Ralf Brown's Interrupt List; you can find an HTML conversion of the list here: http://www.delorie.com/djgpp/doc/rbinter/

The original was just made of (long) text files, and, if memory serves me correctly, there were some programs to navigate it more easily, at least in the later versions.

Another nice collection of information was the "PC Game Programmer's Encyclopedia", or PC-GPE; an HTML conversion can be found here: http://qzx.com/pc-gpe/

You had at least three different ways to interact with a given piece of hardware: I/O ports, interrupts, and memory-mapped registers. Graphics cards used all three of them.



The situation with audio cards was very similar.



Another thing to consider is that attached to the video card was an analog CRT monitor. The older/cheaper ones could only sync to a fixed set of vertical and horizontal rates, but the newer/better ones could sync to basically any signal within a given range. That meant that, with the right parameters written to the video card registers, you could create custom (or weird) resolutions.

Games aimed for broad compatibility, so they rarely used weird ones, while in the demoscene it was quite common (and custom resolutions were the norm in arcade games too).

But, for example, Mode X was very popular with games!

It was popularized by Michael Abrash in the pages of Dr. Dobb's Journal; you got a 320x240 resolution that, viewed on a 4:3 monitor, meant the pixels were square. So, for example, you could naively draw circles and they would look like circles; in 320x200 they were stretched, as the pixel aspect ratio was not 1:1, and you had to calculate and compensate for that while drawing.



It was a planar mode, so by setting a register you could decide which planes would receive a write to the memory-mapped area. For example, for a quick fill operation you would enable all planes, and a single byte write would affect four pixels (one per plane). That also helped address all 256 KB of VGA memory using only a 64 KB segment.
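
A hedged sketch of that plane-mask trick (the port numbers are the documented VGA Sequencer index/data ports; the Borland-style outportb and MK_FP calls from dos.h are assumptions for the example):

    #include <dos.h>

    /* VGA Sequencer: write the register index to port 3C4h, then the
       value to 3C5h. Index 2 is the Map Mask register; its low four
       bits select which of the four planes accept CPU writes. */
    void set_plane_mask(unsigned char mask)
    {
        outportb(0x3C4, 0x02);
        outportb(0x3C5, mask & 0x0F);
    }

    /* Quick screen fill in a 320x240 planar mode: with all four planes
       enabled, each byte written paints four pixels at once. */
    void fast_fill(unsigned char color)
    {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
        unsigned int i;
        set_plane_mask(0x0F);                    /* all four planes */
        for (i = 0; i < (320 / 4) * 240; i++)    /* 19200 bytes = 76800 pixels */
            vram[i] = color;
    }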



I am positive there was a little utility that let you explore the VGA registers: you could put in whatever values you fancied and, when you applied your settings, finally see whether your monitor supported the resulting output. But my memory is too weak right now to remember the name or the author of that program.

Another common trick was to change part of the color palette during the horizontal retrace; done correctly, you could have more than 256 colours on screen. There was not enough time to change the whole palette on each line, so you had to be creative.

(During the vertical retrace, instead, there was enough time to change every colour, and this was done, for example, for fade-in/fade-out effects.)
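
A hedged sketch of the basic building blocks (3DAh is the documented VGA Input Status port, 3C8h/3C9h are the DAC write ports; the Borland-style inportb/outportb calls are an assumption for the example):

    #include <dos.h>

    /* Bit 3 of the Input Status register (port 3DAh) is set during
       vertical retrace; wait for the start of a fresh one. */
    void wait_vretrace(void)
    {
        while (inportb(0x3DA) & 0x08);      /* wait for current retrace to end */
        while (!(inportb(0x3DA) & 0x08));   /* wait for the next one to begin  */
    }

    /* Program one palette entry: write the index to 3C8h, then the
       red, green and blue components (each 0..63) to 3C9h. */
    void set_palette_entry(unsigned char index,
                           unsigned char r, unsigned char g, unsigned char b)
    {
        outportb(0x3C8, index);
        outportb(0x3C9, r);
        outportb(0x3C9, g);
        outportb(0x3C9, b);
    }

A fade-out is then just a loop that calls wait_vretrace() once per frame and steps every entry a little closer to black.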



(The most popular palette trick was probably changing the background color during tape loading on 8-bit machines, the C64 for example.)

One thing that is often overlooked is that the VGA card was effectively a small three-channel DAC, and creative people found ways to use and abuse that as well.

To a similar effect, Tempest for Eliza used the radio waves emitted by the monitor to transmit a radio signal which could be listened to with a common AM radio.

Whoa! This was a nice trip down memory lane! :)






Comments:

  • I think it's the most complete of the answers as of now. Consider expanding it with "scrolling" (I/O, not the slow INT 10h), as I think it's as relevant to the OP's "to tell graphics cards how and what they want to render" as the palettes. And welcome to the site :) – kubanczyk, Jun 5 at 9:33

  • Another well-known document compilation is vgadoc, which can be found by searching for vgadoc4b. – ninjalj, Jun 8 at 1:22

Answer 4 (11 votes):

In the DOS world, in the golden age of VGA (early to mid '90s), by far the easiest and most popular way to do graphics was the famous mode 13h, a 320x200-pixel linear 256-color paletted video mode. The closest you got to a standard video API was BIOS interrupt 10h, giving access to a handful of functions including switching the video mode and configuring the palette. A resolution of 320x200 (instead of 320x240, which has a 4:3 aspect ratio) was very convenient because the required video memory (64000 bytes) fits in a single 64 kB memory segment, making addressing individual pixels very straightforward. The drawback was that pixels were slightly rectangular instead of perfectly square.



The video memory was mapped to segment A0000h; in the real-mode environment of DOS you could simply form a pointer to that segment and treat it as a 64000-byte array, each byte corresponding to one pixel. Set a byte to a new value, a pixel changes color on the screen. Beyond that, implementing any higher-level drawing functionality was up to you or a 3rd-party library.
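
In Borland-style real-mode C, that looked roughly like the sketch below (MK_FP, int86 and union REGS come from dos.h, and the mode numbers are the standard BIOS ones; this is an illustration, not code from the answer):

    #include <dos.h>
    #include <conio.h>   /* getch() */

    int main(void)
    {
        unsigned char far *screen = (unsigned char far *)MK_FP(0xA000, 0);
        union REGS r;
        int x, y;

        r.x.ax = 0x0013;            /* INT 10h, AH=00h: set video mode 13h */
        int86(0x10, &r, &r);

        for (y = 0; y < 200; y++)   /* one byte per pixel: offset = y*320 + x */
            for (x = 0; x < 320; x++)
                screen[(unsigned)y * 320 + x] = (unsigned char)(x ^ y);  /* test pattern */

        getch();                    /* wait for a key */

        r.x.ax = 0x0003;            /* back to 80x25 text mode */
        int86(0x10, &r, &r);
        return 0;
    }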



For accessing higher-resolution and/or higher color depth modes afforded by SVGA and later video cards, there was (and still is) VBE, or VESA BIOS Extensions, a standard API exposed by the video card's BIOS. But beyond switching the video mode, setting the palette (in paletted modes), and getting a frame buffer into which to plot pixels, you were still pretty much on your own.
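
Setting a VBE mode goes through the same INT 10h interrupt, using function 4F02h; a hedged sketch (Borland-style int86 is an assumption, and 0x101 is the standard VBE number for 640x480 in 256 colors):

    #include <dos.h>

    /* Ask the video BIOS to switch to a VESA BIOS Extensions mode.
       VBE functions live under INT 10h with AH=4Fh; a supported,
       successful call returns AX=004Fh. */
    int vbe_set_mode(unsigned int mode)
    {
        union REGS r;
        r.x.ax = 0x4F02;    /* VBE function 02h: set video mode */
        r.x.bx = mode;      /* e.g. 0x101 = 640x480, 256 colors */
        int86(0x10, &r, &r);
        return r.x.ax == 0x004F;
    }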






Comments:

  • In theory the INT 10h API did include writing pixels. I believe there was even a DOS device driver available that added this API for older graphics cards that didn't support it. It was just about fast enough to animate a few hundred pixels for a graph, but it wouldn't have been used by a game. – Neil, Jun 5 at 12:56

Answer 5 (3 votes):

Early PC computing was the age of banging hardware directly. Your (typically assembler) code wrote directly to control registers and video memory.



As far as supporting multiple video cards goes: real easy - we didn't. The supported hardware was stated right on the spine of the game box; you needed to have that hardware. This wasn't hard: generally nobody expected games to run on an MDA (Monochrome Display Adapter - no "G" for graphics, since it had none). That left CGA and, later, EGA.




It wasn't that we didn't want APIs. I even tried to write APIs. But this was an age when you were counting processor clock cycles. You had to get it done in the cycles available.



  • You couldn't afford the cycles for the subroutine call and return!

  • Let alone all the parameter passing (moving parameters into the standard locations the subroutine wants).

  • Or the de-referencing needed for every iteration of the operation, since you have to use a memory location specified in a call parameter, instead of being able to hard-code it.

And mind you, those limitations applied to my personal API, optimized for that one game. If a third party wrote one all-singing, all-dancing API intended as the one API for all applications, much more would be abstracted, and the overhead above would be much worse. The contrast is sketched in the code below.
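
To make that concrete, here is a hedged, invented illustration of the trade-off being described: the function version pays parameter pushes, a CALL, frame setup and a RET on every single pixel, while the macro version compiles down to a bare memory write:

    /* Function version: every pixel costs call/return overhead -
       painful on a 4.77 MHz 8088. */
    void put_pixel(unsigned char far *vram, int x, int y, unsigned char c)
    {
        vram[(unsigned)y * 320 + x] = c;
    }

    /* Macro version: the "call" vanishes at compile time, and with
       constant coordinates the offset folds into a single constant. */
    #define PUT_PIXEL(vram, x, y, c)  ((vram)[(unsigned)(y) * 320 + (x)] = (c))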



That's less of a problem today, because graphics are so large and complex that API overhead is a smaller fraction of the work. Also, there is CPU to throw at the problem; you can just let the playtesters bump the minimum CPU requirement. We couldn't do that, because most of the market was 4.77 MHz 8088s.






Answer 6 (2 votes):

Like any hardware, a video card occupies some addresses in I/O space and in memory space. It is physically connected to the bus (the ISA bus back in the 1980s). When the CPU writes to one of those memory addresses, the video card answers and accepts the data; when the CPU writes to one of those I/O ports, the same thing happens.

That means software can access the card as long as it knows the card's memory addresses and I/O addresses.



Accessing hardware mapped to a memory address or an I/O port looks like this in assembly:

    MOV [SOME_ADDR], AX   ; store the value of AX at memory address SOME_ADDR
    OUT DX, AX            ; write the value of AX to the I/O port whose number is in DX


The same in C:

    volatile unsigned char far *data = (unsigned char far *)SOME_ADDR;
    data[0] = '1';          /* write '1' to the memory-mapped address */
    outp(SOME_PORT, '1');   /* write '1' to the I/O port */


IBM PC compatible computers had several types of cards:

    • MDA (Monochrome Display Adapter)

    • Hercules

    • CGA

    • EGA

    • VGA

Each card followed a documented standard, and they were largely backward compatible (e.g. a VGA card can emulate CGA). VGA was the most complicated standard; there were huge books about it! The standard declares which addresses you use to access the video card and which data you write to it to show something on the CRT monitor.



So first you need to find out which card you have (you can try to read this data from the memory area filled by the BIOS, or simply ask the user). Then you use the standard to talk to the card.
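
One real example of such a check is the BIOS "get display combination code" call (INT 10h, AX=1A00h), which only VGA-class BIOSes implement; a hedged sketch in Borland-style C (int86 and union REGS from dos.h are assumptions for the example):

    #include <dos.h>

    /* INT 10h, AX=1A00h: Get Display Combination Code. A VGA-compatible
       BIOS returns AL=1Ah and puts the active display type in BL
       (e.g. 01h = MDA, 02h = CGA, 04h = EGA, 08h = VGA color). */
    int have_vga(void)
    {
        union REGS r;
        r.x.ax = 0x1A00;
        int86(0x10, &r, &r);
        return r.h.al == 0x1A;   /* the call exists only on VGA-class BIOSes */
    }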



VGA, for example, had a lot of internal registers. Developers wrote to one I/O port to select a register, then wrote the data for that register to another.



Card memory was memory-mapped, so you simply wrote data to some address (in some modes, some cards had several pages of memory that you could switch between).



But the memory was not always a flat bitmap. There was a character mode (in which each pair of bytes represents a letter and its attributes, such as foreground and background color). There was mode 13h, where each byte represents the color of one pixel. And there were modes with several planes to speed up the card (see https://en.wikipedia.org/wiki/Planar_(computer_graphics)).



Video programming was not easy! Some articles to read:

    • https://wiki.osdev.org/VGA_Hardware

    • http://www.brackeen.com/vga/

There was also a high-level BIOS API, but it was too slow to be used by games.



You may ask: "But how do I render 3D with all of that?" The answer is: you can't.

In the '80s and early '90s you had to render everything on the CPU and then use the video card only to show the finished 2D image.



I really suggest you read the book about how they did it for Wolfenstein 3D:

http://fabiensanglard.net/gebbwolf3d/



One of the first video cards to support a 3D API was the 3dfx Voodoo, whose API was called Glide.






Comments:

  • Hercules was a bit special regarding compatibility. It was MDA -> CGA -> EGA -> VGA that formed the backward-compatible line. Hercules was directly backward compatible with MDA, and through emulation with CGA, but I don't think any of the later cards (EGA/VGA) could emulate Hercules. – Artur Biesiadowski, Jun 6 at 14:05
    Your Answer








    StackExchange.ready(function()
    var channelOptions =
    tags: "".split(" "),
    id: "648"
    ;
    initTagRenderer("".split(" "), "".split(" "), channelOptions);

    StackExchange.using("externalEditor", function()
    // Have to fire editor after snippets, if snippets enabled
    if (StackExchange.settings.snippets.snippetsEnabled)
    StackExchange.using("snippets", function()
    createEditor();
    );

    else
    createEditor();

    );

    function createEditor()
    StackExchange.prepareEditor(
    heartbeatType: 'answer',
    autoActivateHeartbeat: false,
    convertImagesToLinks: false,
    noModals: true,
    showLowRepImageUploadWarning: true,
    reputationToPostImages: null,
    bindNavPrevention: true,
    postfix: "",
    imageUploader:
    brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
    contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
    allowUrls: true
    ,
    noCode: true, onDemand: true,
    discardSelector: ".discard-answer"
    ,immediatelyShowMarkdownHelp:true
    );



    );













    draft saved

    draft discarded


















    StackExchange.ready(
    function ()
    StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fretrocomputing.stackexchange.com%2fquestions%2f11219%2fhow-did-old-ms-dos-games-utilize-various-graphic-cards%23new-answer', 'question_page');

    );

    Post as a guest















    Required, but never shown

























    6 Answers
    6






    active

    oldest

    votes








    6 Answers
    6






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes









    66















    Did every programmer of every game implemented all possible various API's that old graphic cards supported?




    Yes - but it went even deeper than that. Early graphics cards had virtually no callable code associated with them at all, the concept of "drivers" had not quite become a reality yet. There was the concept of a Video BIOS, which were extensions to the INT 10h BIOS video services, that were effectively limited to initialization and switching video modes.



    Instead, graphics cards, at least in DOS land, all had memory mapped display RAM, and extensive documentation was available about exactly how setting various bits in display RAM would affect the pixels that appeared on the screen. There were no drawing APIs to call, if you wanted something to appear on the screen (whether it be a pixel, a character, a line, a circle, a sprite, etc) you would write the code to move the bytes into the right places in display RAM. Entire books were written about how to write efficient code to draw graphics.



    There were some systems like the Borland Graphics Interface that abstracted graphics drawing primitives into an API with different drivers that one could call to draw things on different graphics cards. However, these were typically slower than what would be required for building action type games.



    An action game would typically be optimized for a particular graphics display mode on a particular card. For example, a popular display mode was VGA 640x480 with 16 colors. This would be listed in the software requirements, and you needed to have the right hardware to support the game. If you bought a VGA game but you only had an EGA card, then the game would not work at all. As you said, some games would ask what you had and you had to know what the right answer was, otherwise the game would not work.






    share|improve this answer




















    • 12





      I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac Man - but was well short of what you'd need for full screen scrolling or 3D ray-casting games.

      – Matthew Barber
      Jun 3 at 23:37






    • 22





      Strictly speaking there was an API, BIOS service 10h, but you wouldn’t use it for (most) games! But there certainly was callable code associated with graphics cards (or in the case of MDA and CGA, provided by the system BIOS).

      – Stephen Kitt
      Jun 4 at 6:10







    • 9





      Worth noting that sound cards were the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while also - word processors needed to write hardware code to support specific printers directly, there were no drivers or APIs, no abstraction, just bare metal.

      – J...
      Jun 4 at 15:24






    • 2





      @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper.

      – John Dvorak
      Jun 4 at 20:12






    • 4





      @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font...

      – Wumpus Q. Wumbley
      Jun 4 at 20:33















    66















    Did every programmer of every game implemented all possible various API's that old graphic cards supported?




    Yes - but it went even deeper than that. Early graphics cards had virtually no callable code associated with them at all, the concept of "drivers" had not quite become a reality yet. There was the concept of a Video BIOS, which were extensions to the INT 10h BIOS video services, that were effectively limited to initialization and switching video modes.



    Instead, graphics cards, at least in DOS land, all had memory mapped display RAM, and extensive documentation was available about exactly how setting various bits in display RAM would affect the pixels that appeared on the screen. There were no drawing APIs to call, if you wanted something to appear on the screen (whether it be a pixel, a character, a line, a circle, a sprite, etc) you would write the code to move the bytes into the right places in display RAM. Entire books were written about how to write efficient code to draw graphics.



    There were some systems like the Borland Graphics Interface that abstracted graphics drawing primitives into an API with different drivers that one could call to draw things on different graphics cards. However, these were typically slower than what would be required for building action type games.



    An action game would typically be optimized for a particular graphics display mode on a particular card. For example, a popular display mode was VGA 640x480 with 16 colors. This would be listed in the software requirements, and you needed to have the right hardware to support the game. If you bought a VGA game but you only had an EGA card, then the game would not work at all. As you said, some games would ask what you had and you had to know what the right answer was, otherwise the game would not work.






    share|improve this answer




















    • 12





      I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac Man - but was well short of what you'd need for full screen scrolling or 3D ray-casting games.

      – Matthew Barber
      Jun 3 at 23:37






    • 22





      Strictly speaking there was an API, BIOS service 10h, but you wouldn’t use it for (most) games! But there certainly was callable code associated with graphics cards (or in the case of MDA and CGA, provided by the system BIOS).

      – Stephen Kitt
      Jun 4 at 6:10







    • 9





      Worth noting that sound cards were the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while also - word processors needed to write hardware code to support specific printers directly, there were no drivers or APIs, no abstraction, just bare metal.

      – J...
      Jun 4 at 15:24






    • 2





      @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper.

      – John Dvorak
      Jun 4 at 20:12






    • 4





      @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font...

      – Wumpus Q. Wumbley
      Jun 4 at 20:33













    66












    66








    66








    Did every programmer of every game implemented all possible various API's that old graphic cards supported?




    Yes - but it went even deeper than that. Early graphics cards had virtually no callable code associated with them at all, the concept of "drivers" had not quite become a reality yet. There was the concept of a Video BIOS, which were extensions to the INT 10h BIOS video services, that were effectively limited to initialization and switching video modes.



    Instead, graphics cards, at least in DOS land, all had memory mapped display RAM, and extensive documentation was available about exactly how setting various bits in display RAM would affect the pixels that appeared on the screen. There were no drawing APIs to call, if you wanted something to appear on the screen (whether it be a pixel, a character, a line, a circle, a sprite, etc) you would write the code to move the bytes into the right places in display RAM. Entire books were written about how to write efficient code to draw graphics.



    There were some systems like the Borland Graphics Interface that abstracted graphics drawing primitives into an API with different drivers that one could call to draw things on different graphics cards. However, these were typically slower than what would be required for building action type games.



    An action game would typically be optimized for a particular graphics display mode on a particular card. For example, a popular display mode was VGA 640x480 with 16 colors. This would be listed in the software requirements, and you needed to have the right hardware to support the game. If you bought a VGA game but you only had an EGA card, then the game would not work at all. As you said, some games would ask what you had and you had to know what the right answer was, otherwise the game would not work.






    share|improve this answer
















    Did every programmer of every game implemented all possible various API's that old graphic cards supported?




    Yes - but it went even deeper than that. Early graphics cards had virtually no callable code associated with them at all, the concept of "drivers" had not quite become a reality yet. There was the concept of a Video BIOS, which were extensions to the INT 10h BIOS video services, that were effectively limited to initialization and switching video modes.



    Instead, graphics cards, at least in DOS land, all had memory mapped display RAM, and extensive documentation was available about exactly how setting various bits in display RAM would affect the pixels that appeared on the screen. There were no drawing APIs to call, if you wanted something to appear on the screen (whether it be a pixel, a character, a line, a circle, a sprite, etc) you would write the code to move the bytes into the right places in display RAM. Entire books were written about how to write efficient code to draw graphics.



    There were some systems like the Borland Graphics Interface that abstracted graphics drawing primitives into an API with different drivers that one could call to draw things on different graphics cards. However, these were typically slower than what would be required for building action type games.



    An action game would typically be optimized for a particular graphics display mode on a particular card. For example, a popular display mode was VGA 640x480 with 16 colors. This would be listed in the software requirements, and you needed to have the right hardware to support the game. If you bought a VGA game but you only had an EGA card, then the game would not work at all. As you said, some games would ask what you had and you had to know what the right answer was, otherwise the game would not work.







    share|improve this answer














    share|improve this answer



    share|improve this answer








    edited Jun 6 at 21:39

























    answered Jun 3 at 23:16









    Greg HewgillGreg Hewgill

    2,3591315




    2,3591315







    • 12





      I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac Man - but was well short of what you'd need for full screen scrolling or 3D ray-casting games.

      – Matthew Barber
      Jun 3 at 23:37






    • 22





      Strictly speaking there was an API, BIOS service 10h, but you wouldn’t use it for (most) games! But there certainly was callable code associated with graphics cards (or in the case of MDA and CGA, provided by the system BIOS).

      – Stephen Kitt
      Jun 4 at 6:10







    • 9





      Worth noting that sound cards were the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while also - word processors needed to write hardware code to support specific printers directly, there were no drivers or APIs, no abstraction, just bare metal.

      – J...
      Jun 4 at 15:24






    • 2





      @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper.

      – John Dvorak
      Jun 4 at 20:12






    • 4





      @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font...

      – Wumpus Q. Wumbley
      Jun 4 at 20:33












    • 12





      I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac Man - but was well short of what you'd need for full screen scrolling or 3D ray-casting games.

      – Matthew Barber
      Jun 3 at 23:37






    • 22





      Strictly speaking there was an API, BIOS service 10h, but you wouldn’t use it for (most) games! But there certainly was callable code associated with graphics cards (or in the case of MDA and CGA, provided by the system BIOS).

      – Stephen Kitt
      Jun 4 at 6:10







    • 9





      Worth noting that sound cards were the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while also - word processors needed to write hardware code to support specific printers directly, there were no drivers or APIs, no abstraction, just bare metal.

      – J...
      Jun 4 at 15:24






    • 2





      @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper.

      – John Dvorak
      Jun 4 at 20:12






    • 4





      @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font...

      – Wumpus Q. Wumbley
      Jun 4 at 20:33







    12




    12





    I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac Man - but was well short of what you'd need for full screen scrolling or 3D ray-casting games.

    – Matthew Barber
    Jun 3 at 23:37





    I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac Man - but was well short of what you'd need for full screen scrolling or 3D ray-casting games.

    – Matthew Barber
    Jun 3 at 23:37




    22




    22





    Strictly speaking there was an API, BIOS service 10h, but you wouldn’t use it for (most) games! But there certainly was callable code associated with graphics cards (or in the case of MDA and CGA, provided by the system BIOS).

    – Stephen Kitt
    Jun 4 at 6:10






    Strictly speaking there was an API, BIOS service 10h, but you wouldn’t use it for (most) games! But there certainly was callable code associated with graphics cards (or in the case of MDA and CGA, provided by the system BIOS).

    – Stephen Kitt
    Jun 4 at 6:10





    9




    9





    Worth noting that sound cards were the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while also - word processors needed to write hardware code to support specific printers directly, there were no drivers or APIs, no abstraction, just bare metal.

    – J...
    Jun 4 at 15:24





    Worth noting that sound cards were the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while also - word processors needed to write hardware code to support specific printers directly, there were no drivers or APIs, no abstraction, just bare metal.

    – J...
    Jun 4 at 15:24




    2




    2





    @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper.

    – John Dvorak
    Jun 4 at 20:12





    @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper.

    – John Dvorak
    Jun 4 at 20:12




    4




    4





    @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font...

    – Wumpus Q. Wumbley
    Jun 4 at 20:33





    @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font...

    – Wumpus Q. Wumbley
    Jun 4 at 20:33













    25














    Early on, you had to explicitly code your game for each graphics card you wanted to support: Hercules, CGA, Tandy, EGA, VGA. You had to know how to put the card into graphics mode and you had to know the memory layout, palette, and so on. You had to figure out how to avoid flicker and how to prevent tearing. You had to write your own line drawing and fill routines, and if you wanted 3-D, you had to know how to project it onto a 2-D screen, how to remove hidden lines and so on.



    Later when graphics cards started gaining accelerated functionality, SGI created the IrisGL API, which later became OpenGL, in order to simplify CAD (computer aided drafting/design) software development for those graphics cards by providing a standard API that video hardware manufacturers could support and developers could design their software against. OpenGL provided access to underlying hardware features, and if a feature did not exist on the hardware, OpenGL provided a software implementation.



    The same problem existed in games development. Originally, graphics card manufacturers would work with game studios to create a version (release) of their game that would be accelerated by the specific graphics card. Early 3D games like MechWarrior 2 had different releases for 3dfx Voodoo, S3 ViRGE, STB Velocity, and so on. It was a bit of a mess. At around the same time, Microsoft created the DirectX library for Windows, which was similar to OpenGL. Manufacturers would support OpenGL and/or DirectX, and game studios that were brave enough to abandon DOS for Windows as a platform could then program for one or both libraries instead of creating another release for each individual graphics card they wanted to support. Any relatively minor differences between graphics cards could be handled at runtime in the same release.






    – snips-n-snails
    answered Jun 3 at 23:46, edited Jun 4 at 20:20

      Compatibility with a particular graphics standard by third-party card vendors was often not a 100% proposition either, particularly if programs decided to go spelunking into a card's I/O registers or use undocumented video modes (see this RC question). Some of my early games refused to work at all in EGA modes on my blazing-fast 12 MHz 286 with an ATI EGA card.

      – ErikF
      Jun 4 at 4:24






      Also, before 3D cards there were so-called "Windows Accelerator" cards that made efforts to improve 2D drawing speed, especially when Windows 3.x became mainstream.

      – mnem
      Jun 4 at 6:38







      "SGI created OpenGL to simplify game programming" is somewhat misleading. OpenGL was created as an API for professional 3D graphics, such as CAD. Only later it was adopted for video game development.

      – IMil
      Jun 4 at 14:38






      Also, OpenGL came some years before Direct3D, from an even older predecessor, IRIS GL.

      – BlackJack
      Jun 4 at 15:26











      @ErikF Similarly, I had an authentic CGA card that supported modes that later EGA and VGA cards with supposed backwards compatibility did not support.

      – snips-n-snails
      Jun 4 at 18:11















    16














    In DOS you had direct access to the hardware, so you grabbed a good source of information about the card you wanted to support and got down to coding your routines.



    A book often cited as a good source was "Programmer's Guide to the EGA, VGA, and Super VGA Cards" by Richard F. Ferraro; I never had the luck to own or read it, but it was fondly remembered by those who did.



    Another invaluable source of information was Ralf Brown's Interrupt List; you can find an HTML conversion of the list here: http://www.delorie.com/djgpp/doc/rbinter/



    The original was just a set of (long) text files; if memory serves me correctly, later versions came with some programs to navigate it more easily.



    Another nice collection of information was contained in the "PC Game Programmer's Encyclopedia", or PC-GPE; an HTML conversion can be found here: http://qzx.com/pc-gpe/



    You had at least three different ways to interact with a given piece of hardware: I/O ports, interrupts, and memory-mapped registers. Graphics cards used all three of them.
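
    As a rough illustration, here is a hedged sketch that touches a VGA card through all three channels, assuming a Borland/Microsoft-style 16-bit DOS compiler (int86() and MK_FP() from <dos.h>; outp() as in Microsoft C - Borland spells it outportb()):

        #include <dos.h>    /* int86(), MK_FP(), union REGS */
        #include <conio.h>  /* outp() */

        int main(void) {
            union REGS r;
            unsigned char far *vram;

            /* 1. Interrupt: BIOS INT 10h, AH=00h - switch to mode 13h (320x200, 256 colours) */
            r.x.ax = 0x0013;
            int86(0x10, &r, &r);

            /* 2. I/O ports: program the VGA DAC so palette entry 1 is bright red */
            outp(0x3C8, 1);    /* DAC write index    */
            outp(0x3C9, 63);   /* red (6-bit, 0..63) */
            outp(0x3C9, 0);    /* green              */
            outp(0x3C9, 0);    /* blue               */

            /* 3. Memory-mapped video RAM: plot one pixel at (160,100) */
            vram = (unsigned char far *) MK_FP(0xA000, 0);
            vram[100 * 320 + 160] = 1;   /* the red we just defined */

            return 0;
        }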



    The situation with audio cards was very similar.



    Another thing to consider is that attached to the video card was an analog CRT monitor. The older/cheaper ones were only able to sync to a given set of vertical and horizontal rates; but the newer/best ones were basically able to sync to any signal in a given range. That means that with the right parameters written to the video card registers, you could create some custom (or weird) resolutions.



    Games aimed for broad compatibility, so they rarely used weird ones, while in the demoscene it was quite common (and custom resolutions were the norm in arcade games too.)



    But, for example, Mode X was very popular with games!



    It was popularized by Michael Abrash in the pages of Dr. Dobb's Journal; you got a 320x240 resolution which, viewed on a 4:3 monitor, meant the pixels were square. So you could, for example, naively draw circles and they would look like circles; in 320x200 they were stretched, because the pixel aspect ratio was not 1:1, and you had to calculate and compensate for that while drawing.



    It was a planar mode, so by setting a register you could decide which planes would receive a write to the memory-mapped area. For example, for a quick fill operation you would enable all planes, and a single byte write would affect four pixels (one in each plane). That also helped to address all 256 KB of VGA memory using only a 64 KB segment.
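
    For illustration, a hedged sketch of that all-planes fill, assuming a Borland/Microsoft-style 16-bit DOS compiler and a card already switched into an unchained 320x240 mode (the mode-setting register writes are omitted):

        #include <dos.h>    /* MK_FP() */
        #include <conio.h>  /* outp()  */

        void fill_screen(unsigned char colour) {
            unsigned char far *vram = (unsigned char far *) MK_FP(0xA000, 0);
            unsigned int i;

            outp(0x3C4, 0x02);   /* Sequencer index: Map Mask register */
            outp(0x3C5, 0x0F);   /* enable writes to all four planes   */

            for (i = 0; i < 19200u; i++)   /* 320*240/4 bytes; each write hits 4 pixels */
                vram[i] = colour;
        }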



    I am positive there was a little utility which let you explore the VGA registers: you could put in whatever values you fancied and, when you applied your settings, finally see whether your monitor supported the resulting output. But my memory is too weak right now to recall the name or the author of this program.



    Another common trick was to change a part of the color palette during the horizontal retrace; done correctly, you could have more than 256 colours on screen. There was not enough time to change the whole palette on each line, so you had to be creative.



    (During vertical retrace, by contrast, there was enough time to change every colour; this was used, for example, for fade-in/fade-out effects.)
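
    A hedged sketch of that vertical-retrace palette write, under the same DOS-compiler assumptions (inp()/outp(); Borland spells them inportb()/outportb()). On VGA, bit 3 of the input status register at port 3DAh is set while vertical retrace is in progress:

        #include <conio.h>  /* inp(), outp() */

        /* Change one palette entry during vertical retrace; components are 6-bit (0..63). */
        void set_colour_vblank(unsigned char index,
                               unsigned char r, unsigned char g, unsigned char b) {
            while (inp(0x3DA) & 0x08)    ;   /* if already in retrace, wait it out  */
            while (!(inp(0x3DA) & 0x08)) ;   /* then wait for the next one to begin */

            outp(0x3C8, index);   /* DAC write index */
            outp(0x3C9, r);
            outp(0x3C9, g);
            outp(0x3C9, b);
        }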



    (The most popular palette trick was probably changing the background color during tape loading on 8 bit machines (C64 for example).)



    One thing that is often overlooked is that the VGA card was effectively a small three-channel DAC; creative people found ways to use and abuse that as well.



    To a similar effect, Tempest for Eliza used the radio waves emitted by the monitor to transmit a radio signal which could be listened to with a common AM radio.



    Whoa! This was a nice trip down memory lane! :)






    – RenPic
    answered Jun 4 at 22:42, edited Jun 6 at 9:05 by Stephen Kitt

      I think it's the most complete of the answers as of now. Consider expanding it with "scrolling" (I/O, not the slow INT 10h), as I think it's as relevant to OP's question "to tell graphics cards how and what they want to render" as the palettes. And welcome to the site :)

      – kubanczyk
      Jun 5 at 9:33












      Another well-known document compilation is vgadoc, which can be found by searching for vgadoc4b.

      – ninjalj
      Jun 8 at 1:22















    11














    In the DOS world, in the golden age of VGA (early to mid 90s), by far the easiest and most popular way to do graphics was the famous mode 13h, a 320x200-pixel linear 256-color paletted video mode. The closest you would get to a standard video API was BIOS interrupt 10h, giving access to a handful of functions, including switching the video mode and configuring the palette. A resolution of 320x200 (instead of 320x240, which would have given square pixels at the screen's 4:3 aspect ratio) was very convenient because the required video memory (64000 bytes) fit in a single 64 kB memory segment, making addressing individual pixels very straightforward. The drawback was that pixels were slightly rectangular instead of perfectly square.



    The video memory was mapped at segment A000h (physical address A0000h); in the real-mode environment of DOS you could simply form a pointer to that segment and treat it as a 64000-byte array, each byte corresponding to one pixel. Set a byte to a new value, and a pixel changes color on the screen. Beyond that, implementing any higher-level drawing functionality was up to you or a third-party library.
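
    For illustration, a minimal putpixel along those lines, assuming a 16-bit DOS compiler with far pointers (MK_FP() from <dos.h>); the cast to unsigned matters, because y * 320 + x can exceed what a signed 16-bit int holds:

        #include <dos.h>  /* MK_FP() */

        void put_pixel(int x, int y, unsigned char color) {
            unsigned char far *screen = (unsigned char far *) MK_FP(0xA000, 0);
            screen[(unsigned int) y * 320 + x] = color;  /* one byte per pixel */
        }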



    For accessing the higher-resolution and/or higher color depth modes afforded by SVGA and later video cards, there was (and still is) VBE, the VESA BIOS Extensions: a standard API exposed by the video card's BIOS. But beyond switching the video mode, setting the palette (in paletted modes), and getting a frame buffer into which to plot pixels, you were still pretty much on your own.
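
    Setting a mode through VBE was still just an INT 10h call, function 4F02h; a hedged sketch under the same assumptions (int86() and union REGS from <dos.h>):

        #include <dos.h>  /* int86(), union REGS */

        /* Ask the VESA BIOS to set an SVGA mode, e.g. 0x101 = 640x480 in 256 colors. */
        int set_vbe_mode(unsigned int mode) {
            union REGS r;
            r.x.ax = 0x4F02;          /* VBE function 02h: set video mode   */
            r.x.bx = mode;
            int86(0x10, &r, &r);
            return r.x.ax == 0x004F;  /* AL=4Fh: supported; AH=00h: success */
        }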






    – JohannesD
    answered Jun 4 at 20:44, edited Jun 4 at 21:31

      In theory the INT 10H API did include writing pixels. I believe there was even a DOS device driver available which added this API for older graphics cards that didn't support it. It was just about fast enough to animate a few hundred pixels for a graph, but wouldn't have been used by a game.

      – Neil
      Jun 5 at 12:56

























    3














    Early PC computing was the age of banging hardware directly. Your (typically assembler) code wrote directly to control registers and video memory.



    As far as supporting multiple video cards, real easy: we didn't. The supported hardware was stated right on the spine of the game box; you needed to have that hardware. This wasn't hard: generally nobody expected games to run on an MDA (Monochrome Display Adapter - no G for graphics, since it had none). That left CGA and, later, EGA.




    It wasn't that we didn't want APIs. I even tried to write APIs. But this was an age when you were counting processor clock cycles. You had to get it done in the cycles available.



    • You couldn't afford the cycles for the subroutine call and return!

    • Let alone all the parameter passing (moving parameters into the standard locations the subroutine wants).

    • Or the de-referencing needed for every iteration of the operation, since you have to use a memory location specified in a call parameter, instead of being able to hard-code it.

    And mind you, those limitations applied to my personal API optimized for that game. If a third party entity wrote one all-singing, all-dancing API intended as the one API for all applications, then much more would be abstracted, and the above overhead would be much worse.



    That's less of a problem today, because the graphics are so large and complex that API overhead is a smaller fraction of the work. Also, there is CPU power to throw at the problem; you can just let the playtesters bump the minimum CPU requirement. We couldn't do that, because most of the market was 4.77 MHz 8088s.






    – Harper
    answered Jun 5 at 15:10, edited Jun 5 at 15:37

            2














            Like any other hardware, a video card occupies some addresses in I/O space and in memory space. It is physically connected to the bus (the ISA bus, back in the 1980s). When the CPU writes to one of those memory addresses, the video card responds and accepts the data; when the CPU writes to one of its I/O ports, the same thing happens.



            That means software can access the card as long as it knows the card's memory addresses and I/O ports.



            Accessing hardware mapped to a memory address or an I/O port, in assembly:

            MOV [SOME_ADDR], AX   ; store AX at memory address SOME_ADDR (a memory-mapped register)
            MOV DX, SOME_PORT     ; port numbers above 255 have to go through DX
            OUT DX, AX            ; write AX to I/O port SOME_PORT

            Same with C, using the far pointers and port helpers of 16-bit DOS compilers such as Turbo C:

            unsigned char far *data = (unsigned char far *) MK_FP(SOME_SEG, SOME_OFF); /* MK_FP is in <dos.h> */
            data[0] = 1;        /* write 1 to the memory-mapped address */
            outp(SOME_PORT, 1); /* write 1 to the I/O port (outportb on Borland) */


            IBM PC compatible computers had several types of cards:



            • MDA (Monochrome Display Adapter)


            • Hercules


            • CGA


            • EGA


            • VGA


            Each card had its own documented standard, and they were largely backward compatible (e.g. VGA can emulate CGA). VGA was the most complicated standard; there were huge books about it! The standard declares which addresses you use to access the video card, and which data you write to the card to show something on a CRT monitor.



            So, first you find out which card you have (you can try to read this from the BIOS data area, or simply ask the user). Then you use the standard to talk to the card.



            VGA, for example, had a lot of internal registers. Developers wrote an index to an I/O port to select a register, then wrote the data for that register.
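
            A hedged sketch of that select-then-write pattern (most VGA register groups pair an index port with a data port at the next address; the Attribute Controller at 3C0h is a notable exception). outp() is as in Microsoft C; Borland spells it outportb():

                #include <conio.h>  /* outp() */

                /* Generic VGA "index, then data" register write. */
                void write_vga_reg(unsigned int index_port, unsigned char index, unsigned char value) {
                    outp(index_port, index);      /* select the internal register */
                    outp(index_port + 1, value);  /* then write its new value     */
                }

                /* e.g. write_vga_reg(0x3C4, 0x02, 0x0F) selects Sequencer register 2
                   (the Map Mask) and enables writes to all four memory planes */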



            Card memory was memory-mapped, so you simply wrote data to some address (in some modes, some cards had several pages of memory which you could switch between).



            But the memory was not always a flat array of pixels. There was a character mode (in which each pair of bytes represents a character and its attributes, such as foreground and background color). There was mode 13h, where each byte represented the color of one pixel. And there were modes with several planes, to speed the card up (see
            https://en.wikipedia.org/wiki/Planar_(computer_graphics) )
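
            For instance, in color text mode the character/attribute pairs live at segment B800h; a tiny hedged sketch with the far pointers of a 16-bit DOS compiler:

                #include <dos.h>  /* MK_FP() */

                int main(void) {
                    unsigned char far *text = (unsigned char far *) MK_FP(0xB800, 0);
                    text[0] = 'A';   /* character in the top-left cell       */
                    text[1] = 0x1E;  /* attribute: yellow on blue background */
                    return 0;
                }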



            Video programming was not easy!
            Some articles to read:



            • https://wiki.osdev.org/VGA_Hardware


            • http://www.brackeen.com/vga/


            There was also a high-level BIOS API, but it was too slow to be used by games.



            You may ask: "But how do I render 3D with all of that?" The answer is: you can't.



            In the '80s and early '90s you had to render everything on the CPU and then use the video card only to display the resulting 2D image.



            I really suggest you read the book about how they did it for Wolfenstein 3D:



            http://fabiensanglard.net/gebbwolf3d/



            One of the first video cards to support a 3D API was the 3dfx Voodoo, with its own API called Glide.






              Hercules was a bit special regarding compatibility. The backward-compatible line was MDA -> CGA -> EGA -> VGA. Hercules was directly backwards compatible with MDA, and through emulation with CGA, but I don't think any of the later cards (EGA/VGA) was able to emulate Hercules.

              – Artur Biesiadowski
              Jun 6 at 14:05















            2














            Like any hardware, video card may have some address in I/O space and memory space.
            They are physically connected to bus (ISA bus back in 1980th).
            When CPU writes to some memory address, videocard answers this and accepts data.
            When CPU writes to some IO, same thing happens.



            That means software may access it if it is aware of it's memory address and IO address.



            Accessing some hardware mapped to some memory address:



            MOV SOME_ADDR, AX ; Move value of AX register to address SOME_ADDRESS
            OUT SOME_PORT, AX; Move value of AX to some port


            Same with C



            int* data = SOME_ADDR;
            data[0] = '1'; //write '1' to SOME_ADDR.
            outp(PORT, '1'); //to io


            IBM PC compatible computers had several types of cards:



            • Monochrome Graphic Adapter


            • Hercules


            • CGA


            • EGA


            • VGA


            Each card had its own standard which was documented. All of them were backward compatible (i.e. VGA may emulate CGA).
            VGA was the most complicated standard, there were huge books about it!
            Standard declares which address should you use to access video card and which data should you write to card to show something on CRT monitor.



            So, first you need to find out which card do you have (you can try to read this data from memory area filled by BIOS or ask user).
            Then, you use standard to talk to card.



            VGA, for example, had a lot of internal registers. Developers wrote some data to IO port to select register, then wrote data to this register.



            Card memory was mapped, so you simply wrote data to some address (in some modes some cards had several pages of memory which you can switch).



            But memory was not always plane. There was a character mode (in which each 2 bytes represent letter and it's attributes like color and bgcolor).
            There was 13h mode where each byte represented color of pixel.
            There were modes with several planars to speed up the card (see
            https://en.wikipedia.org/wiki/Planar_(computer_graphics) )



            Video programming was not easy!
            Some articles to read:



            • https://wiki.osdev.org/VGA_Hardware


            • http://www.brackeen.com/vga/


            There was also high level BIOS api, but it was too slow to be used by games.



            You may ask: "But how do I render 3D with all of that?".
            The answer is: you can't.



            In 80th and early 90th you had to render everything on CPU and then use video card API to show 2D image.



            I really suggest you t read book about how they did it for Wolf3d:



            http://fabiensanglard.net/gebbwolf3d/



            The first video card that supported SOME 3D APIs was "Voodo 3DFX". It had API called "Glide".






            share|improve this answer


















            • 1





              Hercules was bit special regarding compatibility. It was MDA->CGA->EGA->VGA which was backward compatible. Hercules was directly backwards compatible with MDA and through emulation with CGA, but I don't think that any of later cards (EGA/VGA) was able to emulate Hercules.

              – Artur Biesiadowski
              Jun 6 at 14:05













            2












            2








            2






