Introduction

Just as JPEG compression allows more images to fit on a disk, texture compression allows more textures to fit in graphics memory, which is particularly important on mobile platforms. The Mali GPU has built-in hardware texture decompression, allowing textures to remain compressed in graphics memory while the required samples are decompressed on the fly. Using compressed textures drastically reduces the memory bandwidth an application requires, which increases performance and reduces power consumption.

ETC

Ericsson Texture Compression, or ETC, is an open standard supported by Khronos and widely used on mobile platforms. It is a lossy algorithm designed for perceptual quality, exploiting the fact that the human eye is more responsive to changes in luminance than in chrominance.

One minor deficiency of ETC v1 is that textures compressed in this format lose any alpha channel information and therefore cannot have transparent areas. As there are quite a few clever and interesting things that can be done using alpha channels in textures, this has led many developers to use other texture compression algorithms, many of which are proprietary formats with limited hardware support.

Supporting Alpha Channels

All the methods and techniques described below are available in complete source code form in the OpenGL ES SDK for Linux on ARM and the OpenGL ES SDK for Android.

There are a number of techniques that can be used to support transparency in applications while still using ETC compression. This page discusses these techniques.

Extracting Alpha Channels

The first step in any of these methods is extracting the alpha channel from your textures. Since the alpha channel is not packed into the compressed texture, it has to be delivered alongside it. The alpha channel can be extracted with most graphics programs, but doing this by hand for every texture would be quite arduous, so support for it is provided in the Mali GPU Texture Compression Tool. Whether, and how, the alpha channel is extracted is selected by choosing an Alpha handling option in the Compression Options dialog (see below).

Mali GPU Texture Compression Tool

Method 1: Texture Atlas

The alpha channel is converted to a visible grayscale image, which is then concatenated onto the original texture, making the overall texture graphic taller.

Texture Atlas

Benefits

Only one file (Minimal changes to texture loading code)

Minor changes to shader code needed (Scaling)

Drawbacks

Texture samples will only wrap correctly in one direction

Scaling slows down shader execution

Method

This is the easiest method to implement: once the texture atlas image has been compressed, the only change required in your code is a remapping of texture coordinates in the shader, such that:
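(The two snippets below are a GLSL ES sketch of the idea; the identifier names u_s2dTexture and v_v2TexCoord are illustrative, not necessarily those used in the SDK sample.)

    gl_FragColor = texture2D(u_s2dTexture, v_v2TexCoord);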

becomes
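    /* Sketch: sample colour from one half of the atlas, the alpha mask from the other. */
    vec2 v2ColourCoord = v_v2TexCoord * vec2(1.0, 0.5);
    vec2 v2AlphaCoord  = v2ColourCoord + vec2(0.0, 0.5);
    vec4 v4Colour      = texture2D(u_s2dTexture, v2ColourCoord);
    v4Colour.a         = texture2D(u_s2dTexture, v2AlphaCoord).r;
    gl_FragColor       = v4Colour;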

This scales the texture coordinates so that colour samples come from one half of the atlas, then offsets them by half the image height for a second sample from the half containing the alpha data. This example uses the red channel of the grayscale mask to set the alpha channel.

More practically, the coordinate arithmetic can be moved out of the fragment shader by adding a second varying value, making the vertex shader look like this:
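A minimal vertex shader sketch, again with illustrative attribute, uniform and varying names:

    attribute vec4 a_v4Position;
    attribute vec2 a_v2TexCoord;

    uniform mat4 u_m4Projection;

    varying vec2 v_v2ColourCoord;
    varying vec2 v_v2AlphaCoord;

    void main()
    {
        /* Colour data occupies one half of the atlas, the alpha mask the other. */
        v_v2ColourCoord = a_v2TexCoord * vec2(1.0, 0.5);
        v_v2AlphaCoord  = v_v2ColourCoord + vec2(0.0, 0.5);
        gl_Position     = u_m4Projection * a_v4Position;
    }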

The fragment shader can then use the two varying coordinates:
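A matching fragment shader sketch:

    precision mediump float;

    uniform sampler2D u_s2dTexture;

    varying vec2 v_v2ColourCoord;
    varying vec2 v_v2AlphaCoord;

    void main()
    {
        /* One sample for the colour, one for the alpha mask stored in the red channel. */
        vec4 v4Colour = texture2D(u_s2dTexture, v_v2ColourCoord);
        v4Colour.a    = texture2D(u_s2dTexture, v_v2AlphaCoord).r;
        gl_FragColor  = v4Colour;
    }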

This uses a little more bandwidth for the extra varying vec2, but makes better use of pipelining, particularly as most developers tend to do more work in their fragment shaders than their vertex shaders. With these minor changes most applications should run fine with the texture atlas files.

A complete source code listing for this method is provided in the OpenGL ES SDK example “ETCAtlasAlpha”.

However, there are cases where you might want to maintain the ability to wrap a texture over a large area. For that, there are the other two methods, discussed below.

Method 2: Separately Packed Alpha

The alpha channel is delivered as a second packed texture. Both textures are then combined in the shader code.

Separately Packed Alpha

 

Benefits

More flexible, allows alpha/colour channels to be mixed and matched

Allows for texture wrapping in both directions

Drawbacks

Requires a second texture sampler in the shader.

Method

To create compressed images suitable for use with this method, select “Create separate compressed image” in the Texture Compression Tool.

When loading the second texture, call glActiveTexture(GL_TEXTURE1) before glBindTexture and glCompressedTexImage2D in order to ensure that the alpha texture is bound to a different texture unit from the colour texture.
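A sketch of the loading call sequence, assuming the compressed alpha data (pAlphaData, imageSize, width and height) has already been read from its file; these names are illustrative, and GL_ETC1_RGB8_OES requires the GL_OES_compressed_ETC1_RGB8_texture extension:

    GLuint uiAlphaTexture = 0;

    /* Select texture unit 1 so the alpha texture does not displace the colour texture. */
    glActiveTexture(GL_TEXTURE1);
    glGenTextures(1, &uiAlphaTexture);
    glBindTexture(GL_TEXTURE_2D, uiAlphaTexture);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_ETC1_RGB8_OES,
                           width, height, 0, imageSize, pAlphaData);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);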

When setting shader uniform variables, allocate a second texture sampler and bind it to the second texture unit:
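For example (the program handle uiProgram and the sampler uniform names are illustrative):

    GLint iColourSampler = glGetUniformLocation(uiProgram, "u_s2dColourTexture");
    GLint iAlphaSampler  = glGetUniformLocation(uiProgram, "u_s2dAlphaTexture");

    glUseProgram(uiProgram);
    glUniform1i(iColourSampler, 0);   /* Colour texture on texture unit 0. */
    glUniform1i(iAlphaSampler, 1);    /* Alpha texture on texture unit 1.  */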

Then, inside the fragment shader, once again merge the two samples, this time taken from the two different textures:
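A fragment shader sketch along these lines (identifier names again illustrative):

    precision mediump float;

    uniform sampler2D u_s2dColourTexture;
    uniform sampler2D u_s2dAlphaTexture;

    varying vec2 v_v2TexCoord;

    void main()
    {
        vec4 v4Colour = texture2D(u_s2dColourTexture, v_v2TexCoord);
        /* The packed alpha image is grayscale, so any colour channel holds the mask. */
        v4Colour.a   = texture2D(u_s2dAlphaTexture, v_v2TexCoord).r;
        gl_FragColor = v4Colour;
    }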

A complete source code listing for this method is provided in the OpenGL ES SDK example “ETCCompressedAlpha”.

Method 3: Separate Raw Alpha

The alpha channel is provided as a raw 8-bit single-channel image, which is combined with the colour texture in the shader.

Separate Raw Alpha

 

Benefits

More flexible, allows alpha/colour information to be mixed and matched

Allows uncompressed alpha, in case lossy ETC1 compression caused artefacts

Drawbacks

Requires a second texture sampler in the shader

Uncompressed alpha takes up more space and memory bandwidth than compressed (although still far less than an uncompressed RGBA texture)

Method

To create alpha images suitable for use with this method, select the alpha handling option that creates a separate uncompressed image in the Texture Compression Tool. Depending on which other options are selected, this will produce either a single PGM file, a PGM file for each mipmap level, or a single KTX file containing all mipmap levels as uncompressed data.

The PGM format is described at http://netpbm.sourceforge.net/doc/pgm.html, and the KTX format at http://www.khronos.org/opengles/sdk/tools/KTX/file_format_spec/.

Implement a new function to load and bind this texture; because the data is uncompressed, the loading code is fairly trivial:
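A sketch of such a loader, assuming the raw 8-bit mask (pAlphaData, width and height) has already been read from the PGM file; GL_LUMINANCE replicates the single channel into red, green and blue:

    GLuint uiAlphaTexture = 0;

    glGenTextures(1, &uiAlphaTexture);
    glBindTexture(GL_TEXTURE_2D, uiAlphaTexture);
    /* Rows of single-byte texels are tightly packed. */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, pAlphaData);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);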

This allows the two textures to be loaded into separate active texture units, as before:
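For example, with hypothetical helper functions wrapping the compressed and raw loaders described above:

    glActiveTexture(GL_TEXTURE0);
    loadCompressedColourTexture("texture.pkm");   /* ETC1 colour data on unit 0.     */

    glActiveTexture(GL_TEXTURE1);
    loadRawAlphaTexture("texture.pgm");           /* Raw 8-bit alpha mask on unit 1. */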

Once again, pass the two textures to the fragment shader as separate samplers:
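The uniform setup mirrors Method 2 (names illustrative):

    glUseProgram(uiProgram);
    glUniform1i(glGetUniformLocation(uiProgram, "u_s2dColourTexture"), 0);
    glUniform1i(glGetUniformLocation(uiProgram, "u_s2dAlphaTexture"), 1);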

Then inside the fragment shader merge the two samples, again from the two different textures:
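A sketch of the merge, which is essentially the same as in Method 2; with a GL_LUMINANCE upload the mask can be read from the red channel (illustrative names):

    vec4 v4Colour = texture2D(u_s2dColourTexture, v_v2TexCoord);
    v4Colour.a    = texture2D(u_s2dAlphaTexture, v_v2TexCoord).r;
    gl_FragColor  = v4Colour;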

A complete source code listing for this method is provided in the OpenGL ES SDK example “ETCUncompressedAlpha”.

 

OpenGL ES SDKs

Added: 24 September 2010