- 17 Jan, 2016 (9 commits)

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

- 16 Jan, 2016 (4 commits)

Committed by Sean Barrett

Committed by Sean Barrett
minor tweak to get_samples_interleaved documentation

Committed by Sean Barrett
dummy definitions for malloc et al (note: you have to modify the source to make this work anyway); tweak credits change

- 14 Jan, 2016 (1 commit)

Committed by Romain Bailly

- 13 Jan, 2016 (2 commits)

Committed by Sean Barrett

Committed by Sean Barrett

- 04 Jan, 2016 (1 commit)

Committed by Sean Barrett

- 17 Dec, 2015 (1 commit)

Committed by Sean Barrett

- 16 Dec, 2015 (1 commit)

Committed by Sean Barrett

- 06 Dec, 2015 (4 commits)

Committed by Daniel Gibson
I claimed that if the most significant bit of a 16-bit pixel is set, the pixel should be opaque (as some sources on the internet suggest), but implemented the opposite. If implemented "correctly", lots of 16-bit TGAs become invisible, so it seems 16-bit TGAs aren't really supposed to have an alpha channel; at least most 16-bit TGAs in the wild (despite setting an "alpha bit" in the "image descriptor byte") don't work like that. So just assume 16-bit non-greyscale TGAs are always STBI_rgb without an alpha channel.

Committed by Daniel Gibson
* Calculate the correct stb format (incl. proper 16-bit support) also when using a colormap (palette)
* Create the colormap with tga_comp, to correctly support 16-bit RGB (instead of using tga_palette_bits/8 and just copying the data)
* For TGAs with a colormap, the TGA bits-per-pixel field specifies the size of an index into the colormap; the "real" color depth of the image is stored in the color map specification's bits-per-pixel field. Only 8- and 16-bit indices seem to make sense (16 should be supported, otherwise the colormap length could be u8 instead of u16), so support was added for both.
* Helper functions stbi__tga_get_comp() to calculate the stb pixel format and stbi__tga_read_rgb16() to read one 16-bit pixel and convert it to 24/32-bit RGB(A), for less duplicated code

Committed by Daniel Gibson
* For paletted images, .._info()'s comp should be based on the palette's bits per pixel, not the image's bits per pixel (which describes the size of an index into the palette and is also checked now)
* Make sure the color (map) type and the image type fields of the header are consistent (i.e. if the TGA color-map type is 1 for paletted, the TGA image type must be 1 or 9)
* .._test() does some more checks and uses stbi__get16le() instead of stbi__get16be(): TGA is little-endian
* .._test() now always rewinds (sometimes it used to just return 0 without rewinding)
* Remove the "error check" at the beginning of stbi__tga_load(), because all of that is already tested in stbi__tga_test()

Committed by Daniel Gibson
stbi__tga_* assumed that 16-bit TGAs were greyscale + alpha. However, if the TGA image type is not one of the greyscale ones, it's 16-bit RGB data with 5 bits per channel. If the TGA image descriptor field has alpha bits set (the 3 least significant ones), the pixel's most significant bit carries the alpha: 1 for opaque, 0 for translucent. Furthermore, TGAs can reportedly also claim to have 15 bpp, which is the same as 16 bpp but definitely without alpha. So 15/16-bpp TGAs are now decoded to STBI_rgb(_alpha).

- 30 Nov, 2015 (2 commits)

Committed by Sean Barrett

Committed by Sean Barrett

- 21 Nov, 2015 (2 commits)

Committed by Sean Barrett

Committed by Sean Barrett

- 14 Nov, 2015 (1 commit)

Committed by baldurk

- 12 Nov, 2015 (2 commits)

Committed by Sean Barrett

Committed by Sean Barrett

- 09 Nov, 2015 (9 commits)

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

Committed by Sean Barrett

- 08 Nov, 2015 (1 commit)

Committed by Sean Barrett