console: Fix rendering of VGA underline
vga_putcharxy()'s underline code sets font_data to 0xffff instead of 0xff. vga_putcharxy() then reads dmask16[0xffff >> 4] and dmask4[0xffff >> 6]. In practice, these out-of-bounds subscripts "only" put a few crap bits into the display surface.

For 32 bit pixels, there's no array access; font_data's extra bits go straight into the display surface.

Broken when commit 6d6f7c28 implemented underline.

Spotted by Coverity.

Signed-off-by: Markus Armbruster <armbru@redhat.com>
Signed-off-by: Anthony Liguori <aliguori@us.ibm.com>
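For illustration, a minimal self-contained C sketch (not the QEMU source; the table sizes of 16 and 4 entries are inferred from the nibble and pair indexing quoted above) showing why 0xffff drives the mask-table lookups out of bounds while 0xff stays in range:

    /* Sketch only: the names dmask16, dmask4 and the values come from
     * the commit message; the table sizes are assumptions, not a copy
     * of QEMU's console.c. */
    #include <stdio.h>
    #include <stdint.h>

    static uint32_t dmask16[16]; /* one entry per 4-bit glyph-row nibble */
    static uint32_t dmask4[4];   /* one entry per 2-bit glyph-row pair   */

    int main(void)
    {
        uint32_t buggy = 0xffff; /* what the underline code stored    */
        uint32_t fixed = 0xff;   /* an 8-bit font row, all pixels set */

        /* Indices the renderer computes for the leftmost pixels: */
        printf("buggy: dmask16[%u], dmask4[%u]\n",
               (unsigned)(buggy >> 4), (unsigned)(buggy >> 6));
        /* -> dmask16[4095] and dmask4[1023], far past both ends */
        printf("fixed: dmask16[%u], dmask4[%u]\n",
               (unsigned)(fixed >> 4), (unsigned)(fixed >> 6));
        /* -> dmask16[15] and dmask4[3], the last valid entries */

        (void)dmask16; /* tables exist here only to show their bounds */
        (void)dmask4;
        return 0;
    }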