I have an array of unsigned bytes holding the pixels of the letter q. I want to write these pixels into an OpenGL texture with glTexImage2D. In the code below I have also added a few checks to make sure the pixel data is valid: I check that width times height matches the data length, and I even print the pixels to the terminal. (The code is written in Rust, but the GL calls are the same as in any other binding.)
// create the array of pixels
let data: &[u8] = &[
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 7, 37, 2, 0, 0, 0, 0, 0, 0, 0, 119,
    248, 252, 241, 92, 223, 130, 0, 0, 0, 84, 253, 108, 8, 36, 202, 248, 130, 0, 0, 0,
    198, 182, 0, 0, 0, 52, 255, 130, 0, 0, 0, 241, 120, 0, 0, 0, 0, 245, 130, 0, 0, 3,
    252, 108, 0, 0, 0, 0, 233, 130, 0, 0, 0, 223, 143, 0, 0, 0, 14, 253, 130, 0, 0, 0,
    144, 234, 20, 0, 0, 126, 255, 130, 0, 0, 0, 23, 219, 223, 142, 173, 194, 231, 130,
    0, 0, 0, 0, 18, 120, 156, 108, 13, 223, 130, 0, 0, 0, 0, 0, 0, 0, 0, 0, 223, 130,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 223, 130, 0, 0, 0, 0, 0, 0, 0, 0, 0, 149, 87, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0,
];
// check that WxH equals the number of pixels
let data_w = 11;
let data_h = 15;
assert_eq!(data_w * data_h, data.len());
// write the pixels to the texture
glBindTexture(GL_TEXTURE_2D, texture);
// initialize the texture
glTexImage2D(
    GL_TEXTURE_2D,
    0,                // mipmap level
    GL_RED as _,      // internal format: one red byte per texel
    256,              // texture width
    256,              // texture height
    0,                // border (must be 0)
    GL_RED,           // format of the source data
    GL_UNSIGNED_BYTE, // type of the source data
    ptr::null(),      // no data yet, just allocate storage
);
// write the pixels
glTexSubImage2D(
    GL_TEXTURE_2D,
    0,                // mipmap level
    0,                // x offset
    0,                // y offset
    data_w as _,      // width of the region being written
    data_h as _,      // height of the region being written
    GL_RED,           // source data has a single channel
    GL_UNSIGNED_BYTE, // one byte per texel
    data.as_ptr() as _,
);
// print the pixels to the terminal
let mut counter = 0;
for _ in 0..data_h {
    for _ in 0..data_w {
        if data[counter] > 100 {
            print!("+");
        } else {
            print!(" ");
        }
        counter += 1;
    }
    println!();
}
The output of the terminal test is:
   ++++ ++
   ++  +++
  ++    ++
  ++    ++
  ++    ++
  ++    ++
  ++   +++
   +++++++
    +++ ++
        ++
        ++
         +
So the problem is clearly not in the pixel data.
However, when I render the texture to the screen, it looks like this:
Why does it not render correctly?
Answer (posted 2022-06-26 18:14:42):
It turns out that OpenGL applies byte alignment to each row of pixel data as it unpacks it into the texture. GL_UNPACK_ALIGNMENT defaults to 4, so each 11-byte row of this GL_RED/GL_UNSIGNED_BYTE image is assumed to occupy 12 bytes, and every row after the first is read from a progressively shifted offset. The solution is to add this line before the glTexImage2D call:
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
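For context, here is a minimal sketch (not a complete program) of how the fix slots into the upload code from the question, assuming the same raw GL bindings and the same texture, data, data_w, and data_h values as above:

// tell GL that source rows are tightly packed rather than padded to 4 bytes
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glBindTexture(GL_TEXTURE_2D, texture);
// allocate the empty 256x256 single-channel texture
glTexImage2D(GL_TEXTURE_2D, 0, GL_RED as _, 256, 256, 0, GL_RED, GL_UNSIGNED_BYTE, ptr::null());
// upload the 11x15 glyph into the top-left corner; GL now steps exactly 11 bytes per row
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, data_w as _, data_h as _, GL_RED, GL_UNSIGNED_BYTE, data.as_ptr() as _);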
Link to the glPixelStorei documentation.
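To make the arithmetic concrete: the stride GL uses per source row while unpacking is the row size in bytes rounded up to a multiple of GL_UNPACK_ALIGNMENT. The helper below is purely illustrative (unpack_row_stride is not a GL function), but it shows why an 11-pixel-wide, one-byte-per-texel image is misread with the default alignment of 4:

/// Bytes GL advances per source row while unpacking: the row size in
/// bytes, rounded up to the next multiple of GL_UNPACK_ALIGNMENT.
fn unpack_row_stride(width: usize, bytes_per_pixel: usize, alignment: usize) -> usize {
    let row_bytes = width * bytes_per_pixel;
    (row_bytes + alignment - 1) / alignment * alignment
}

fn main() {
    // Default alignment of 4: an 11-byte row is treated as 12 bytes,
    // so each successive row of the glyph is read one byte too far.
    assert_eq!(unpack_row_stride(11, 1, 4), 12);
    // With GL_UNPACK_ALIGNMENT set to 1, rows are read tightly packed.
    assert_eq!(unpack_row_stride(11, 1, 1), 11);
}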
https://stackoverflow.com/questions/72764105