Games101--Assignment3


Assignment 3 task requirements

1. Modify rasterize_triangle(const Triangle& t) in rasterizer.cpp: implement an interpolation scheme similar to Assignment 2 here, interpolating the normal, color, and texture color.
2. Modify get_projection_matrix() in main.cpp: fill in the projection matrix you implemented in the earlier assignments; you can then run ./Rasterizer output.png normal to check the normal visualization.
3. Modify phong_fragment_shader() in main.cpp: implement the Blinn-Phong model to compute the fragment color.
4. Modify texture_fragment_shader() in main.cpp: on top of Blinn-Phong, treat the texture color as kd in the formula to implement the Texture Shading Fragment Shader.
5. Modify bump_fragment_shader() in main.cpp: on top of Blinn-Phong, read the comments in the function carefully and implement bump mapping.
6. Modify displacement_fragment_shader() in main.cpp: on top of bump mapping, implement displacement mapping.

Assignment 3 FAQ

http://games-cn.org/forums/topic/frequently-asked-questionskeep-updating/

(1) For the bump-mapping part, h(u,v) = texture_color(u,v).norm(), where u, v are the tex_coords and w, h are the width and height of the texture.
(2) In rasterizer.cpp, v = t.toVector4().
(3) eye_fov in get_projection_matrix should be converted to radians.
(4) The modified normal in bump and displacement still needs to be normalized.
(5) Eigen methods you may need: norm(), normalized(), cwiseProduct() (see the short sketch after this list).
(6) When implementing h(u+1/w, v), write it as h(u+1.0/w, v).
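
The Eigen calls in item (5) behave as follows; a minimal standalone sketch with illustrative values:

#include <Eigen/Dense>

int main()
{
    Eigen::Vector3f v(3.0f, 4.0f, 0.0f);
    float len = v.norm();                      // Euclidean length, here 5
    Eigen::Vector3f dir = v.normalized();      // unit-length copy, here (0.6, 0.8, 0)
    Eigen::Vector3f kd(0.5f, 0.5f, 0.5f), I(2.0f, 4.0f, 6.0f);
    Eigen::Vector3f Ld = kd.cwiseProduct(I);   // component-wise product, here (1, 2, 3)
    return 0;
}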

Steps

1. Interpolating the normal, color, and texture color

First compute the barycentric coordinates (\alpha, \beta, \gamma) of the point inside the triangle, using the formula from the lectures. These coordinates are then used to interpolate every attribute as a linear combination: if the three vertices carry attributes V_A, V_B, V_C, the interpolated attribute is

V=\alpha V_A+\beta V_B+\gamma V_C
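
A minimal sketch of this linear combination (illustrative only, assuming <Eigen/Dense> is included; the framework's rasterizer.cpp provides comparable interpolate() helpers):

// Illustrative: interpolate one per-vertex attribute with barycentric weights.
static Eigen::Vector3f interpolate_attr(float alpha, float beta, float gamma,
                                        const Eigen::Vector3f& VA,
                                        const Eigen::Vector3f& VB,
                                        const Eigen::Vector3f& VC)
{
    return alpha * VA + beta * VB + gamma * VC;
}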

//Screen space rasterization
void rst::rasterizer::rasterize_triangle(const Triangle& t, const std::array<Eigen::Vector3f, 3>& view_pos) 
{
    // TODO: From your HW3, get the triangle rasterization code.
    // TODO: Inside your rasterization loop:
    //    * v[i].w() is the vertex view space depth value z.
    //    * Z is interpolated view space depth for the current pixel
    //    * zp is depth between zNear and zFar, used for z-buffer

    // float Z = 1.0 / (alpha / v[0].w() + beta / v[1].w() + gamma / v[2].w());
    // float zp = alpha * v[0].z() / v[0].w() + beta * v[1].z() / v[1].w() + gamma * v[2].z() / v[2].w();
    // zp *= Z;

    // TODO: Interpolate the attributes:
    // auto interpolated_color
    // auto interpolated_normal
    // auto interpolated_texcoords
    // auto interpolated_shadingcoords

    // Use: fragment_shader_payload payload( interpolated_color, interpolated_normal.normalized(), interpolated_texcoords, texture ? &*texture : nullptr);
    // Use: payload.view_pos = interpolated_shadingcoords;
    // Use: Instead of passing the triangle's color directly to the frame buffer, pass the color to the shaders first to get the final color;
    // Use: auto pixel_color = fragment_shader(payload);
    auto v = t.toVector4();
    int left=MIN(v[0].x(),MIN(v[1].x(),v[2].x()))-1;
    int right=MAX(v[0].x(),MAX(v[1].x(),v[2].x()))+1;
    int bottom=MIN(v[0].y(),MIN(v[1].y(),v[2].y()))-1;
    int top=MAX(v[0].y(),MAX(v[1].y(),v[2].y()))+1;
    //int top=MAX(v[0].y(),MIN(v[1].y(),v[2].y()))+1;//hollowed-out effect: any bounding box smaller than the triangle (not necessarily via top) leaves some covered pixels unsampled
    for(int x=left;x<=right;x++){
        for(int y=bottom;y<=top;y++){
            if(insideTriangle(x,y,t.v)){
                auto[alpha, beta, gamma]=computeBarycentric2D(x, y, t.v);
                float Z = 1.0/(alpha / v[0].w() + beta / v[1].w() + gamma / v[2].w());
                float zp= alpha * v[0].z() / v[0].w() + beta * v[1].z() / v[1].w() + gamma * v[2].z() / v[2].w();
                zp*=Z; 
                if(zp<depth_buf[get_index(x,y)]){
                    depth_buf[get_index(x,y)]=zp;
                    auto interpolated_color=Vector3f(alpha*t.color[0]+beta*t.color[1]+gamma*t.color[2]);
                    auto interpolated_normal=Vector3f(alpha*t.normal[0]+beta*t.normal[1]+gamma*t.normal[2]).normalized();
                    auto interpolated_texcoords=Vector2f(alpha*t.tex_coords[0]+beta*t.tex_coords[1]+gamma*t.tex_coords[2]);
                    auto interpolated_shadingcoords=Vector3f(alpha*view_pos[0]+beta*view_pos[1]+gamma*view_pos[2]);
                      
                    fragment_shader_payload payload( interpolated_color, interpolated_normal.normalized(), interpolated_texcoords, texture ? &*texture : nullptr);
                    payload.view_pos = interpolated_shadingcoords;
                    auto pixel_color = fragment_shader(payload);
                    set_pixel(Vector2i(x,y),pixel_color);
                }
            }
        }   
    }
}
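
Note that the attribute interpolation above uses the screen-space barycentric weights directly, which is enough for this assignment. A perspective-correct variant would weight each vertex attribute by its view-space depth and rescale by the interpolated Z, roughly like the hypothetical helper below (interp_perspective is not part of the framework):

// Hypothetical helper: perspective-correct interpolation of one Vector3f attribute.
// wA/wB/wC are the vertices' view-space depths (v[i].w() in the loop above) and
// Z is the interpolated depth 1 / (alpha/wA + beta/wB + gamma/wC).
static Eigen::Vector3f interp_perspective(float alpha, float beta, float gamma, float Z,
                                          const Eigen::Vector3f& A,
                                          const Eigen::Vector3f& B,
                                          const Eigen::Vector3f& C,
                                          float wA, float wB, float wC)
{
    return Z * (alpha * A / wA + beta * B / wB + gamma * C / wC);
}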

2. Projection matrix

This is still the projection matrix from Assignment 2, with eye_fov converted from degrees to radians as noted in the FAQ.

Eigen::Matrix4f get_projection_matrix(float eye_fov, float aspect_ratio, float zNear, float zFar)
{
    // TODO: Use the same projection matrix from the previous assignments
    Eigen::Matrix4f projection = Eigen::Matrix4f::Identity();
    float n=zNear,f=zFar;
    float t=-n*tan(eye_fov/180.0*3.1415926/2.0);//eye_fov is given in degrees, so convert to radians (FAQ 3); -n because of the left-handed convention used in the framework
    float r=t*aspect_ratio;
    float b=-t,l=-r;
    projection<<(2.0*n)/(r-l),0,(r+l)/(l-r),0,
    0,(2.0*n)/(t-b),(t+b)/(b-t),0,
    0,0,(n+f)/(n-f),(2.0*n*f)/(f-n),
    0,0,1,0;
    return projection;
}
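
For reference, the framework passes the field of view in degrees, with a call roughly like the one below (exact parameter values may differ between framework versions), which is why the conversion to radians above matters:

r.set_projection(get_projection_matrix(45.0, 1, 0.1, 50));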

3. Normal-shading result

While hunting a bug here I also stumbled on a hollowed-out effect. It is not what the assignment asks for, but it looks interesting. The cause is an incorrectly computed bounding box (one that does not fully cover the triangle), so some pixels covered by a triangle are never sampled. Shrinking different edges of the box gives slightly different patterns.

4. Blinn-Phong model for the fragment color

The formula is:

\begin{aligned} L&=L_a+L_d+L_s\\ &=k_a I_a + k_d (\frac{I}{r^2}) \max(0,\pmb{n}\cdot \pmb{l})+ k_s (\frac{I}{r^2})\max(0,\pmb{n}\cdot \pmb{h})^p \end{aligned}

Note that the coefficient-wise product in Eigen can be computed with cwiseProduct; the sketch below maps the terms of the formula to Eigen calls.
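
For example, one light's diffuse term maps to Eigen like this (a sketch with illustrative names; squaredNorm() is equivalent to the r.dot(r) used in the code below):

Eigen::Vector3f l = (light.position - point).normalized();   // direction to the light
Eigen::Vector3f v = (eye_pos - point).normalized();          // direction to the eye
Eigen::Vector3f h = (l + v).normalized();                    // half vector
float r2 = (light.position - point).squaredNorm();           // r^2, squared distance to the light
Eigen::Vector3f Ld = kd.cwiseProduct(light.intensity / r2) * std::max(0.0f, normal.dot(l));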

Eigen::Vector3f phong_fragment_shader(const fragment_shader_payload& payload)
{
    Eigen::Vector3f ka = Eigen::Vector3f(0.005, 0.005, 0.005);
    Eigen::Vector3f kd = payload.color;
    Eigen::Vector3f ks = Eigen::Vector3f(0.7937, 0.7937, 0.7937);

    auto l1 = light{{20, 20, 20}, {500, 500, 500}};
    auto l2 = light{{-20, 20, 0}, {500, 500, 500}};
    /*
    struct light
    {
        Eigen::Vector3f position;
        Eigen::Vector3f intensity;
    };
    */
    std::vector<light> lights = {l1, l2};
    Eigen::Vector3f amb_light_intensity{10, 10, 10};
    Eigen::Vector3f eye_pos{0, 0, 10};

    float p = 150;

    Eigen::Vector3f color = payload.color;
    Eigen::Vector3f point = payload.view_pos;
    Eigen::Vector3f normal = payload.normal;

    Eigen::Vector3f result_color = {0, 0, 0};

    auto n=normal.normalized();
    
    for (auto& light : lights)
    {
        // TODO: For each light source in the code, calculate what the *ambient*, *diffuse*, and *specular* 
        // components are. Then, accumulate that result on the *result_color* object.
        auto l=(light.position-point).normalized();
        auto v=(eye_pos-point).normalized();
        auto r=light.position-point;
        auto h=(v+l).normalized();
        auto La=ka.cwiseProduct(amb_light_intensity);  
        auto Ld=kd.cwiseProduct(light.intensity/(r.dot(r)))*MAX(0.0,n.dot(l));
        auto Ls=ks.cwiseProduct(light.intensity/(r.dot(r)))*pow((MAX(0.0,n.dot(h))),p);
        result_color+=La+Ld+Ls;
    }
    return result_color * 255.f;
}
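
MIN/MAX above are assumed to come from headers the framework already pulls in (OpenCV defines such macros); if they are not available in your build, the standard library does the same clamping:

#include <algorithm>
float diffuse_clamp = std::max(0.0f, n.dot(l));   // same as MAX(0.0, n.dot(l))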

The result is as follows:

5. Texture Shading Fragment Shader

The formula is still the Blinn-Phong model, but now the texture color sampled at the fragment's texture coordinates is used as the k_d term.

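One practical caveat not covered by the original write-up: some texture coordinates may fall slightly outside [0, 1], which can make getColor index outside the image. A defensive clamp such as this hypothetical guard (either here or inside Texture::getColor) avoids that; std::clamp is C++17, which the framework already requires for the structured bindings used above:

// Hypothetical guard: clamp the texture coordinates into [0, 1] before sampling.
float u = std::clamp(payload.tex_coords.x(), 0.0f, 1.0f);
float v = std::clamp(payload.tex_coords.y(), 0.0f, 1.0f);
return_color = payload.texture->getColor(u, v);
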
Eigen::Vector3f texture_fragment_shader(const fragment_shader_payload& payload)
{
    Eigen::Vector3f return_color = {0, 0, 0};
    if (payload.texture)
    {
        // TODO: Get the texture value at the texture coordinates of the current fragment
        return_color=payload.texture->getColor(payload.tex_coords.x(),payload.tex_coords.y());
    }
    Eigen::Vector3f texture_color;
    texture_color << return_color.x(), return_color.y(), return_color.z();

    Eigen::Vector3f ka = Eigen::Vector3f(0.005, 0.005, 0.005);
    Eigen::Vector3f kd = texture_color / 255.f;
    Eigen::Vector3f ks = Eigen::Vector3f(0.7937, 0.7937, 0.7937);

    auto l1 = light{{20, 20, 20}, {500, 500, 500}};
    auto l2 = light{{-20, 20, 0}, {500, 500, 500}};

    std::vector<light> lights = {l1, l2};
    Eigen::Vector3f amb_light_intensity{10, 10, 10};
    Eigen::Vector3f eye_pos{0, 0, 10};

    float p = 150;

    Eigen::Vector3f color = texture_color;
    Eigen::Vector3f point = payload.view_pos;
    Eigen::Vector3f normal = payload.normal;

    Eigen::Vector3f result_color = {0, 0, 0};

    auto n=normal.normalized();
    for (auto& light : lights)
    {
        // TODO: For each light source in the code, calculate what the *ambient*, *diffuse*, and *specular* 
        // components are. Then, accumulate that result on the *result_color* object.
        auto l=(light.position-point).normalized();
        auto v=(eye_pos-point).normalized();
        auto r=light.position-point;
        auto h=(v+l).normalized();
        auto I=light.intensity;
        auto La=ka.cwiseProduct(amb_light_intensity); 
        auto Ld=kd.cwiseProduct(I/(r.dot(r)))*MAX(0.0,n.dot(l));
        auto Ls=ks.cwiseProduct(I/(r.dot(r)))*pow((MAX(0.0,n.dot(h))),p);
        result_color+=La+Ld+Ls;
    }
    return result_color * 255.f;
}

The result is as follows:

6. Bump mapping

Note the bump-mapping definition

h(u,v)=texture\_color(u,v).norm()

where u, v are the tex_coords and w, h are the width and height of the texture. The perturbed normal still has to be normalized at the end; the rest follows the comments inside the function (a small sketch of h(u,v) is given below).
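
A minimal sketch of that definition as a helper inside bump_fragment_shader (illustrative names; the shader below simply inlines the same getColor calls):

// Illustrative: h(u,v) = texture_color(u,v).norm(), sampled from the height texture.
auto h_uv = [&](float u, float v) {
    return payload.texture->getColor(u, v).norm();
};
float w_img = payload.texture->width, h_img = payload.texture->height;
float dU = kh * kn * (h_uv(u + 1.0f / w_img, v) - h_uv(u, v));
float dV = kh * kn * (h_uv(u, v + 1.0f / h_img) - h_uv(u, v));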

Eigen::Vector3f bump_fragment_shader(const fragment_shader_payload& payload)
{
    
    Eigen::Vector3f ka = Eigen::Vector3f(0.005, 0.005, 0.005);
    Eigen::Vector3f kd = payload.color;
    Eigen::Vector3f ks = Eigen::Vector3f(0.7937, 0.7937, 0.7937);

    auto l1 = light{{20, 20, 20}, {500, 500, 500}};
    auto l2 = light{{-20, 20, 0}, {500, 500, 500}};

    std::vector<light> lights = {l1, l2};
    Eigen::Vector3f amb_light_intensity{10, 10, 10};
    Eigen::Vector3f eye_pos{0, 0, 10};

    float p = 150;

    Eigen::Vector3f color = payload.color; 
    Eigen::Vector3f point = payload.view_pos;
    Eigen::Vector3f normal = payload.normal;

    float kh = 0.2, kn = 0.1;

    // TODO: Implement bump mapping here
    // Let n = normal = (x, y, z)
    // Vector t = (x*y/sqrt(x*x+z*z),sqrt(x*x+z*z),z*y/sqrt(x*x+z*z))
    // Vector b = n cross product t
    // Matrix TBN = [t b n]
    // dU = kh * kn * (h(u+1/w,v)-h(u,v))
    // dV = kh * kn * (h(u,v+1/h)-h(u,v))
    // Vector ln = (-dU, -dV, 1)
    // Normal n = normalize(TBN * ln)
    auto n=normal.normalized();
    auto x=n.x(),y=n.y(),z=n.z();
    auto t=Vector3f(x*y/sqrt(x*x+z*z),sqrt(x*x+z*z),z*y/sqrt(x*x+z*z));
    auto b=n.cross(t);
    Matrix3f TBN;
    TBN<<
    t.x(),b.x(),n.x(),
    t.y(),b.y(),n.y(),
    t.z(),b.z(),n.z();

    auto u=payload.tex_coords.x(),v=payload.tex_coords.y();
    auto h=payload.texture->height,w=payload.texture->width;
    auto dU = kh * kn * (payload.texture->getColor(u+1.0/w,v).norm()-payload.texture->getColor(u,v).norm());
    auto dV = kh * kn * (payload.texture->getColor(u,v+1.0/h).norm()-payload.texture->getColor(u,v).norm());

    auto ln=Vector3f(-dU,-dV,1.0);
    normal=(TBN*ln).normalized();
    Eigen::Vector3f result_color = {0, 0, 0};
    result_color = normal;

    return result_color * 255.f;
}

The result is as follows:

7. Displacement mapping

Compared with bump mapping, displacement mapping also moves the actual shading point, as formalized below.
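
Concretely, following the comments in the function, the shading point is first pushed along the original normal by the height value, and the normal is then perturbed exactly as in bump mapping:

p' = p + k_n \cdot \pmb{n} \cdot h(u,v)

\pmb{n}' = \text{normalize}(TBN \cdot l_n)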

Eigen::Vector3f displacement_fragment_shader(const fragment_shader_payload& payload)
{
    
    Eigen::Vector3f ka = Eigen::Vector3f(0.005, 0.005, 0.005);
    Eigen::Vector3f kd = payload.color;
    Eigen::Vector3f ks = Eigen::Vector3f(0.7937, 0.7937, 0.7937);

    auto l1 = light{{20, 20, 20}, {500, 500, 500}};
    auto l2 = light{{-20, 20, 0}, {500, 500, 500}};

    std::vector<light> lights = {l1, l2};
    Eigen::Vector3f amb_light_intensity{10, 10, 10};
    Eigen::Vector3f eye_pos{0, 0, 10};

    float p = 150;

    Eigen::Vector3f color = payload.color; 
    Eigen::Vector3f point = payload.view_pos;
    Eigen::Vector3f normal = payload.normal;

    float kh = 0.2, kn = 0.1; 
    // TODO: Implement displacement mapping here
    // Let n = normal = (x, y, z)
    // Vector t = (x*y/sqrt(x*x+z*z),sqrt(x*x+z*z),z*y/sqrt(x*x+z*z))
    // Vector b = n cross product t
    // Matrix TBN = [t b n]
    // dU = kh * kn * (h(u+1/w,v)-h(u,v))
    // dV = kh * kn * (h(u,v+1/h)-h(u,v))
    // Vector ln = (-dU, -dV, 1)
    // Position p = p + kn * n * h(u,v)
    // Normal n = normalize(TBN * ln)
    auto n=normal.normalized();
    auto x=n.x(),y=n.y(),z=n.z();
    auto t=Vector3f(x*y/sqrt(x*x+z*z),sqrt(x*x+z*z),z*y/sqrt(x*x+z*z));
    auto b=n.cross(t);
    Matrix3f TBN;
    TBN<<
    t.x(),b.x(),n.x(),
    t.y(),b.y(),n.y(),
    t.z(),b.z(),n.z();

    auto u=payload.tex_coords.x(),v=payload.tex_coords.y();
    auto h=payload.texture->height,w=payload.texture->width;
    auto dU=kh * kn * (payload.texture->getColor(u+1.0/w,v).norm()-payload.texture->getColor(u,v).norm());
    auto dV = kh * kn * (payload.texture->getColor(u,v+1.0/h).norm()-payload.texture->getColor(u,v).norm());
    auto ln=Vector3f(-dU,-dV,1.0);
    normal=(TBN*ln).normalized();
    point+=kn*n*(payload.texture->getColor(u,v).norm());

    Eigen::Vector3f result_color = {0, 0, 0};

    n=normal;
    
    for (auto& light : lights)
    {
        // TODO: For each light source in the code, calculate what the *ambient*, *diffuse*, and *specular* 
        // components are. Then, accumulate that result on the *result_color* object.
        auto l=(light.position-point).normalized();
        auto v=(eye_pos-point).normalized();
        auto r=light.position-point;
        auto h=(v+l).normalized();
        auto I=light.intensity;
        auto La=ka.cwiseProduct(amb_light_intensity);  
        auto Ld=kd.cwiseProduct(I/(r.dot(r)))*MAX(0.0,n.dot(l));
        auto Ls=ks.cwiseProduct(I/(r.dot(r)))*pow((MAX(0.0,n.dot(h))),p);
        result_color+=La+Ld+Ls;
    }
    return result_color * 255.f;
}

The result is as follows:
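
For reference, the framework selects the shader through the command-line argument after the output file, in the same way as the ./Rasterizer output.png normal example from the task list (argument spellings as in the hand-out; adjust to your build):

./Rasterizer output.png phong
./Rasterizer output.png texture
./Rasterizer output.png bump
./Rasterizer output.png displacement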
