# 1. Introduction to Logistic Regression

Logistic regression is a classification algorithm for binary classification problems. Suppose there are $m$ training samples $\left\{ \left( \mathbf{x}^{(1)}, y^{(1)} \right), \left( \mathbf{x}^{(2)}, y^{(2)} \right), \cdots, \left( \mathbf{x}^{(m)}, y^{(m)} \right) \right\}$. For logistic regression, the input features are $\mathbf{x}^{(i)} \in \Re^{n+1}$ and the class labels are $y^{(i)} \in \{0, 1\}$. The hypothesis is the sigmoid function:

$$h_\theta(\mathbf{x}) = \frac{1}{1 + e^{-\theta^T \mathbf{x}}}$$
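As a concrete illustration, here is a minimal NumPy sketch of this hypothesis. The function names and the data layout (a design matrix `X` with a leading column of ones) are my own conventions, not from the original text:

```python
import numpy as np

def sigmoid(z):
    """The sigmoid function: maps any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, X):
    """h_theta(x) = sigmoid(theta^T x), evaluated for every row of X.

    X has shape (m, n+1) with a leading column of ones (the bias feature);
    theta has shape (n+1,). Returns an (m,) vector of probabilities.
    """
    return sigmoid(X @ theta)
```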

The model is trained by minimizing the cross-entropy cost function:

$$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ y^{(i)}\log h_\theta\left(\mathbf{x}^{(i)}\right) + \left(1-y^{(i)}\right)\log\left(1-h_\theta\left(\mathbf{x}^{(i)}\right)\right) \right]$$
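A direct NumPy translation of this cost might look as follows, reusing `hypothesis` from above; the small `eps` guard against `log(0)` is my addition, not part of the formula:

```python
def cost(theta, X, y, eps=1e-12):
    """Cross-entropy cost J(theta) for binary labels y in {0, 1}."""
    h = hypothesis(theta, X)  # predicted probabilities, shape (m,)
    return -np.mean(y * np.log(h + eps) + (1.0 - y) * np.log(1.0 - h + eps))
```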

To minimize $J(\theta)$ by gradient descent, take the partial derivative with respect to each $\theta_j$:

$$\begin{aligned}
\nabla_{\theta_j}J(\theta) &= -\frac{1}{m}\sum_{i=1}^{m}\left[ \frac{y^{(i)}}{h_\theta(\mathbf{x}^{(i)})}\cdot \nabla_{\theta_j}h_\theta(\mathbf{x}^{(i)}) + \frac{1-y^{(i)}}{1-h_\theta(\mathbf{x}^{(i)})}\cdot \nabla_{\theta_j}\left(1-h_\theta(\mathbf{x}^{(i)})\right) \right]\\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[ \frac{y^{(i)}}{h_\theta(\mathbf{x}^{(i)})}\cdot \nabla_{\theta_j}h_\theta(\mathbf{x}^{(i)}) - \frac{1-y^{(i)}}{1-h_\theta(\mathbf{x}^{(i)})}\cdot \nabla_{\theta_j}h_\theta(\mathbf{x}^{(i)}) \right]\\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[ \left(\frac{y^{(i)}}{h_\theta(\mathbf{x}^{(i)})} - \frac{1-y^{(i)}}{1-h_\theta(\mathbf{x}^{(i)})}\right)\cdot \nabla_{\theta_j}h_\theta(\mathbf{x}^{(i)}) \right]\\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[ \frac{y^{(i)}-h_\theta(\mathbf{x}^{(i)})}{h_\theta(\mathbf{x}^{(i)})\left(1-h_\theta(\mathbf{x}^{(i)})\right)}\cdot \nabla_{\theta_j}h_\theta(\mathbf{x}^{(i)}) \right]\\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[ \frac{y^{(i)}-h_\theta(\mathbf{x}^{(i)})}{h_\theta(\mathbf{x}^{(i)})\left(1-h_\theta(\mathbf{x}^{(i)})\right)}\cdot \nabla_{\theta^T\mathbf{x}^{(i)}}h_\theta(\mathbf{x}^{(i)}) \cdot \nabla_{\theta_j}\left(\theta^T\mathbf{x}^{(i)}\right) \right]
\end{aligned}$$

Using the derivative of the sigmoid function,

$$\nabla_{\theta^T\mathbf{x}^{(i)}}h_\theta(\mathbf{x}^{(i)}) = h_\theta(\mathbf{x}^{(i)})\left(1-h_\theta(\mathbf{x}^{(i)})\right)$$

and

$$\nabla_{\theta_j}\left(\theta^T\mathbf{x}^{(i)}\right) = x_j^{(i)}$$

the gradient simplifies to:

$$\nabla_{\theta_j}J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ \left(y^{(i)} - h_\theta(\mathbf{x}^{(i)})\right)\cdot x_j^{(i)} \right]$$

Each parameter is then updated by gradient descent:

$$\theta_j := \theta_j - \alpha \nabla_{\theta_j}J(\theta)$$
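Putting the gradient and the update rule together gives a minimal batch gradient-descent loop; the learning rate and iteration count below are illustrative defaults, not values from the text:

```python
def gradient(theta, X, y):
    """Vectorized gradient: -(1/m) * X^T (y - h_theta(X))."""
    m = X.shape[0]
    return -(X.T @ (y - hypothesis(theta, X))) / m

def train_logistic(X, y, alpha=0.1, iters=1000):
    """Repeatedly apply theta_j := theta_j - alpha * grad_j J(theta)."""
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        theta -= alpha * gradient(theta, X, y)
    return theta
```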

# 2. Softmax Regression

## 2.1 Introduction to Softmax Regression

Softmax regression is the generalization of logistic regression to multi-class problems, i.e., to class labels $y$ that can take two or more values. Suppose there are $m$ training samples $\left\{ \left( \mathbf{x}^{(1)}, y^{(1)} \right), \left( \mathbf{x}^{(2)}, y^{(2)} \right), \cdots, \left( \mathbf{x}^{(m)}, y^{(m)} \right) \right\}$. For Softmax regression, the input features are $\mathbf{x}^{(i)} \in \Re^{n+1}$ and the class labels are $y^{(i)} \in \{1, 2, \cdots, k\}$. The hypothesis estimates, for each sample, the probability $p\left(y = j \mid \mathbf{x}\right)$ of belonging to each class; concretely, the hypothesis function is:

$$h_\theta\left(\mathbf{x}^{(i)}\right) = \begin{bmatrix} p\left(y^{(i)}=1 \mid \mathbf{x}^{(i)};\theta\right)\\ p\left(y^{(i)}=2 \mid \mathbf{x}^{(i)};\theta\right)\\ \vdots \\ p\left(y^{(i)}=k \mid \mathbf{x}^{(i)};\theta\right) \end{bmatrix} = \frac{1}{\sum_{j=1}^{k}e^{\theta_j^T\mathbf{x}^{(i)}}}\begin{bmatrix} e^{\theta_1^T\mathbf{x}^{(i)}}\\ e^{\theta_2^T\mathbf{x}^{(i)}}\\ \vdots \\ e^{\theta_k^T\mathbf{x}^{(i)}} \end{bmatrix}$$

so the probability that sample $\mathbf{x}^{(i)}$ belongs to class $j$ is:

$$p\left(y^{(i)}=j \mid \mathbf{x}^{(i)};\theta\right) = \frac{e^{\theta_j^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}}$$
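A NumPy sketch of these class probabilities; subtracting the per-sample maximum score before exponentiating is a standard numerical-stability trick (legitimate precisely because of the parameter redundancy discussed in section 2.4 below), not part of the formula itself:

```python
def softmax_probs(Theta, X):
    """p(y = j | x; Theta) for every row of X and every class j.

    Theta has shape (k, n+1): one parameter vector theta_j per class.
    Returns an (m, k) matrix whose rows each sum to 1.
    """
    scores = X @ Theta.T                          # (m, k): theta_j^T x
    scores -= scores.max(axis=1, keepdims=True)   # shift for stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)
```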

## 2.2 The Cost Function of Softmax Regression

Define the indicator function

$$I\left\{\text{expression}\right\} = \begin{cases} 0 & \text{if expression} = \text{false} \\ 1 & \text{if expression} = \text{true} \end{cases}$$

The cost function of Softmax regression is then:

$$J(\theta) = -\frac{1}{m}\left[ \sum_{i=1}^{m}\sum_{j=1}^{k}I\left\{y^{(i)}=j\right\}\log\frac{e^{\theta_j^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}} \right]$$
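Since the indicator $I\{y^{(i)}=j\}$ is just a one-hot encoding of the labels, the cost can be sketched as follows (labels are assumed to be $1,\ldots,k$ as above, hence the `y - 1` shift to 0-based indices; the `eps` guard is again my addition):

```python
def softmax_cost(Theta, X, y, k, eps=1e-12):
    """J(Theta): mean negative log-probability of each sample's true class."""
    m = X.shape[0]
    P = softmax_probs(Theta, X)        # (m, k) class probabilities
    Y = np.zeros((m, k))
    Y[np.arange(m), y - 1] = 1.0       # one-hot rows encode I{y_i = j}
    return -np.sum(Y * np.log(P + eps)) / m
```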

## 2.3 Solving Softmax Regression

As with logistic regression, the parameters are found by gradient descent. The gradient with respect to $\theta_j$ is:

$$\nabla_{\theta_j}J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ \nabla_{\theta_j}\sum_{j=1}^{k}I\left\{y^{(i)}=j\right\}\log\frac{e^{\theta_j^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}} \right]$$

For each sample, only the term whose indicator equals 1 survives, so consider two cases.

- If $y^{(i)} = j$, then $I\left\{y^{(i)}=j\right\}=1$, and:

$$\begin{aligned}
\nabla_{\theta_j}J(\theta) &= -\frac{1}{m}\sum_{i=1}^{m}\left[ \nabla_{\theta_j}\log\frac{e^{\theta_j^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}} \right]\\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[ \frac{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}}{e^{\theta_j^T\mathbf{x}^{(i)}}}\cdot \frac{e^{\theta_j^T\mathbf{x}^{(i)}}\cdot\mathbf{x}^{(i)}\cdot\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}} - e^{\theta_j^T\mathbf{x}^{(i)}}\cdot\mathbf{x}^{(i)}\cdot e^{\theta_j^T\mathbf{x}^{(i)}}}{\left(\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}\right)^2} \right]\\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[ \frac{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}} - e^{\theta_j^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}}\cdot\mathbf{x}^{(i)} \right]
\end{aligned}$$

- If $y^{(i)} \neq j$, say $y^{(i)} = {j}'$, then $I\left\{y^{(i)}=j\right\}=0$ and $I\left\{y^{(i)}={j}'\right\}=1$, and:

$$\begin{aligned}
\nabla_{\theta_j}J(\theta) &= -\frac{1}{m}\sum_{i=1}^{m}\left[ \nabla_{\theta_j}\log\frac{e^{\theta_{j'}^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}} \right]\\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[ \frac{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}}{e^{\theta_{j'}^T\mathbf{x}^{(i)}}}\cdot \frac{-e^{\theta_{j'}^T\mathbf{x}^{(i)}}\cdot\mathbf{x}^{(i)}\cdot e^{\theta_j^T\mathbf{x}^{(i)}}}{\left(\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}\right)^2} \right]\\
&= -\frac{1}{m}\sum_{i=1}^{m}\left[ -\frac{e^{\theta_j^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}}\cdot\mathbf{x}^{(i)} \right]
\end{aligned}$$

Noting that the bracketed factors are exactly $1 - p\left(y^{(i)}=j \mid \mathbf{x}^{(i)};\theta\right)$ and $-p\left(y^{(i)}=j \mid \mathbf{x}^{(i)};\theta\right)$, the two cases combine into:

$$\nabla_{\theta_j}J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ \mathbf{x}^{(i)}\left( I\left\{y^{(i)}=j\right\} - p\left(y^{(i)}=j \mid \mathbf{x}^{(i)};\theta\right) \right) \right]$$

and the update rule is the same as for logistic regression:

$$\theta_j := \theta_j - \alpha \nabla_{\theta_j}J(\theta)$$
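In matrix form the compact gradient above becomes a single expression, and the update mirrors the logistic case. `Y` is the one-hot label matrix (built as in the cost sketch above); the hyperparameters are again illustrative:

```python
def softmax_gradient(Theta, X, Y):
    """Row j is -(1/m) * sum_i x_i (I{y_i = j} - p(y_i = j | x_i))."""
    m = X.shape[0]
    P = softmax_probs(Theta, X)
    return -((Y - P).T @ X) / m        # shape (k, n+1), like Theta

def train_softmax(X, Y, k, alpha=0.1, iters=1000):
    """Batch gradient descent over all k parameter vectors at once."""
    Theta = np.zeros((k, X.shape[1]))
    for _ in range(iters):
        Theta -= alpha * softmax_gradient(Theta, X, Y)
    return Theta
```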

## 2.4 Parameter Properties of Softmax Regression

Softmax regression is over-parameterized: subtracting an arbitrary fixed vector $\psi$ from every parameter vector $\theta_j$ leaves the hypothesis unchanged, since

$$\begin{aligned}
p\left(y^{(i)}=j \mid \mathbf{x}^{(i)};\theta\right) &= \frac{e^{(\theta_j-\psi)^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{(\theta_l-\psi)^T\mathbf{x}^{(i)}}}\\
&= \frac{e^{\theta_j^T\mathbf{x}^{(i)}}\cdot e^{-\psi^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}\cdot e^{-\psi^T\mathbf{x}^{(i)}}}\\
&= \frac{e^{\theta_j^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}}
\end{aligned}$$

The minimizer of $J(\theta)$ is therefore not unique. To fix this (and to penalize large parameter values), a weight decay term is added:

$$\frac{\lambda}{2}\sum_{i=1}^{k}\sum_{j=0}^{n}\theta_{ij}^2$$

The cost function becomes:

$$J(\theta) = -\frac{1}{m}\left[ \sum_{i=1}^{m}\sum_{j=1}^{k}I\left\{y^{(i)}=j\right\}\log\frac{e^{\theta_j^T\mathbf{x}^{(i)}}}{\sum_{l=1}^{k}e^{\theta_l^T\mathbf{x}^{(i)}}} \right] + \frac{\lambda}{2}\sum_{i=1}^{k}\sum_{j=0}^{n}\theta_{ij}^2$$

with gradient:

$$\nabla_{\theta_j}J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ \mathbf{x}^{(i)}\left( I\left\{y^{(i)}=j\right\} - p\left(y^{(i)}=j \mid \mathbf{x}^{(i)};\theta\right) \right) \right] + \lambda\theta_j$$
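In code, weight decay only adds a `lam * Theta` term to the earlier gradient sketch; following the formula above, all entries (including the $j=0$ bias column) are regularized:

```python
def softmax_gradient_reg(Theta, X, Y, lam):
    """Regularized gradient: the data term plus lambda * Theta."""
    m = X.shape[0]
    P = softmax_probs(Theta, X)
    return -((Y - P).T @ X) / m + lam * Theta
```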

## 2.5 The Relationship Between Softmax and Logistic Regression

Logistic regression is the special case of Softmax regression with $k=2$. When $k=2$, the Softmax hypothesis is:

$$h_\theta(\mathbf{x}) = \frac{1}{e^{\theta_1^T\mathbf{x}} + e^{\theta_2^T\mathbf{x}}}\begin{bmatrix} e^{\theta_1^T\mathbf{x}}\\ e^{\theta_2^T\mathbf{x}} \end{bmatrix}$$

Using the parameter redundancy above, subtract $\psi = \theta_1$ from both parameter vectors:

$$\begin{aligned}
h_\theta(\mathbf{x}) &= \frac{1}{e^{(\theta_1-\psi)^T\mathbf{x}} + e^{(\theta_2-\psi)^T\mathbf{x}}}\begin{bmatrix} e^{(\theta_1-\psi)^T\mathbf{x}}\\ e^{(\theta_2-\psi)^T\mathbf{x}} \end{bmatrix}\\
&= \begin{bmatrix} \frac{1}{1+e^{(\theta_2-\theta_1)^T\mathbf{x}}}\\ \frac{e^{(\theta_2-\theta_1)^T\mathbf{x}}}{1+e^{(\theta_2-\theta_1)^T\mathbf{x}}} \end{bmatrix}\\
&= \begin{bmatrix} \frac{1}{1+e^{(\theta_2-\theta_1)^T\mathbf{x}}}\\ 1-\frac{1}{1+e^{(\theta_2-\theta_1)^T\mathbf{x}}} \end{bmatrix}
\end{aligned}$$

This is exactly the logistic regression hypothesis with parameter vector $\theta_1 - \theta_2$: one probability of the form $\frac{1}{1+e^{-\theta^T\mathbf{x}}}$ and its complement.
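The reduction is easy to verify numerically: with $k=2$, the first Softmax output coincides with a sigmoid whose parameter vector is $\theta_1 - \theta_2$ (the random vectors below exist only for the check):

```python
rng = np.random.default_rng(0)
x = rng.normal(size=4)
theta1, theta2 = rng.normal(size=4), rng.normal(size=4)

# first component of the k = 2 softmax output
p1 = np.exp(theta1 @ x) / (np.exp(theta1 @ x) + np.exp(theta2 @ x))
assert np.isclose(p1, sigmoid((theta1 - theta2) @ x))
```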

## 2.6 Choosing Between Multi-class and Binary Classification

Whether to use Softmax regression or several independent logistic regression classifiers depends on whether the candidate classes are mutually exclusive:

- mutually exclusive (each sample belongs to exactly one class) → Softmax regression
- not mutually exclusive (a sample may carry several labels) → multiple independent logistic regression classifiers
