Unverified commit 2af08acc, authored by dlagul, committed by GitHub

Update tester.py

Parent: edd2330d
@@ -84,9 +84,9 @@ class Tester(object):
# See https://arxiv.org/pdf/1606.05908.pdf, Page 9, Section 2.2 for details.
# The constant terms in the loss function (not the coefficients) can be any value here, or even omitted,
# because they have no impact on gradient propagation during training. The same holds in testing.
- # log(N(x|mu,var))
- # = log{1/(sqrt(2*pi)*var)exp{-(x-mu)^2/(2*var^2)}}
- # = -0.5*{log(2*pi)+2*log(var)+[(x-mu)/exp{log(var)}]^2}
+ # log(N(x|mu,sigma^2))
+ # = log{1/(sqrt(2*pi)*sigma)exp{-(x-mu)^2/(2*sigma^2)}}
+ # = -0.5*{log(2*pi)+2*log(sigma)+[(x-mu)/exp{log(sigma)}]^2}
        loglikelihood = -0.5 * torch.sum(torch.pow(((original_seq.float()-recon_seq_mu.float())/torch.exp(recon_seq_logvar.float())), 2)
                                         + 2 * recon_seq_logvar.float()
                                         + np.log(np.pi*2))
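
The corrected comment can be sanity-checked against PyTorch's own Gaussian log-density. The following is a minimal sketch, not code from the repository: the tensor names are hypothetical stand-ins for original_seq, recon_seq_mu, and recon_seq_logvar, and it assumes, as the corrected comment implies, that the "logvar" tensor actually holds log(sigma), the log standard deviation.

import numpy as np
import torch

x = torch.randn(4, 8)          # stand-in for original_seq
mu = torch.randn(4, 8)         # stand-in for recon_seq_mu
log_sigma = torch.randn(4, 8)  # stand-in for recon_seq_logvar (holds log(sigma))

# The formula from the diff: -0.5 * sum(((x-mu)/sigma)^2 + 2*log(sigma) + log(2*pi))
llh_manual = -0.5 * torch.sum(
    torch.pow((x - mu) / torch.exp(log_sigma), 2)
    + 2 * log_sigma
    + np.log(np.pi*2))

# Reference: a Normal distribution with scale sigma = exp(log_sigma)
llh_ref = torch.distributions.Normal(mu, torch.exp(log_sigma)).log_prob(x).sum()

assert torch.allclose(llh_manual, llh_ref, atol=1e-5)
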
@@ -106,9 +106,9 @@ class Tester(object):
# The constant terms in the loss function (not the coefficients) can be any value here, or even omitted,
# because they have no impact on gradient propagation during training.
# The same holds in testing, because they also have no impact on the results (all anomaly scores shift by the same amount).
- # log(N(x|mu,var))
- # = log{1/(sqrt(2*pi)*var)exp{-(x-mu)^2/(2*var^2)}}
- # = -0.5*{log(2*pi)+2*log(var)+[(x-mu)/exp{log(var)}]^2}
+ # log(N(x|mu,sigma^2))
+ # = log{1/(sqrt(2*pi)*sigma)exp{-(x-mu)^2/(2*sigma^2)}}
+ # = -0.5*{log(2*pi)+2*log(sigma)+[(x-mu)/exp{log(sigma)}]^2}
        llh = -0.5 * torch.sum(torch.pow(((x.float()-recon_x_mu.float())/torch.exp(recon_x_logvar.float())), 2)
                               + 2 * recon_x_logvar.float()
                               + np.log(np.pi*2))
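
The parenthetical claim in the comment, that dropping constants shifts every anomaly score by the same amount, can be illustrated with a small hypothetical example. Scores computed with and without the log(2*pi) term differ by a per-window constant, so their ordering, and any correspondingly shifted threshold, is unchanged.

import numpy as np
import torch

torch.manual_seed(0)
resid = torch.randn(5, 8)      # hypothetical standardized residuals (x-mu)/sigma, one row per window
log_sigma = torch.zeros(5, 8)  # unit sigma for simplicity

with_const = -0.5 * (resid.pow(2) + 2 * log_sigma + np.log(np.pi*2)).sum(dim=1)
without_const = -0.5 * (resid.pow(2) + 2 * log_sigma).sum(dim=1)

# Every window's score shifts by the same constant (here 0.5 * 8 * log(2*pi)),
# so ranking windows by log-likelihood is unaffected.
assert torch.equal(torch.argsort(with_const), torch.argsort(without_const))
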