
Xiaoyuan's Notes on the Paper "VIKOR Method for Bipolar Capacity Multi-Attribute Decision Making with Attribute Interaction" (Step 4)

Author: LearningYard Academy

Share interest, spread happiness, increase knowledge, and leave something beautiful behind.

Dear reader,

this is LearningYard Academy!

Today the editor brings you an intensive reading of a journal paper.

Welcome, and enjoy your visit!

This post takes about 6 minutes to read; thank you for your patience.

Today the editor presents Step 4 of the paper "VIKOR Method for Bipolar Capacity Multi-Attribute Decision Making with Attribute Interaction" in three parts: a mind map, intensive reading, and a knowledge supplement. Let's begin today's study!

Mind Map

The mind map for this section is shown below.

[Figure: mind map of this section]

Intensive Reading

In this section we continue with Step 4. The author notes that the Marichal entropy of a bipolar capacity has properties analogous to Shannon entropy, so below the editor works through Shannon entropy, cross entropy, and KL divergence in detail.


Shannon entropy was historically written with the natural logarithm ln, but the mainstream convention today uses log base 2. Intuitively, the information content of an event can be viewed as the outcome of a series of independent binary choices, each made between 0 and 1. With n such independent binary variables there are N = 2^n possible outcomes in total; converting this exponential relation to a linear one gives n = log2(N). The notes above therefore use base 2, while the paper uses ln; the two conventions differ only by the constant factor ln 2.
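To make the choice of base concrete, here is a minimal Python sketch (an illustration by the editor, not code from the paper). A uniform distribution over N = 2^n outcomes has entropy exactly n bits, which is the n = log2(N) relation above; measuring the same distribution in nats merely rescales the value by ln 2.

import math

def shannon_entropy(probs, base=2.0):
    # H = -sum p * log(p); base 2 gives bits, base e gives nats
    return -sum(p * math.log(p, base) for p in probs if p > 0)

n = 3
N = 2 ** n                    # 8 equally likely outcomes
uniform = [1.0 / N] * N

print(shannon_entropy(uniform))            # 3.0 bits, i.e. n = log2(N)
print(shannon_entropy(uniform, math.e))    # ~2.079 nats = 3 * ln(2)

Dividing the nat value by ln 2 recovers the bit value, which is why the paper's use of ln changes nothing essential.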

Knowledge Supplement

Above we mentioned KL divergence; next, let's go over its definition together with the editor!

Relative entropy, also known as Kullback-Leibler (KL) divergence or information divergence, is an asymmetric measure of the difference between two probability distributions. In information theory, the relative entropy D_KL(P‖Q) equals the cross entropy of P and Q minus the Shannon entropy of P.
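For discrete distributions P and Q over the same outcomes, the standard textbook definition (not quoted from the paper) is:

D_KL(P ‖ Q) = Σ_x P(x) · log( P(x) / Q(x) ) = H(P, Q) − H(P),

where H(P, Q) is the cross entropy and H(P) the Shannon entropy. The divergence is always nonnegative, equals zero exactly when P = Q, and is asymmetric: in general D_KL(P ‖ Q) ≠ D_KL(Q ‖ P).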

KL divergence has a clear physical meaning in information theory: it measures the expected number of extra bits needed to encode samples drawn from the distribution P when using a code optimized for the distribution Q. In machine learning it serves as a measure of how similar or close two functions (distributions) are, and it also appears frequently in functional analysis.
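The following short Python sketch (an editor's illustration with made-up example distributions, not code from the paper) checks this coding interpretation numerically: the KL divergence computed from its definition matches the cross entropy minus the entropy, i.e. the extra bits per sample.

import math

def entropy(p):
    # H(P) in bits: average code length of an optimal code for P
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # H(P, Q) in bits: average code length when samples from P
    # are encoded with a code optimized for Q
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # D_KL(P || Q) = sum_x P(x) * log2(P(x) / Q(x))
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

P = [0.5, 0.25, 0.25]   # hypothetical "true" distribution
Q = [1/3, 1/3, 1/3]     # hypothetical coding distribution

print(kl_divergence(P, Q))               # ~0.085 bits
print(cross_entropy(P, Q) - entropy(P))  # same value: the extra bits per sample

Swapping P and Q gives a different number, which is exactly the asymmetry noted above.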

That's all for today's sharing.

If you have your own thoughts about today's article,

feel free to leave us a message.

Let's meet again tomorrow.

Have a happy day!

Sources: DeepL Translate, Baidu Baike, Bilibili

References:

[1] Lin Pingping, Li Dengfeng, Jiang Binqian, Yu Gaofeng, Wei Anpeng. VIKOR method for bipolar capacity multi-attribute decision making with attribute interaction [J]. Systems Engineering - Theory & Practice, 2021, 41(08): 2147-2156.

This article is original content by LearningYard Academy. If there is any infringement, please leave a message in the backend!

Text | Yuan

Layout | Yuan

Review | Qian
