
A Study to Improve the Method of Ascertaining Attribute Weight Based on Rough Sets Theory

LIU Cheng, ZHANG Jian-bin, BAO Xin-zhong

China Management Informationization, 2009, Issue 15

Abstract: Current methods of ascertaining attribute weights based on rough sets have shortcomings. In this paper, the attribute significance expressed by rough sets is studied in depth. Aiming at the existing problems, the information presentation of rough sets is shown to be more comprehensive than its algebraic presentation, and a new method of ascertaining attribute weights based on the rough-set conditional entropy is put forward. Finally, an example shows that the new method is more reasonable than the old one.

Key words: Weight; Rough Sets; Attribute Significance Degree; Decision Table

doi:10.3969/j.issn.1673-0194.2009.15.034

CLC number: TP224.0; Document code: A; Article ID: 1673-0194(2009)15-0112-03

1 INTRODUCTION

Weight, which reflects the position and function of each element in the process of judgment and decision-making, is crucial, and its accuracy directly affects the final results. There are several common methods to ascertain attribute weights, such as expert scoring, fuzzy statistics and binary-comparison ranking. In these methods, however, the weights depend excessively on the experts' experience and knowledge, so they sometimes fail to reflect the actual situation objectively. Rough sets theory, in contrast, fully reflects the objectivity of the data, requiring no prior information beyond the data set to be dealt with. Therefore, some researchers have studied methods of ascertaining attribute weights based on rough sets theory.

Rough sets theory [1] is a method for expressing, studying and generalizing incomplete and uncertain knowledge and data. It was first put forward by Professor Pawlak of the Warsaw University of Technology in Poland in the early 1980s. Because it requires no prior information, rough sets theory has been successfully applied in many areas, such as expert systems, machine learning and pattern recognition [2-6]. A method for ascertaining the weights of the conditional attributes in a decision table was introduced in reference [7] and has been cited in many fields. This paper analyzes the shortcomings of the method in reference [7], gives a new method of ascertaining attribute weights based on rough sets, and shows that the new method is more reasonable.

2 ANALYSIS ON THE ORIGINAL METHOD OF ASCERTAINING ATTRIBUTE WEIGHT BASED ON ROUGH SETS THEORY

The importance of the various attributes (indicators) needs to be ascertained because they are not equally important. In rough sets theory, we remove an attribute and then observe how the classification changes without it. If the classification changes greatly after the attribute is removed, the attribute is of greater importance; otherwise it is of lower importance [4-7]. According to this characteristic, reference [7] defines the importance of the attributes (indicators) as follows.

Definition 1 [1]: In the decision table $S=(U,C,D,V,f)$, the dependence degree $\gamma_B(D)$ of the decision attribute $D$ on a conditional attribute set (indicator set) $B \subseteq C$ is defined as:

$\gamma_B(D) = |POS_B(D)| / |U|$

Definition 2 [7]: (Algebraic definition of attribute significance) In the decision table $S=(U,C,D,V,f)$, the significance degree of a conditional attribute (indicator) $c \in C$ is defined as:

$Sig(c) = \gamma_C(D) - \gamma_{C-\{c\}}(D)$

The weight $W_0(c)$ of the conditional attribute (indicator) $c$ is defined as:

$W_0(c) = Sig(c) \big/ \sum_{a \in C} Sig(a)$

Note: The definition shows that the greater Sig(c) is, the more important the conditional attribute (indicator) c is, and hence the greater its weight.
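Under the same assumed representation, a hedged sketch of Definition 2 (the `dependence` helper repeats the computation sketched above so this block stands alone):

```python
from collections import defaultdict

def dependence(table, cond_attrs, dec_attr):
    """gamma_B(D) from Definition 1 (same computation as the previous sketch)."""
    blocks = defaultdict(list)
    for i, row in enumerate(table):
        blocks[tuple(row[a] for a in cond_attrs)].append(i)
    consistent = sum(len(b) for b in blocks.values()
                     if len({table[i][dec_attr] for i in b}) == 1)
    return consistent / len(table)

def significance(table, cond_attrs, dec_attr, c):
    """Sig(c) = gamma_C(D) - gamma_{C-{c}}(D)  (Definition 2)."""
    rest = [a for a in cond_attrs if a != c]
    return (dependence(table, cond_attrs, dec_attr)
            - dependence(table, rest, dec_attr))

def algebraic_weights(table, cond_attrs, dec_attr):
    """W_0(c) = Sig(c) / sum_{a in C} Sig(a); undefined when every Sig(a) is 0."""
    sig = {c: significance(table, cond_attrs, dec_attr, c) for c in cond_attrs}
    total = sum(sig.values())
    if total == 0:
        raise ValueError("all attribute significances are 0; W_0 is undefined")
    return {c: s / total for c, s in sig.items()}
```

The `ValueError` branch is exactly the failure mode discussed next: when every attribute is individually superfluous, all Sig values are 0 and the normalization is undefined.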

To better illustrate the shortcoming of this definition, consider the example of decision Table 1, for which:


$U/\{a,b,c\}=\{\{x_1,x_2\},\{x_3\},\{x_4\},\{x_5\},\{x_6,x_7\},\{x_8,x_9\},\{x_{10},x_{11}\}\}$.

$U/D=\{\{x_1,x_5,x_6,x_8,x_{11}\},\{x_2,x_3,x_4,x_7,x_9,x_{10}\}\}$.

Therefore:POS(a,b,c)(D)= {x3,x4,x5}; POS(a,b)(D)= {x3,x4,x5}; POS(b,c)(D)= {x3,x4,x5}; POS(a,c)(D)= {x3,x4,x5}. As a result:

ga,b,c(D)=|POS{a,b,c}(D)|/|U|=3/11.

Similar that:ga,b,c(D)=g{a,c}(D)=g{b,c}(D)=3/11.

Then:Sig(a)=g{a,b,c}(D)-g{b,c}(D)=0;Sig(b)=g{a,b,c}(D)-g{a,c}(D)=0; Sig(c)=g{a,b,c}(D)-g{a,b}(D)=0.

From the example above, we can see that in this case the weights of the conditional attributes cannot be calculated by Definition 2: each of the conditional attributes a, b and c is individually superfluous in decision Table 1. To deal with this problem, reference [8] first reduces the decision table and then computes the attribute significances on the reduct, so that at least one significance is non-zero, and applies the weight formula of reference [7]. Although this guarantees that at least one weight is non-zero, it ignores the practical significance of the attributes that receive zero weight and of those removed by the reduction. To solve this problem, we offer a new method of ascertaining weights.

3 A METHOD TO ASCERTAIN WEIGHT BASED ON CONDITIONAL INFORMATION ENTROPY

3.1 Conditional information entropy

The following definitions establish the relationship between the knowledge of rough sets theory and information entropy, so that the main concepts and operations of rough sets theory can be expressed from the information point of view; this is usually called the information presentation of rough sets theory.

Definition 3: In the decision table $S=(U,C,D,V,f)$, any attribute set $P \subseteq C \cup D$ induces a partition $U/P=\{S_1,S_2,\ldots,S_t\}$ of $U$ and can be regarded as a random variable on the algebra of subsets of $U$, with the probability distribution:

$[P:p]=\begin{pmatrix} S_1 & S_2 & \cdots & S_t \\ p(S_1) & p(S_2) & \cdots & p(S_t) \end{pmatrix}$

where $p(S_j)=|S_j|/|U|$, $j=1,2,\ldots,t$.

Definition 4: In the decision table $S=(U,C,D,V,f)$, the conditional entropy $H(D|C)$ of the decision attribute set $D$ ($U/D=\{D_1,D_2,\ldots,D_k\}$) with respect to the conditional attribute set $C$ ($U/C=\{C_1,C_2,\ldots,C_m\}$) is defined as:

$H(D|C) = -\sum_{i=1}^{m} p(C_i) \sum_{j=1}^{k} p(D_j|C_i)\log p(D_j|C_i)$

where $p(D_j|C_i)=|D_j \cap C_i|/|C_i|$, $i=1,2,\ldots,m$; $j=1,2,\ldots,k$.

3.2 Attribute significance degree based on conditional information entropy

Definition 5: (Information definition of attribute significance) In the decision table $S=(U,C,D,V,f)$, the significance degree of a conditional attribute (indicator) $c \in C$ is defined as:

$Sig(c) = H(D|(C-\{c\})) - H(D|C)$

The greater the value of Sig(c), the more important c is to the decision D given the other conditions in C. Compared with the algebraic form of attribute significance, which only considers the influence of an attribute on the deterministically classified subsets of the universe (the positive region), the information definition also takes the uncertainly classified subsets into account. This means that an attribute whose significance is 0 under the algebraic definition is not necessarily of significance 0 under the information definition, whereas an attribute of significance 0 under the information definition is certainly of significance 0 under the algebraic definition [9].
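To make this asymmetry concrete, the following sketch uses a small hypothetical five-object decision table (it is not Table 1 of this paper, only an illustration): removing attribute c leaves the positive region unchanged, so its algebraic significance is 0, yet it changes the conditional entropy, so its information significance is positive.

```python
import math
from collections import defaultdict

# Hypothetical 5-object decision table (illustrative only; NOT Table 1 of the paper).
table = [
    {"a": 0, "c": 0, "d": 0},
    {"a": 0, "c": 0, "d": 1},
    {"a": 0, "c": 1, "d": 0},
    {"a": 0, "c": 1, "d": 0},
    {"a": 0, "c": 1, "d": 1},
]
C, dec = ["a", "c"], "d"

def blocks_of(attrs):
    groups = defaultdict(list)
    for i, row in enumerate(table):
        groups[tuple(row[x] for x in attrs)].append(i)
    return list(groups.values())

def gamma(attrs):
    """Algebraic dependence gamma_B(D)."""
    pos = sum(len(b) for b in blocks_of(attrs)
              if len({table[i][dec] for i in b}) == 1)
    return pos / len(table)

def cond_entropy(attrs):
    """Conditional entropy H(D|B), log base 2."""
    h = 0.0
    for b in blocks_of(attrs):
        counts = defaultdict(int)
        for i in b:
            counts[table[i][dec]] += 1
        for cnt in counts.values():
            p = cnt / len(b)
            h -= (len(b) / len(table)) * p * math.log2(p)
    return h

without_c = ["a"]
print(gamma(C) - gamma(without_c))                  # 0.0   -> algebraic Sig(c) = 0
print(cond_entropy(without_c) - cond_entropy(C))    # ~0.02 -> information Sig(c) > 0
```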

3.3 The method to ascertain weight based on conditional information entropy

Definition 6: In the decision table $S=(U,C,D,V,f)$, the new significance degree of a conditional attribute (indicator) $c \in C$ and its weight are defined as:

$NewSig(c) = H(D|(C-\{c\})) - H(D|C)$

$w(c) = NewSig(c) \big/ \sum_{a \in C} NewSig(a)$
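A minimal sketch of Definition 6 under the same assumptions (the `conditional_entropy` helper repeats the earlier sketch so this block stands alone):

```python
import math
from collections import defaultdict

def conditional_entropy(table, cond_attrs, dec_attr):
    """H(D|B), log base 2, as in Definition 4 (same computation as the earlier sketch)."""
    blocks = defaultdict(list)
    for i, row in enumerate(table):
        blocks[tuple(row[a] for a in cond_attrs)].append(i)
    h = 0.0
    for block in blocks.values():
        counts = defaultdict(int)
        for i in block:
            counts[table[i][dec_attr]] += 1
        for cnt in counts.values():
            p = cnt / len(block)
            h -= (len(block) / len(table)) * p * math.log2(p)
    return h

def entropy_weights(table, cond_attrs, dec_attr):
    """NewSig(c) = H(D|C-{c}) - H(D|C);  w(c) = NewSig(c) / sum_a NewSig(a)."""
    h_full = conditional_entropy(table, cond_attrs, dec_attr)
    new_sig = {c: conditional_entropy(table,
                                      [a for a in cond_attrs if a != c],
                                      dec_attr) - h_full
               for c in cond_attrs}
    total = sum(new_sig.values())
    # If every attribute is entirely irrelevant, total is 0 and all weights stay 0.
    return {c: (s / total if total > 0 else 0.0) for c, s in new_sig.items()}
```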

4 AN EXAMPLE

Now the new method of ascertaining weights is used to calculate the weights of the conditional attributes in decision Table 1.

5 CONCLUSION

We have analyzed the existing method of ascertaining weights based on rough sets theory and, aiming at its shortcoming, worked out a new method based on conditional information entropy, exploiting the fact that the information presentation of rough sets is more comprehensive than the algebraic presentation, and verified it with an example. The new method of ascertaining weights is more comprehensive, rational and universal than the existing one.

References

[1] Pawlak Z. Rough Sets: Probability Versus Deterministic Approach[J]. International Journal of Man-Machine Studies, 1988, 29(1): 81-95.

[2] Hu X H, Cercone N. Learning in Relational Databases: A Rough Set Approach[J]. Computational Intelligence, 1995, 11(2): 323-338.

[3] Swiniarski S, Hargis L. Rough Set as a Front End of Neural-Networks Texture Classifiers[J]. Neurocomputing, 2001, 36(1): 85-102.

[4] Zhang W X, Wu W Z, Liang J Y. The Theory and Method of Rough Sets[M]. Beijing: Science Press, 2000.

[5] Zhang W X, Chou G F. Uncertain Decision Based on Rough Sets[M]. Beijing: Tsinghua University Press, 2005.

[6] Miao D Q, Fan S D. The Calculation of Knowledge Granulation and Its Application[J]. Systems Engineering: Theory & Practice, 2002(1): 48-56.

[7] Cao X Y, Liang J G. The Method of Ascertaining Weight Based on Rough Sets Theory[J]. Chinese Journal of Management Science, 2002, 10(5): 98-100.

[8] Zhou A F, Chen Z Y. How to Choose the SC Partner Based on Rough Set[J]. Logistics Technology, 2007, 26(8): 178-181.

[9] Wang G Y, Yu H, Yang D C. Decision Table Reduction Based on Conditional Information Entropy[J]. Chinese Journal of Computers, 2002, 25(7): 759-766.


