package com.symdata.pssystem.utils;
import java.util.*;

import javax.annotation.PostConstruct;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

import lombok.extern.slf4j.Slf4j;

/**
 * Sensitive word filter based on a DFA (nested-map trie).
 *
 * @version 1.0
 * @since 2020-04-20 16:17
 */
@Service
@Slf4j
public class SensitiveWordUtil {

    @SuppressWarnings("rawtypes")
    private Map sensitiveWordMap = null;

    /** Minimum-match rule: stop at the shortest matching word. */
    public static int minMatchTYpe = 1;
    /** Maximum-match rule: keep extending to the longest matching word. */
    public static int maxMatchType = 2;

    /** Sensitive word list read from the yml configuration. */
    @Value("${xb.sensitive.words:}")
    private String sensitives;
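    /*
     * Configuration note (illustrative, not taken from the original source): the word list
     * is expected as a single delimiter-separated string in application.yml, for example
     *
     *   xb:
     *     sensitive:
     *       words: 法轮功|红客|三级片
     *
     * The assumed delimiter is '|', matching the split("\\|") call in
     * SensitiveWordInit.readSensitiveWordFile below.
     */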
    /**
     * Builds the sensitive word dictionary once the bean has been constructed
     * and the configured word list has been injected.
     */
    @PostConstruct
    private void init() {
        sensitiveWordMap = new SensitiveWordInit().initKeyWord(sensitives);
    }
    /**
     * Checks whether the text contains any sensitive word.
     *
     * @param txt       text to check
     * @param matchType match rule, 1: minimum match, 2: maximum match
     * @return true if a sensitive word is found, false otherwise
     * @version 1.0
     */
    public boolean isContaintSensitiveWord(String txt, int matchType) {
        boolean flag = false;
        for (int i = 0; i < txt.length(); i++) {
            // Length of the sensitive word starting at position i, 0 if there is none.
            int matchFlag = this.CheckSensitiveWord(txt, i, matchType);
            if (matchFlag > 0) {
                flag = true;
                break; // One hit is enough; no need to scan the rest of the text.
            }
        }
        return flag;
    }
    /**
     * Collects the sensitive words contained in the text.
     *
     * @param txt       text to check
     * @param matchType match rule, 1: minimum match, 2: maximum match
     * @return set of matched sensitive words
     * @version 1.0
     */
    public Set<String> getSensitiveWord(String txt, int matchType) {
        Set<String> sensitiveWordList = new HashSet<String>();

        for (int i = 0; i < txt.length(); i++) {
            // Length of the sensitive word starting at position i, 0 if there is none.
            int length = CheckSensitiveWord(txt, i, matchType);
            if (length > 0) {
                sensitiveWordList.add(txt.substring(i, i + length));
                i = i + length - 1; // Minus 1 because the for loop increments i again.
            }
        }

        return sensitiveWordList;
    }
    /**
     * Replaces every sensitive word in the text with the replacement character.
     *
     * @param txt         text to filter
     * @param matchType   match rule, 1: minimum match, 2: maximum match
     * @param replaceChar replacement character, usually "*"
     * @version 1.0
     */
    public String replaceSensitiveWord(String txt, int matchType, String replaceChar) {
        String resultTxt = txt;
        // Collect all sensitive words first, then mask each occurrence.
        Set<String> set = getSensitiveWord(txt, matchType);
        Iterator<String> iterator = set.iterator();
        String word = null;
        String replaceString = null;
        while (iterator.hasNext()) {
            word = iterator.next();
            replaceString = getReplaceChars(replaceChar, word.length());
            // Literal replacement; replaceAll() would interpret the word as a regex.
            resultTxt = resultTxt.replace(word, replaceString);
        }

        return resultTxt;
    }
    /**
     * Builds the replacement string, e.g. "***" for a word of length 3.
     *
     * @param replaceChar replacement character
     * @param length      length of the matched word
     * @return replacement string of the given length
     * @version 1.0
     */
    private String getReplaceChars(String replaceChar, int length) {
        StringBuilder resultReplace = new StringBuilder(replaceChar);
        for (int i = 1; i < length; i++) {
            resultReplace.append(replaceChar);
        }

        return resultReplace.toString();
    }
    /**
     * Checks whether a sensitive word starts at the given position in the text.
     *
     * @param txt        text to check
     * @param beginIndex position to start matching from
     * @param matchType  match rule, 1: minimum match, 2: maximum match
     * @return length of the matched sensitive word, or 0 if there is no match
     * @version 1.0
     */
    @SuppressWarnings({"rawtypes"})
    public int CheckSensitiveWord(String txt, int beginIndex, int matchType) {
        boolean flag = false; // Set once a complete word (isEnd = "1") has been reached.
        int matchFlag = 0;    // Number of characters matched so far.
        char word = 0;
        Map nowMap = sensitiveWordMap;
        for (int i = beginIndex; i < txt.length(); i++) {
            word = txt.charAt(i);
            // Walk one level down the trie using the current character as key.
            nowMap = (Map) nowMap.get(word);
            if (nowMap != null) {
                matchFlag++;
                if ("1".equals(nowMap.get("isEnd"))) {
                    // A complete sensitive word ends at this character.
                    flag = true;
                    if (SensitiveWordUtil.minMatchTYpe == matchType) {
                        // Minimum match: return immediately; maximum match keeps extending.
                        break;
                    }
                }
            } else {
                // No trie node for this character: stop matching.
                break;
            }
        }
        // Only matches of at least two characters that ended on isEnd = "1" count as a word.
        if (matchFlag < 2 || !flag) {
            matchFlag = 0;
        }
        return matchFlag;
    }
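    /*
     * Illustrative trace (assuming the dictionary contains 中国 and 中国人民, which are not
     * necessarily in the configured word list):
     *
     *   CheckSensitiveWord("中国人民万岁", 0, minMatchTYpe) -> 2   // stops at 中国
     *   CheckSensitiveWord("中国人民万岁", 0, maxMatchType) -> 4   // extends to 中国人民
     *   CheckSensitiveWord("中国人民万岁", 1, maxMatchType) -> 0   // no word starts at 国
     */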
    public static void main(String[] args) {
        SensitiveWordUtil filter = new SensitiveWordUtil();
        // Outside the Spring container @Value and @PostConstruct never run, so an example
        // word list (taken from the test sentence below) is set and init() called by hand.
        filter.sensitives = "法轮功|红客|自杀指南|手机卡复制器|三级片";
        filter.init();
        System.out.println("Size of the sensitive word dictionary (top-level entries): " + filter.sensitiveWordMap.size());
        String string = "太多的伤感情怀也许只局限于饲养基地 荧幕中的情节,主人公尝试着去用某种方式渐渐的很潇洒地释自杀指南怀那些自己经历的伤感。"
                + "然后法轮功 我们的扮演的角色就是跟随着主人公的喜红客,联盟 怒哀乐而过于牵强的把自己的情感也附加于银幕情节中,然后感动就流泪,"
                + "难过就躺在某一个人的怀里尽情的阐述心扉或者手机卡复制器一个人一杯红酒一部电影在夜三级片 深人静的晚上,关上电话静静的发呆着。";
        System.out.println("Length of the text to check: " + string.length());
        long beginTime = System.currentTimeMillis();
        Set<String> set = filter.getSensitiveWord(string, 2);
        long endTime = System.currentTimeMillis();
        System.out.println("Number of sensitive words found: " + set.size() + ", words: " + set);
        System.out.println("Total time taken (ms): " + (endTime - beginTime));
    }
}
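/*
 * Usage sketch (hypothetical caller, not part of the original source): inside the Spring
 * container the utility is injected and already initialized via @PostConstruct, e.g.
 *
 *   @RestController
 *   public class CommentController {
 *       @Autowired
 *       private SensitiveWordUtil sensitiveWordUtil;
 *
 *       @PostMapping("/comments")
 *       public String add(@RequestBody String text) {
 *           // Mask any configured sensitive words with '*' using the maximum-match rule.
 *           return sensitiveWordUtil.replaceSensitiveWord(text, SensitiveWordUtil.maxMatchType, "*");
 *       }
 *   }
 */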
/**
 * Builds the DFA (nested-map trie) used by {@link SensitiveWordUtil} from the configured word list.
 */
class SensitiveWordInit {

    /** Character encoding used by the (commented-out) file-based loader below. */
    private String ENCODING = "UTF-8";

    @SuppressWarnings("rawtypes")
    public HashMap sensitiveWordMap;

    public SensitiveWordInit() {
        super();
    }
    /**
     * Builds the sensitive word map from the configured word string.
     *
     * @param sensitives delimiter-separated sensitive words from the configuration
     * @version 1.0
     */
    @SuppressWarnings("rawtypes")
    public Map initKeyWord(String sensitives) {
        try {
            // Read the sensitive word list.
            Set<String> keyWordSet = readSensitiveWordFile(sensitives);
            // Load the words into the nested HashMap (trie) structure.
            addSensitiveWordToHashMap(keyWordSet);
            // With Spring the map could also be stored in the application scope, e.g.
            // application.setAttribute("sensitiveWordMap", sensitiveWordMap);
        } catch (Exception e) {
            e.printStackTrace();
        }
        return sensitiveWordMap;
    }
    /**
     * Loads the sensitive words into a nested HashMap that acts as a DFA / trie.
     * For the words 中国, 中国人民, 中国男人 and 五星红旗 the structure looks like:
     *
     * 中 = {
     *     isEnd = 0
     *     国 = {
     *         isEnd = 1
     *         人 = {
     *             isEnd = 0
     *             民 = { isEnd = 1 }
     *         }
     *         男 = {
     *             isEnd = 0
     *             人 = { isEnd = 1 }
     *         }
     *     }
     * }
     * 五 = {
     *     isEnd = 0
     *     星 = {
     *         isEnd = 0
     *         红 = {
     *             isEnd = 0
     *             旗 = { isEnd = 1 }
     *         }
     *     }
     * }
     *
     * @param keyWordSet sensitive word set
     * @version 1.0
     */
    @SuppressWarnings({"rawtypes", "unchecked"})
    private void addSensitiveWordToHashMap(Set<String> keyWordSet) {
        // Size the root map up front to reduce rehashing.
        sensitiveWordMap = new HashMap(keyWordSet.size());
        String key = null;
        Map nowMap = null;
        Map<String, String> newWorMap = null;
        Iterator<String> iterator = keyWordSet.iterator();
        while (iterator.hasNext()) {
            key = iterator.next(); // One sensitive word.
            nowMap = sensitiveWordMap;
            for (int i = 0; i < key.length(); i++) {
                char keyChar = key.charAt(i);
                Object wordMap = nowMap.get(keyChar);

                if (wordMap != null) {
                    // A node for this character already exists; descend into it.
                    nowMap = (Map) wordMap;
                } else {
                    // Create a new node; it is not (yet) the end of a word.
                    newWorMap = new HashMap<String, String>();
                    newWorMap.put("isEnd", "0");
                    nowMap.put(keyChar, newWorMap);
                    nowMap = newWorMap;
                }

                if (i == key.length() - 1) {
                    // Last character of the word: mark the node as a word end.
                    nowMap.put("isEnd", "1");
                }
            }
        }
    }
    /**
     * Splits the configured sensitive word string into a set of words.
     *
     * @param sensitives delimiter-separated sensitive words, e.g. "word1|word2"
     * @return set of sensitive words
     * @throws Exception if the word list cannot be read
     * @version 1.0
     */
    private Set<String> readSensitiveWordFile(String sensitives) throws Exception {
        Set<String> set = new HashSet<>();
        // '|' is a regex metacharacter, so it has to be escaped for split().
        String[] split = sensitives.split("\\|");
        set.addAll(Arrays.asList(split));

        // Alternative: read the words from a file instead of the configuration.
        // File file = new File("/home/bxw/src/main/resources/SensitiveWord.txt");
        // InputStreamReader read = new InputStreamReader(new FileInputStream(file), ENCODING);
        // try {
        //     if (file.isFile() && file.exists()) {
        //         set = new HashSet<String>();
        //         BufferedReader bufferedReader = new BufferedReader(read);
        //         String txt = null;
        //         while ((txt = bufferedReader.readLine()) != null) {
        //             set.add(txt); // One sensitive word per line.
        //         }
        //     } else {
        //         throw new Exception("Sensitive word file does not exist");
        //     }
        // } catch (Exception e) {
        //     throw e;
        // } finally {
        //     read.close();
        // }
        return set;
    }
}