{"title": "Masking: A New Perspective of Noisy Supervision", "book": "Advances in Neural Information Processing Systems", "page_first": 5836, "page_last": 5846, "abstract": "It is important to learn various types of classifiers given training data with noisy labels. Noisy labels, in the most popular noise model hitherto, are corrupted from ground-truth labels by an unknown noise transition matrix. Thus, by estimating this matrix, classifiers can escape from overfitting those noisy labels. However, such estimation is practically difficult, due to either the indirect nature of two-step approaches, or not big enough data to afford end-to-end approaches. In this paper, we propose a human-assisted approach called ''Masking'' that conveys human cognition of invalid class transitions and naturally speculates the structure of the noise transition matrix. To this end, we derive a structure-aware probabilistic model incorporating a structure prior, and solve the challenges from structure extraction and structure alignment. Thanks to Masking, we only estimate unmasked noise transition probabilities and the burden of estimation is tremendously reduced. We conduct extensive experiments on CIFAR-10 and CIFAR-100 with three noise structures as well as the industrial-level Clothing1M with agnostic noise structure, and the results show that Masking can improve the robustness of classifiers significantly.", "full_text": "Masking: A New Perspective of Noisy Supervision\n\nBo Han\u22171,2, Jiangchao Yao\u22173,1, Gang Niu2, Mingyuan Zhou4,\n\nIvor W. 
Tsang1, Ya Zhang3, Masashi Sugiyama2,5\n\n1Centre for Artificial Intelligence, University of Technology Sydney\n\n2Center for Advanced Intelligence Project, RIKEN\n\n3Cooperative Medianet Innovation Center, Shanghai Jiao Tong University\n\n4McCombs School of Business, The University of Texas at Austin\n\n5Graduate School of Frontier Sciences, University of Tokyo\n\nAbstract\n\nIt is important to learn various types of classifiers given training data with noisy labels. Noisy labels, in the most popular noise model hitherto, are corrupted from ground-truth labels by an unknown noise transition matrix. Thus, by estimating this matrix, classifiers can escape from overfitting those noisy labels. However, such estimation is practically difficult, due to either the indirect nature of two-step approaches, or not big enough data to afford end-to-end approaches. In this paper, we propose a human-assisted approach called \u201cMasking\u201d that conveys human cognition of invalid class transitions and naturally speculates the structure of the noise transition matrix. To this end, we derive a structure-aware probabilistic model incorporating a structure prior, and solve the challenges from structure extraction and structure alignment. Thanks to Masking, we only estimate unmasked noise transition probabilities and the burden of estimation is tremendously reduced. We conduct extensive experiments on CIFAR-10 and CIFAR-100 with three noise structures as well as the industrial-level Clothing1M with agnostic noise structure, and the results show that Masking can improve the robustness of classifiers significantly.\n\n1 Introduction\n\nIt is always challenging to learn from noisy labels [2, 34, 4, 37, 25], since these labels are systematically corrupted.
As a negative effect, noisy labels inevitably degenerate the accuracy of classifiers. This negative effect becomes more prominent for deep learning, since these complex models can fully memorize noisy labels, which correspondingly degenerates their generalization [48]. Unfortunately, noisy labels are ubiquitous and unavoidable in our daily life, such as web queries [24], social-network tagging [6], crowdsourcing [45], medical images [8], and financial analysis [1].\nTo handle such noisy labels, recent approaches mainly explore three directions. One direction focuses on training only on selected samples, which leverages the sample-selection bias [18] to overcome the label noise issue. For example, MentorNet [19] trains on \u201csmall-loss\u201d samples. Meanwhile, Decoupling [26] trains on \u201cdisagreement\u201d samples, for which the predictions of two networks disagree. However, since the data for training are selected on the fly rather than in the beginning, it is hard to characterize these sample-selection biases, and hence hard to give any theoretical guarantee on the consistency of learning.\nAnother direction develops regularization techniques, including explicit and implicit regularizations. This direction employs the regularization bias to overcome the label noise issue. Explicit regularization is added to the objective function, such as manifold regularization [5] and virtual adversarial training [30]. Implicit regularization is designed for training algorithms, such as temporal ensembling [20] and mean teacher [41].\n\n\u2217The first two authors (Bo Han and Jiangchao Yao) made equal contributions. The implementation is available at https://github.com/bhanML/Masking.\n\n32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montr\u00e9al, Canada.
Nevertheless, both approaches introduce a permanent regularization bias, and the learned classifier barely reaches the optimal performance [9].\nThe last direction estimates the noise transition matrix without introducing sample-selection bias and regularization bias. As an approximation of real-world corruption, noisy labels are theoretically flipped from the ground-truth labels by an unknown noise transition matrix. In this approach, the accuracy of classifiers can be improved by estimating this matrix accurately. The previous methods for estimating the noise transition matrix can be roughly summarized into two solutions.\nOne solution estimates the noise transition matrix in advance, and subsequently learns the classifier based on this estimated matrix. For example, Patrini et al. [32] leveraged a two-step solution to estimate the noise transition matrix. The benefit is that only limited data are required for the estimation procedure. Nonetheless, their method is too heuristic to estimate the noise transition matrix accurately.\nThe other solution jointly estimates the noise transition matrix and learns the classifier in an end-to-end framework. For instance, on top of the softmax layer, Sukhbaatar et al. [39] added a constrained linear layer to model the noise transition matrix, while Goldberger et al. [10] added a nonlinear softmax layer. The benefit is the generality of their unified learning framework. However, their brute-force learning leads to inexact estimation due to a finite dataset.\nTherefore, an important question is: with a finite dataset, can we leverage a constrained end-to-end model to overcome the above deficiencies? In this paper, we present a human-assisted approach called \u201cMasking\u201d. Masking conveys human cognition of invalid class transitions (e.g., cat \u2260 car), and speculates the structure of the noise transition matrix.
The structure information can be viewed as a constraint to improve the estimation procedure. Namely, given the structure information, we can focus on estimating the noise transition probabilities along the structure, which largely reduces the estimation burden.\nTo instantiate our approach, we derive a structure-aware probabilistic model by incorporating a structure prior. In the realization, we encounter two practical challenges: structure extraction and structure alignment. Specifically, to address the structure extraction challenge, we propose a tempered sigmoid function to simulate the human cognition on the structure of the noise transition matrix. To address the structure alignment challenge, we propose a variant of Generative Adversarial Networks (GANs) [12] to avoid the difficulty of specifying explicit distributions. We conduct extensive experiments on two benchmark datasets (CIFAR-10 and CIFAR-100) with three noise structures, and the industrial-level dataset (Clothing1M [46]) with agnostic noise structure. The experimental results demonstrate that the proposed approach can improve the robustness of classifiers significantly.\n\n2 A new perspective of noisy supervision\n\nIn this section, we explore noisy supervision from a brand-new perspective, namely the structure of the noise transition matrix. First, we discuss where noisy labels normally come from, and why we can speculate the structure of the noise transition matrix. Then, we present representative structures of the noise transition matrix.\nIn practice, noisy labels mainly come from the interaction between humans and tasks, such as social-network tagging and crowdsourcing. Assume that the more complex the interaction is, the more effort human beings spend. According to cognitive psychology [29], human cognition can mask invalid class transitions and highlight valid class transitions automatically.
This means that human cognition can correspondingly speculate the structure of the noise transition matrix.\nOne effort-saving interaction is tagging a cluster of scenery images in social networks, such as beach, prairie, and mountain. However, a dog or a cat appears in the foreground of some images, yielding noisy labels [35]. Thus, human cognition masks invalid class transitions (e.g., beach \u2260 mountain), and highlights valid class transitions (e.g., beach \u2194 dog). This noise structure should be the diagonal matrix coupled with two column lines, namely a column-diagonal matrix (Figure 1(a)), where the column lines correspond to the dog and cat classes, respectively.\n\nFigure 1: Three types of noise structure: (a) column-diagonal, (b) tri-diagonal, (c) block-diagonal. The vertical axis denotes the class of the ground-truth label, while the horizontal axis denotes the class of the noisy label. A white block marks a valid class transition, while a black block marks an invalid class transition.\n\nOther effort-consuming interactions stem from task annotation on Amazon Mechanical Turk 2. Even with great effort, amateur workers may be confused by very similar classes and yield noisy labels, due to their limited expertise. There are two practical cases.\nIn the fine-grained case, the transition from one class to a similar class is continuous (e.g., Australian terrier, Norfolk terrier, Norwich terrier, Irish terrier, and Scotch terrier), and workers make mistakes in adjacent positions [7]. Thus, human cognition can mask invalid class transitions (e.g., Australian terrier \u2260 Norwich terrier), and highlight valid class transitions (e.g., Norfolk terrier \u2194 Norwich terrier \u2194 Irish terrier).
This noise structure should be a tri-diagonal matrix (Figure 1(b)).\nIn the hierarchical-grained case, transitions among super-classes are discrete (e.g., aquatic mammals and flowers) and impossible, while transitions among sub-classes are continuous (e.g., aquatic mammals contain beaver, dolphin, otter, seal, and whale) and possible [32]. Thus, human cognition can mask invalid class transitions (e.g., aquatic mammals \u2260 flowers), and highlight valid class transitions (e.g., beaver \u2194 dolphin). This noise structure should be a block-diagonal matrix (Figure 1(c)), where each block represents a super-class.\nWhen we already know the noise structure from human cognition, we only need to focus on estimating the noise transition probabilities. However, how can we instill the structure information into the estimation procedure? We answer this question in the following sections.\n\n3 Learning with Masking\n\nWe briefly show how benchmark models handle noisy labels, and reveal their deficiencies (Section 3.1). Then, we show why the Masking approach can solve such issues (Section 3.2). Within the Masking framework, we present a straightforward idea to incorporate the structure information, and show its potential dilemma (Section 3.2.1). Fortunately, we find a suitable approach to incorporate the structure information into an end-to-end model (Section 3.2.2). To realize this approach, we encounter two practical challenges, and present principled solutions (Section 3.2.3).\n\n3.1 Deficiency of benchmark models\n\nFigure 2(a) is a basic model to train a classifier in the setting of noisy labels. Assume that x represents a \u201cDog\u201d image; its latent ground-truth label y should be the \u201cDog\u201d class. However, its annotated label \u1ef9 belongs to the \u201cCat\u201d class. In essence, the noisy label \u1ef9 is flipped from the ground-truth label y by an unknown noise transition matrix.
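To make this noise model concrete, here is a small self-contained sketch (our own illustration with arbitrary class counts and noise levels, not code from the paper): the three mask structures of Figure 1 written as boolean matrices, and noisy labels drawn from the row of a transition matrix T indexed by the ground-truth label y.

```python
import numpy as np

def column_diagonal_mask(k, noisy_cols):
    # Figure 1(a): self-transitions plus a few whole columns (e.g., dog/cat)
    m = np.eye(k, dtype=bool)
    m[:, noisy_cols] = True
    return m

def tri_diagonal_mask(k):
    # Figure 1(b): each class may flip only to an adjacent (similar) class
    idx = np.arange(k)
    return np.abs(idx[:, None] - idx[None, :]) <= 1

def block_diagonal_mask(k, block):
    # Figure 1(c): flips stay within each super-class block
    group = np.arange(k) // block
    return group[:, None] == group[None, :]

def sample_noisy_labels(y, T, rng):
    # flip each ground-truth label y_i to a noisy label using the row T[y_i]
    return np.array([rng.choice(T.shape[1], p=T[c]) for c in y])

rng = np.random.default_rng(0)
k = 10
mask = tri_diagonal_mask(k)

# A toy T supported on the mask: spread 20% noise over the valid off-diagonals.
T = np.where(mask & ~np.eye(k, dtype=bool), 1.0, 0.0)
T = T / T.sum(axis=1, keepdims=True) * 0.2
T[np.arange(k), np.arange(k)] = 1.0 - T.sum(axis=1)

y = rng.integers(0, k, size=1000)
y_noisy = sample_noisy_labels(y, T, rng)  # corrupted labels obey the structure
```

Because T is exactly zero outside the mask, an estimator constrained this way only has to fit the unmasked entries.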
Therefore, current techniques tend to improve the accuracy of the classifier (x \u2192 y) by estimating the noise transition matrix (y \u2192 \u1ef9).\nHere, we introduce two benchmark realizations of Figure 2(a). The first benchmark model comes from Patrini et al. [32], which uses the anchor set condition [21] to independently estimate the noise transition matrix (y \u2192 \u1ef9). Based on the estimated matrix, they learn the classifier (x \u2192 y) by the strategy of loss correction. However, the estimation phase is not justified for agnostic noisy data, which thus limits the performance of the classifier. The other benchmark model comes from Goldberger and Ben-Reuven [10], which unifies the two steps of the first benchmark model in a joint fashion. Specifically, the noise transition matrix (y \u2192 \u1ef9), modeled by a nonlinear softmax layer, connects the classifier (x \u2192 y) with the noisy labels \u1ef9 for end-to-end training. However, due to a finite dataset, this brute-force estimation may suffer from many local minima.\n\n2https://www.mturk.com/\n\nFigure 2: Comparison of benchmark models and our MASKING model: (a) benchmark model (left panel); (b) MASKING model (right panel). Assume that (x, \u1ef9) denotes the instance with the noisy label, and y represents the latent ground-truth label. T is the noise transition matrix, where Tij = Pr(\u1ef9 = ej | y = ei). MASKING models the matrix T by an explicit variable s. Thus, we embed a structure constraint on the variable s, where the structure information comes from human cognition h.\n\n3.2 Does structure matter?\n\nTherefore, given a finite dataset, can we leverage a constrained end-to-end model to solve the above deficiencies? The answer is affirmative.
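As a sketch of what such a constraint buys (a minimal parameterization of our own for illustration, not the paper's structure-aware model): whereas the benchmark end-to-end model learns all k×k entries of a softmax transition layer, a mask can send invalid entries to probability exactly zero, leaving only the unmasked entries as free parameters.

```python
import numpy as np

def masked_transition(theta, mask):
    # Row-stochastic transition matrix supported only on valid (unmasked)
    # entries: invalid entries are set to -inf before the row softmax,
    # so their probability is exactly zero.
    scores = np.where(mask, theta, -np.inf)
    scores = scores - scores.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

k = 5
rng = np.random.default_rng(1)
idx = np.arange(k)
mask = np.abs(idx[:, None] - idx[None, :]) <= 1   # tri-diagonal structure
theta = rng.normal(size=(k, k))                   # unconstrained parameters
T = masked_transition(theta, mask)
# Only the unmasked cells of theta influence T, so the estimation problem
# shrinks from k*k free parameters to the number of valid transitions.
```

During end-to-end training, gradients through such a layer move probability mass only among valid entries, matching the intuition that Masking reduces the estimation burden.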
The reason can be explained intuitively: when human cognition masks the invalid class transitions (e.g., cat \u2260 car), the structure information is available as a constraint. The constrained end-to-end model only focuses on estimating the noise transition probabilities. The estimation burden is largely reduced, and thus the brute-force estimation can find a good local minimum more easily.\nIn this paper, we summarize this human-assisted approach as \u201cMasking\u201d. Intuitively, with high probability, Masking means some class transitions will not happen (e.g., cat \u2260 car), while some class transitions will happen (e.g., beaver \u2194 dolphin). Before delving into our realization, we first present a straightforward idea to incorporate the structure information into an end-to-end model as follows.\n\n3.2.1 Straightforward dilemma\n\nFor structure instillation, the most straightforward idea is to add a regularizer to the objective of an end-to-end model. Namely, we can use a regularizer to represent the structure constraint. Due to the regularization effect in the optimization [11], the noise transition matrix learned by previous models [32, 10] will then satisfy our expectation regarding the structure. For example, we can apply the Lagrange multiplier to the benchmark model from [10] to instantiate this idea.\nHowever, such a deterministic method may not be easily implemented in practice, for three reasons. First, such a class of regularizers requires a suitable distance measure to compute the distance between the learned transition matrix and the prior, as well as a corresponding regularization parameter. In deep learning scenarios, it is quite hard to justify the choice of a distance measure, e.g., why choose the L2 distance rather than another distance measure for the structure instillation.
Second, if we leverage a noisy validation set, we need to construct the unbiased risk estimator for the backward correction. This is impossible, as the inverse of the estimated noise transition matrix cannot be accurately computed. Last but not least, even if we construct a clean validation set, we need to repeat the training procedure to tune the regularization parameter, which consumes remarkable computational resources. Specifically, it requires a lot of non-trivial trials in the training-validation phase to find an optimal weight. For example, the Residual-52 net will take about one week to be well trained on the WebVision dataset, consisting of 16 million images with noisy labels [21]. If we consider adding a
TbuvYniVgB0wi1TwzhFDJzK02HRFJqDY5VUwI7vLLq6R9UXedunt/WWvcFHGU4QRO4RxcuIIG3EETWkDhEZ7hFd6sifVivVsfi9aSVcwcwx9Ynz9uYpKEAAAB9HicbVBNS8NAEJ3Ur1q/qh69BIvgqSQi6LHoxWMF+wFtKJvNpF262cTdTSWE/g4vHhTx6o/x5r9x2+agrQ8GHu/NMDPPTzhT2nG+rdLa+sbmVnm7srO7t39QPTxqqziVFFs05rHs+kQhZwJbmmmO3UQiiXyOHX98O/M7E5SKxeJBZwl6ERkKFjJKtJG8/hMLUDMeYJ5NB9WaU3fmsFeJW5AaFGgOql/9IKZphEJTTpTquU6ivZxIzSjHaaWfKkwIHZMh9gwVJELl5fOjp/aZUQI7jKUpoe25+nsiJ5FSWeSbzojokVr2ZuJ/Xi/V4bWXM5GkGgVdLApTbuvYniVgB0wi1TwzhFDJzK02HRFJqDY5VUwI7vLLq6R9UXedunt/WWvcFHGU4QRO4RxcuIIG3EETWkDhEZ7hFd6sifVivVsfi9aSVcwcwx9Ynz9uYpKEyAAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KokIeix68diC/YA2lM120q7dbMLuRgihv8CLB0W8+pO8+W/ctjlo64OBx3szzMwLEsG1cd1vZ219Y3Nru7RT3t3bPzisHB23dZwqhi0Wi1h1A6pRcIktw43AbqKQRoHATjC5m/mdJ1Sax/LBZAn6ER1JHnJGjZWa2aBSdWvuHGSVeAWpQoHGoPLVH8YsjVAaJqjWPc9NjJ9TZTgTOC33U40JZRM6wp6lkkao/Xx+6JScW2VIwljZkobM1d8TOY20zqLAdkbUjPWyNxP/83qpCW/8nMskNSjZYlGYCmJiMvuaDLlCZkRmCWWK21sJG1NFmbHZlG0I3vLLq6R9WfPcmte8qtZvizhKcApncAEeXEMd7qEBLWCA8Ayv8OY8Oi/Ou/OxaF1zipkT+APn8wfnvYz9AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KokIeix68diC/YA2lM120q7dbMLuRgihv8CLB0W8+pO8+W/ctjlo64OBx3szzMwLEsG1cd1vZ219Y3Nru7RT3t3bPzisHB23dZwqhi0Wi1h1A6pRcIktw43AbqKQRoHATjC5m/mdJ1Sax/LBZAn6ER1JHnJGjZWa2aBSdWvuHGSVeAWpQoHGoPLVH8YsjVAaJqjWPc9NjJ9TZTgTOC33U40JZRM6wp6lkkao/Xx+6JScW2VIwljZkobM1d8TOY20zqLAdkbUjPWyNxP/83qpCW/8nMskNSjZYlGYCmJiMvuaDLlCZkRmCWWK21sJG1NFmbHZlG0I3vLLq6R9WfPcmte8qtZvizhKcApncAEeXEMd7qEBLWCA8Ayv8OY8Oi/Ou/OxaF1zipkT+APn8wfnvYz9AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KokIeix68diC/YA2lM120q7dbMLuRgihv8CLB0W8+pO8+W/ctjlo64OBx3szzMwLEsG1cd1vZ219Y3Nru7RT3t3bPzisHB23dZwqhi0Wi1h1A6pRcIktw43AbqKQRoHATjC5m/mdJ1Sax/LBZAn6ER1JHnJGjZWa2aBSdWvuHGSVeAWpQoHGoPLVH8YsjVAaJqjWPc9NjJ9TZTgTOC33U40JZRM6wp6lkkao/Xx+6JScW2VIwljZkobM1d8TOY20zqLAdkbUjPWyNxP/83qpCW/8nMskNSjZYlGYCmJiMvuaDLlCZkRmCWWK21sJG1NFmbHZlG0I3vLLq6R9WfPcmte8qtZvizhKcApncAEeXEMd7qEBLWCA8Ayv8OY8Oi/Ou/OxaF1zipkT+APn8wfnvYz9AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KokIeix68diC/YA2lM120q7dbMLuRgihv8CLB0W8+pO8+W/ctjlo64OBx3szzMwLEsG1cd1vZ219Y3
Nru7RT3t3bPzisHB23dZwqhi0Wi1h1A6pRcIktw43AbqKQRoHATjC5m/mdJ1Sax/LBZAn6ER1JHnJGjZWa2aBSdWvuHGSVeAWpQoHGoPLVH8YsjVAaJqjWPc9NjJ9TZTgTOC33U40JZRM6wp6lkkao/Xx+6JScW2VIwljZkobM1d8TOY20zqLAdkbUjPWyNxP/83qpCW/8nMskNSjZYlGYCmJiMvuaDLlCZkRmCWWK21sJG1NFmbHZlG0I3vLLq6R9WfPcmte8qtZvizhKcApncAEeXEMd7qEBLWCA8Ayv8OY8Oi/Ou/OxaF1zipkT+APn8wfnvYz9xAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cW7Ae0oWy2k3btZhN2N2IJ/QVePCji1Z/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4dua3H1FpHst7M0nQj+hQ8pAzaqzUeOqXK27VnYOsEi8nFchR75e/eoOYpRFKwwTVuuu5ifEzqgxnAqelXqoxoWxMh9i1VNIItZ/ND52SM6sMSBgrW9KQufp7IqOR1pMosJ0RNSO97M3E/7xuasJrP+MySQ1KtlgUpoKYmMy+JgOukBkxsYQyxe2thI2ooszYbEo2BG/55VXSuqh6btVrXFZqN3kcRTiBUzgHD66gBndQhyYwQHiGV3hzHpwX5935WLQWnHzmGP7A+fwB5jmM/A==AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cW7Ae0oWy2k3btZhN2N2IJ/QVePCji1Z/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4dua3H1FpHst7M0nQj+hQ8pAzaqzUeOqXK27VnYOsEi8nFchR75e/eoOYpRFKwwTVuuu5ifEzqgxnAqelXqoxoWxMh9i1VNIItZ/ND52SM6sMSBgrW9KQufp7IqOR1pMosJ0RNSO97M3E/7xuasJrP+MySQ1KtlgUpoKYmMy+JgOukBkxsYQyxe2thI2ooszYbEo2BG/55VXSuqh6btVrXFZqN3kcRTiBUzgHD66gBndQhyYwQHiGV3hzHpwX5935WLQWnHzmGP7A+fwB5jmM/A==AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cW7Ae0oWy2k3btZhN2N2IJ/QVePCji1Z/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4dua3H1FpHst7M0nQj+hQ8pAzaqzUeOqXK27VnYOsEi8nFchR75e/eoOYpRFKwwTVuuu5ifEzqgxnAqelXqoxoWxMh9i1VNIItZ/ND52SM6sMSBgrW9KQufp7IqOR1pMosJ0RNSO97M3E/7xuasJrP+MySQ1KtlgUpoKYmMy+JgOukBkxsYQyxe2thI2ooszYbEo2BG/55VXSuqh6btVrXFZqN3kcRTiBUzgHD66gBndQhyYwQHiGV3hzHpwX5935WLQWnHzmGP7A+fwB5jmM/A==AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRi8cW7Ae0oWy2k3btZhN2N2IJ/QVePCji1Z/kzX/jts1BWx8MPN6bYWZekAiujet+O4W19Y3NreJ2aWd3b/+gfHjU0nGqGDZZLGLVCahGwSU2DTcCO4lCGgUC28H4dua3H1FpHst7M0nQj+hQ8pAzaqzUeOqXK27VnYOsEi8nFchR75e/eoOYpRFKwwTVuuu5ifEzqgxnAqelXqoxoWxMh9i1VNIItZ/ND52SM6sMSBgrW9KQufp7IqOR1pMosJ0RNSO97M3E/7xuasJrP+MySQ1KtlgUpoKYmMy+JgOuk
BkxsYQyxe2thI2ooszYbEo2BG/55VXSuqh6btVrXFZqN3kcRTiBUzgHD66gBndQhyYwQHiGV3hzHpwX5935WLQWnHzmGP7A+fwB5jmM/A==yAAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KokIeix68diC/YA2lM120q7dbMLuRgihv8CLB0W8+pO8+W/ctjlo64OBx3szzMwLEsG1cd1vZ219Y3Nru7RT3t3bPzisHB23dZwqhi0Wi1h1A6pRcIktw43AbqKQRoHATjC5m/mdJ1Sax/LBZAn6ER1JHnJGjZWa2aBSdWvuHGSVeAWpQoHGoPLVH8YsjVAaJqjWPc9NjJ9TZTgTOC33U40JZRM6wp6lkkao/Xx+6JScW2VIwljZkobM1d8TOY20zqLAdkbUjPWyNxP/83qpCW/8nMskNSjZYlGYCmJiMvuaDLlCZkRmCWWK21sJG1NFmbHZlG0I3vLLq6R9WfPcmte8qtZvizhKcApncAEeXEMd7qEBLWCA8Ayv8OY8Oi/Ou/OxaF1zipkT+APn8wfnvYz9AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KokIeix68diC/YA2lM120q7dbMLuRgihv8CLB0W8+pO8+W/ctjlo64OBx3szzMwLEsG1cd1vZ219Y3Nru7RT3t3bPzisHB23dZwqhi0Wi1h1A6pRcIktw43AbqKQRoHATjC5m/mdJ1Sax/LBZAn6ER1JHnJGjZWa2aBSdWvuHGSVeAWpQoHGoPLVH8YsjVAaJqjWPc9NjJ9TZTgTOC33U40JZRM6wp6lkkao/Xx+6JScW2VIwljZkobM1d8TOY20zqLAdkbUjPWyNxP/83qpCW/8nMskNSjZYlGYCmJiMvuaDLlCZkRmCWWK21sJG1NFmbHZlG0I3vLLq6R9WfPcmte8qtZvizhKcApncAEeXEMd7qEBLWCA8Ayv8OY8Oi/Ou/OxaF1zipkT+APn8wfnvYz9AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KokIeix68diC/YA2lM120q7dbMLuRgihv8CLB0W8+pO8+W/ctjlo64OBx3szzMwLEsG1cd1vZ219Y3Nru7RT3t3bPzisHB23dZwqhi0Wi1h1A6pRcIktw43AbqKQRoHATjC5m/mdJ1Sax/LBZAn6ER1JHnJGjZWa2aBSdWvuHGSVeAWpQoHGoPLVH8YsjVAaJqjWPc9NjJ9TZTgTOC33U40JZRM6wp6lkkao/Xx+6JScW2VIwljZkobM1d8TOY20zqLAdkbUjPWyNxP/83qpCW/8nMskNSjZYlGYCmJiMvuaDLlCZkRmCWWK21sJG1NFmbHZlG0I3vLLq6R9WfPcmte8qtZvizhKcApncAEeXEMd7qEBLWCA8Ayv8OY8Oi/Ou/OxaF1zipkT+APn8wfnvYz9AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KokIeix68diC/YA2lM120q7dbMLuRgihv8CLB0W8+pO8+W/ctjlo64OBx3szzMwLEsG1cd1vZ219Y3Nru7RT3t3bPzisHB23dZwqhi0Wi1h1A6pRcIktw43AbqKQRoHATjC5m/mdJ1Sax/LBZAn6ER1JHnJGjZWa2aBSdWvuHGSVeAWpQoHGoPLVH8YsjVAaJqjWPc9NjJ9TZTgTOC33U40JZRM6wp6lkkao/Xx+6JScW2VIwljZkobM1d8TOY20zqLAdkbUjPWyNxP/83qpCW/8nMskNSjZYlGYCmJiMvuaDLlCZkRmCWWK21sJG1NFmbHZlG0I3vLLq6R9WfPcmte8qtZvizhKcApncAEeXEMd7qEBLWCA8Ayv8OY8Oi/Ou/OxaF1zipkT+APn8wfnvYz9sAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48t2A9oQ9lsJ+3azSbsboQS+gu8eFDEqz/Jm//GbZuDtj4YeLw3w8y8IBFcG9f9dgobm1vbO8Xd0t7+weFR+fikreN
UMWyxWMSqG1CNgktsGW4EdhOFNAoEdoLJ3dzvPKHSPJYPZpqgH9GR5CFn1FipqQflilt1FyDrxMtJBXI0BuWv/jBmaYTSMEG17nluYvyMKsOZwFmpn2pMKJvQEfYslTRC7WeLQ2fkwipDEsbKljRkof6eyGik9TQKbGdEzVivenPxP6+XmvDGz7hMUoOSLReFqSAmJvOvyZArZEZMLaFMcXsrYWOqKDM2m5INwVt9eZ20r6qeW/Wa15X6bR5HEc7gHC7BgxrU4R4a0AIGCM/wCm/Oo/PivDsfy9aCk8+cwh84nz/epYz3AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48t2A9oQ9lsJ+3azSbsboQS+gu8eFDEqz/Jm//GbZuDtj4YeLw3w8y8IBFcG9f9dgobm1vbO8Xd0t7+weFR+fikreNUMWyxWMSqG1CNgktsGW4EdhOFNAoEdoLJ3dzvPKHSPJYPZpqgH9GR5CFn1FipqQflilt1FyDrxMtJBXI0BuWv/jBmaYTSMEG17nluYvyMKsOZwFmpn2pMKJvQEfYslTRC7WeLQ2fkwipDEsbKljRkof6eyGik9TQKbGdEzVivenPxP6+XmvDGz7hMUoOSLReFqSAmJvOvyZArZEZMLaFMcXsrYWOqKDM2m5INwVt9eZ20r6qeW/Wa15X6bR5HEc7gHC7BgxrU4R4a0AIGCM/wCm/Oo/PivDsfy9aCk8+cwh84nz/epYz3AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48t2A9oQ9lsJ+3azSbsboQS+gu8eFDEqz/Jm//GbZuDtj4YeLw3w8y8IBFcG9f9dgobm1vbO8Xd0t7+weFR+fikreNUMWyxWMSqG1CNgktsGW4EdhOFNAoEdoLJ3dzvPKHSPJYPZpqgH9GR5CFn1FipqQflilt1FyDrxMtJBXI0BuWv/jBmaYTSMEG17nluYvyMKsOZwFmpn2pMKJvQEfYslTRC7WeLQ2fkwipDEsbKljRkof6eyGik9TQKbGdEzVivenPxP6+XmvDGz7hMUoOSLReFqSAmJvOvyZArZEZMLaFMcXsrYWOqKDM2m5INwVt9eZ20r6qeW/Wa15X6bR5HEc7gHC7BgxrU4R4a0AIGCM/wCm/Oo/PivDsfy9aCk8+cwh84nz/epYz3AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48t2A9oQ9lsJ+3azSbsboQS+gu8eFDEqz/Jm//GbZuDtj4YeLw3w8y8IBFcG9f9dgobm1vbO8Xd0t7+weFR+fikreNUMWyxWMSqG1CNgktsGW4EdhOFNAoEdoLJ3dzvPKHSPJYPZpqgH9GR5CFn1FipqQflilt1FyDrxMtJBXI0BuWv/jBmaYTSMEG17nluYvyMKsOZwFmpn2pMKJvQEfYslTRC7WeLQ2fkwipDEsbKljRkof6eyGik9TQKbGdEzVivenPxP6+XmvDGz7hMUoOSLReFqSAmJvOvyZArZEZMLaFMcXsrYWOqKDM2m5INwVt9eZ20r6qeW/Wa15X6bR5HEc7gHC7BgxrU4R4a0AIGCM/wCm/Oo/PivDsfy9aCk8+cwh84nz/epYz3hAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48t2A9oQ9lsJ+3azSbsboQS+gu8eFDEqz/Jm//GbZuDtj4YeLw3w8y8IBFcG9f9dgobm1vbO8Xd0t7+weFR+fikreNUMWyxWMSqG1CNgktsGW4EdhOFNAoEdoLJ3dzvPKHSPJYPZpqgH9GR5CFn1FipOR6UK27VXYCsEy8nFcjRGJS/+sOYpRFKwwTVuue5ifEzqgxnAmelfqoxoWxCR9izVNIItZ8tDp2RC6sMSRgrW9KQhfp7IqOR1tMosJ0RNWO96s3F/7xeasIbP+MySQ1KtlwUpoKYmMy/JkOukBkxtYQyxe2thI2poszYbEo2BG/15XXSvq
p6btVrXlfqt3kcRTiDc7gED2pQh3toQAsYIDzDK7w5j86L8+58LFsLTj5zCn/gfP4AzfmM7A==AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48t2A9oQ9lsJ+3azSbsboQS+gu8eFDEqz/Jm//GbZuDtj4YeLw3w8y8IBFcG9f9dgobm1vbO8Xd0t7+weFR+fikreNUMWyxWMSqG1CNgktsGW4EdhOFNAoEdoLJ3dzvPKHSPJYPZpqgH9GR5CFn1FipOR6UK27VXYCsEy8nFcjRGJS/+sOYpRFKwwTVuue5ifEzqgxnAmelfqoxoWxCR9izVNIItZ8tDp2RC6sMSRgrW9KQhfp7IqOR1tMosJ0RNWO96s3F/7xeasIbP+MySQ1KtlwUpoKYmMy/JkOukBkxtYQyxe2thI2poszYbEo2BG/15XXSvqp6btVrXlfqt3kcRTiDc7gED2pQh3toQAsYIDzDK7w5j86L8+58LFsLTj5zCn/gfP4AzfmM7A==AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48t2A9oQ9lsJ+3azSbsboQS+gu8eFDEqz/Jm//GbZuDtj4YeLw3w8y8IBFcG9f9dgobm1vbO8Xd0t7+weFR+fikreNUMWyxWMSqG1CNgktsGW4EdhOFNAoEdoLJ3dzvPKHSPJYPZpqgH9GR5CFn1FipOR6UK27VXYCsEy8nFcjRGJS/+sOYpRFKwwTVuue5ifEzqgxnAmelfqoxoWxCR9izVNIItZ8tDp2RC6sMSRgrW9KQhfp7IqOR1tMosJ0RNWO96s3F/7xeasIbP+MySQ1KtlwUpoKYmMy/JkOukBkxtYQyxe2thI2poszYbEo2BG/15XXSvqp6btVrXlfqt3kcRTiDc7gED2pQh3toQAsYIDzDK7w5j86L8+58LFsLTj5zCn/gfP4AzfmM7A==AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lEqMeiF48t2A9oQ9lsJ+3azSbsboQS+gu8eFDEqz/Jm//GbZuDtj4YeLw3w8y8IBFcG9f9dgobm1vbO8Xd0t7+weFR+fikreNUMWyxWMSqG1CNgktsGW4EdhOFNAoEdoLJ3dzvPKHSPJYPZpqgH9GR5CFn1FipOR6UK27VXYCsEy8nFcjRGJS/+sOYpRFKwwTVuue5ifEzqgxnAmelfqoxoWxCR9izVNIItZ8tDp2RC6sMSRgrW9KQhfp7IqOR1tMosJ0RNWO96s3F/7xeasIbP+MySQ1KtlwUpoKYmMy/JkOukBkxtYQyxe2thI2poszYbEo2BG/15XXSvqp6btVrXlfqt3kcRTiDc7gED2pQh3toQAsYIDzDK7w5j86L8+58LFsLTj5zCn/gfP4AzfmM7A==eyAAAB9HicbVBNS8NAEJ3Ur1q/qh69BIvgqSQi6LHoxWMF+wFtKJvNpF262cTdTSWE/g4vHhTx6o/x5r9x2+agrQ8GHu/NMDPPTzhT2nG+rdLa+sbmVnm7srO7t39QPTxqqziVFFs05rHs+kQhZwJbmmmO3UQiiXyOHX98O/M7E5SKxeJBZwl6ERkKFjJKtJG8/hMLUDMeYJ5NB9WaU3fmsFeJW5AaFGgOql/9IKZphEJTTpTquU6ivZxIzSjHaaWfKkwIHZMh9gwVJELl5fOjp/aZUQI7jKUpoe25+nsiJ5FSWeSbzojokVr2ZuJ/Xi/V4bWXM5GkGgVdLApTbuvYniVgB0wi1TwzhFDJzK02HRFJqDY5VUwI7vLLq6R9UXedunt/WWvcFHGU4QRO4RxcuIIG3EETWkDhEZ7hFd6sifVivVsfi9aSVcwcwx9Ynz9uYpKEAAAB9HicbVBNS8NAEJ3Ur1q/qh69BIvgqSQi6LHoxWMF+wFtKJvNpF262cTdTSWE/g4vHhTx6o/x5r9x2+agrQ8GHu/NMDPPTzhT2nG+rdLa+sbmVnm7srO7t39QPTxqqziVFFs05rHs
regularizer to instill the structure information, each trial of adjusting the weight is prohibitively costly. To sum up, we do not consider this straightforward regularization approach.

3.2.2 When structure meets generative model

Based on the previous discussions, we conjecture that the Bayesian method should be a more suitable tool to model the structure information, since the structure information can be explicitly represented as the prior.
Following this conjecture, we deduce an end-to-end probabilistic model that incorporates the structure information, as shown in Figure 2(b).

Compared to the benchmark models in Figure 2(a), we model the noise transition matrix with a random variable s, and we instill the structure information by controlling the prior of its corresponding structure variable, termed so. Here, we assume there exists a deterministic function f(·) such that so = f(s). This assumption can be intuitively justified by the observation that, once an arbitrary matrix is given, human cognition can certainly describe its structure, e.g., diagonal or tri-diagonal; thus, there must be a function that implements the mapping from s to its structure so. For clarity, we present an exemplar generative process for the "multi-class" & "single-label" classification problem with noisy supervision.

• The latent ground-truth label y ∼ P(y|x), where P(y|x) is a Categorical distribution3.
• The noise transition matrix s ∼ P(s) and its structure so ∼ P(so), where P(s) is an implicit distribution modeled by neural networks without an exact form (i.e., a multi-Dirac distribution), P(so) = P(s) (ds/dso)|so=f(s), and f(·) is the mapping function from s to so.
• The noisy label ỹ ∼ P(ỹ|y, s), where P(ỹ|y, s) models the transition from y to ỹ given s.

According to the above generative process, we can deduce the following evidence lower bound (ELBO) (details in Appendix A) to approximate the log-likelihood of the noisy data, which we name MASKING.
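The three-step generative process above can be sketched with a small NumPy simulation (a toy illustration only: the class posterior, the transition matrix, and all numbers are made up for the example and are not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)
C = 3  # number of classes (toy example)

# P(y|x): Categorical class posterior for one example x (illustrative values).
p_y_given_x = np.array([0.8, 0.15, 0.05])

# s ~ P(s): one draw of a row-stochastic noise transition matrix; a fixed toy
# matrix stands in for the implicit neural-network distribution P(s).
s = np.array([[0.7, 0.3, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.3, 0.7]])

# y ~ P(y|x): sample the latent ground-truth label.
y = rng.choice(C, p=p_y_given_x)

# y_noisy ~ P(y_noisy | y, s): sample the noisy label from row y of s.
y_noisy = rng.choice(C, p=s[y])

print("ground truth:", y, " noisy label:", y_noisy)
```

Each observed noisy label is thus explained jointly by the classifier's posterior and a single row of the transition matrix.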
MASKING is a structure-aware probabilistic model:

ln P(ỹ|x) ≥ E_{Q(s)} [ ln Σ_y P(ỹ|y, s) P(y|x) − ln( Q(so)/P(so) )|_{so=f(s)} ],   (1)

where the first term inside the expectation corresponds to the previous model and P(so) is the structure prior; Q(s) is the variational distribution that approximates the posterior of the noise transition matrix s, and Q(so) = Q(s) (ds/dso)|_{so=f(s)} is the corresponding variational distribution of the structure so. Eq. (1) seamlessly unifies the previous models and structure instillation, as remarked in the following benefits.

Remark 1 The first term inside the expectation in Eq. (1) recovers the previous benchmark models, representing the log-likelihood from x to the noisy label ỹ. The second term inside the expectation in Eq. (1) is for structure instillation, reflecting the inconsistency between the distribution Q(so) learned from the training data and the structure prior P(so) provided by human cognition.

As a whole, the MASKING model benefits from the human guidance (the second term) in the procedure of learning with noisy supervision (the first term), which avoids the unexpected local minima with incorrect structures in previous works. Moreover, we avoid the difficulty of hyperparameter selection by deducing a Bayesian framework, which does not require a regularization parameter to be tuned. Note that human cognition may introduce uncertainty. In this case, we focus only on the certain knowledge and use it as the prior, while we discard the uncertain knowledge via Masking. A special case arises when we have no transition knowledge at all: we can unmask the whole matrix, i.e., allow all possible transitions, which naturally degenerates our MASKING model to the unconstrained S-adaptation method.

3For the single-label classification, a Categorical distribution is a natural choice. For the multi-label classification, a Multinomial distribution is used, since one example corresponds to multiple labels.

3.2.3 Towards principled realization

To realize the MASKING model, concretely the second term in Eq. (1), we encounter two practical challenges, and we present principled solutions as follows.

Challenge from structure extraction: One challenge comes from how to specify the mapping function f(·) in Eq. (1), which extracts the structure variable so from the variable s. Without this step, we cannot compute the structure variable so for Q(so) = Q(s) (ds/dso)|_{so=f(s)}, let alone optimize the second term in Eq. (1). However, to the best of our knowledge, no related work specifies f(·) in the area of structure extraction.

Here, we explore a principled solution by simulating human cognition on the structure of the noise transition matrix. In terms of the noise transition probability between two classes, human cognition considers a small value (e.g., 0.5%) as a sign of an invalid class transition, but a large value (e.g., 20%) as a noise pattern [14]. This indicates that larger transition probabilities are favored when quantizing the noise transition matrix into a structure. Such a procedure is very similar to a thresholding binarization operation. Thus, we use the following tempered sigmoid function as f(·) to simulate the mapping from s to so,

f(s) = 1 / (1 + exp(−(s − α)/β)), where α ∈ (0, 1), β ≪ 1,   (2)

and we name its output f(s) the masked structure so of s. By controlling the location parameter α and the scale parameter β, Eq.
(2) quantizes s into the structure so for the second term in Eq. (1).

Challenge from structure alignment: The second term in Eq. (1) makes the structure learned from the training data (represented by Q(so)) close to the human prior (represented by P(so)). We consider this structure alignment. The challenge here is how to specify the distributions P(so) and Q(so) in Eq. (1) so as to reasonably measure their divergence for the structure alignment; the smaller the divergence between P(so) and Q(so), the more similar they are.

This question is difficult because, for the prior P(so), we usually provide one or a few limited structure candidates, and human cognition has sparse empirical certainty [13] for a specific noisy dataset. That is to say, P(so) should be a distribution that concentrates on one or a few points, e.g., a multi-Dirac distribution or a multi-spike distribution4. Such distributions are quite unstable for optimization, since they are approximately discrete and prone to causing computational overflow [36]. Regarding Q(so), specifying it is equivalent to specifying Q(s), since they are related by so = f(s). If we could find a Q(s) such that E_{Q(s)}[ln Q(so)/P(so)] is analytically computable, we could avoid this challenge; however, such a specification with existing distributions is usually intractable.

Fortunately, we can employ implicit models to resolve the above dilemma [42], because in implicit models the distribution is directly simulated with a neural network plus random noise, which avoids specifying an explicit distribution. Following this methodology, Q(s), Q(so) and P(so) in Eq. (1) can be implemented like Generative Adversarial Networks (GANs). Specifically, Q(s) and the divergence between Q(so) and P(so) are parameterized with two neural networks, i.e., one generator and one discriminator, which then play an adversarial game [12].
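To make the structure-extraction step concrete, here is a minimal sketch of the tempered sigmoid f(·) from Eq. (2), applied elementwise to a toy transition matrix (the values of α and β are illustrative, not the paper's settings):

```python
import numpy as np

def tempered_sigmoid(s, alpha=0.1, beta=0.01):
    """Eq. (2): soft thresholding of transition probabilities.
    alpha locates the threshold in (0, 1); beta << 1 makes it sharp."""
    return 1.0 / (1.0 + np.exp(-(s - alpha) / beta))

# Toy tri-diagonal noise transition matrix.
s = np.array([[0.70, 0.30, 0.00],
              [0.20, 0.60, 0.20],
              [0.00, 0.30, 0.70]])

# Extract the structure: entries above alpha saturate to ~1 (valid
# transitions), entries below alpha collapse to ~0 (masked transitions).
so = tempered_sigmoid(s)
print(np.round(so))
```

Because β is small, the output is nearly binary while remaining differentiable, which is what lets the extraction step sit inside an end-to-end model.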
However, different from the original GANs, our model has one extra discriminator (called the reconstructor), since the first term in Eq. (1) involves s, which acts as the second discriminator during the game.

Concretely, the corresponding implementation of each term in Eq. (1) is illustrated in Figure 3, which consists of three modules: a generator, a discriminator and a reconstructor. The generator is responsible for generating a distribution Q(s) of the noise transition matrix s, which serves both terms in Eq. (1). The discriminator implements the second term in Eq. (1), measuring the difference M(so, ŝo) between the structure so extracted with Eq. (2) and our prior structure ŝo. The reconstructor is for the first term in Eq. (1), combining the classifier prediction P(y|x) with the noise transition matrix s to yield noisy labels ỹ. In this way, we instill the structure information from human cognition into an end-to-end model.

Figure 3: A GAN-like structure to model the structure instillation on learning with noisy supervision.

4https://en.wikipedia.org/wiki/Dirac_delta_function; https://en.wikipedia.org/wiki/Spike-and-slab_variable_selection

4 Related literature

Besides the works mentioned before, we survey other solutions for noisy labels here. Statistical learning focuses on theoretical guarantees, consisting of three directions: surrogate losses, noise rate estimation and probabilistic modeling. For example, in surrogate losses, Natarajan et al. [31] proposed an unbiased estimator to provide the noise-corrected loss approach. Masnadi-Shirazi et al. [27] presented a robust non-convex loss, which is a special case of a family of robust losses [16]. In noise rate estimation, both Menon et al. [28] and Liu et al. [23] proposed a class-probability estimator using order statistics on the range of scores. Sanderson et al.
[38] presented the same estimator using the slope of the ROC curve. In probabilistic modeling, Raykar et al. [33] proposed a two-coin model to handle noisy labels from multiple annotators. Yan et al. [47] extended this two-coin model by setting a dynamic flipping probability associated with samples.

Deep learning achieves better performance due to its complex nonlinearity [19, 44, 17, 49]. For example, Li et al. proposed a unified framework to distill knowledge from clean labels and a knowledge graph [22], which can be exploited to learn a better model from noisy labels. Veit et al. trained a label cleaning network on a small set of clean labels, and used this network to reduce the noise in large-scale noisy labels [43]. Tanaka et al. presented a joint optimization framework to learn parameters and estimate true labels simultaneously [40]. Ren et al. leveraged an additional validation set to adaptively assign weights to training examples in every iteration [35]. Rodrigues et al. added a crowd layer after the output layer for noisy labels from multiple annotators [37]. However, all these methods require extra resources (e.g., a knowledge graph or a validation set) or more complex networks.

5 Experiments

In this section, we verify the robustness of MASKING in two respects. First, we conduct experiments on two benchmark datasets with three types of noise structure, namely (1) column-diagonal (Figure 1(a)); (2) tri-diagonal (Figure 1(b)); and (3) block-diagonal (Figure 1(c)). Second, we conduct experiments on an industrial-level dataset with an agnostic noise structure.

Benchmark datasets. The CIFAR-10 and CIFAR-100 datasets are used. Both datasets consist of 50k samples for training and 10k samples for testing, where each sample is a 32 × 32 color image and its label.
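Injecting structured label noise into a clean benchmark amounts to flipping each label according to a row of a predefined transition matrix. A minimal sketch (the 3-class matrix and noise level here are illustrative only, not the probabilities from Table 2):

```python
import numpy as np

def flip_labels(labels, T, rng):
    """Corrupt each clean label y by sampling a noisy label
    from row T[y] of the row-stochastic transition matrix T."""
    C = T.shape[0]
    return np.array([rng.choice(C, p=T[y]) for y in labels])

rng = np.random.default_rng(0)
# Toy column-diagonal-style structure: classes 1 and 2 keep their
# label w.p. 0.7 and flip into class 0 w.p. 0.3; class 0 is clean.
T = np.array([[1.0, 0.0, 0.0],
              [0.3, 0.7, 0.0],
              [0.3, 0.0, 0.7]])

clean = rng.integers(0, 3, size=1000)
noisy = flip_labels(clean, T, rng)
print("empirical noise rate:", np.mean(noisy != clean))
```

The empirical noise rate concentrates around the average off-diagonal mass of T, here roughly 0.2 for uniformly distributed clean labels.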
For CIFAR-10, we randomly flip the labels of the training set according to the first two types of noise structure, illustrated in Figure 1 with predefined transition probabilities. For CIFAR-100, we implement a similar procedure to generate the noisy data, but follow the last type of noise structure, shown in Figure 1(c). All predefined transition probabilities can be found in Table 2 (4th row).

Industrial-level dataset. An industrial-level dataset called Clothing1M [46], collected from online shopping websites (i.e., Taobao.com), is used here; its ground-truth transition matrix is not available. Clothing1M includes mislabeled images of different clothes, such as hoodie, jacket and windbreaker. This dataset consists of 1000k samples for training and 1k samples for testing, where each sample is a 256 × 256 color image and its label. Although we cannot know the accurate structure prior for Clothing1M, we can distill an approximate structure from the pre-estimated transition matrix [46]. We consider transition probabilities greater than 0.1 as valid transition patterns, and those smaller than 0.01 as invalid transition patterns.

Baselines and measurements. We compare MASKING with the state-of-the-art and most related techniques for noisy supervision: (1) forward correction [32] (F-correction) and (2) S-adaptation [10]. We also compare it with directly training deep networks on noisy data (marked as (3) NOISY) and on clean data (marked as (4) CLEAN). The performance of CLEAN can be viewed as an oracle or an upper bound. Prediction accuracy on the test set is used to evaluate the classification performance of each model. Besides, we qualitatively visualize the noise transition matrix once the training of each model converges, to analyze whether the true noise transition matrix is approached.

Implementations.
All experiments are conducted on an NVIDIA TITAN GPU, and all methods are implemented in TensorFlow. We adopt the same base network as the classifier for all methods, and apply the cross-entropy loss to the noisy labels. The stochastic gradient descent optimizer is used to update the parameters of the baselines and of the classifier in MASKING. For the generator and the
ZDjSOo0C2xlhM9KL3lT8z+smJrz0MybixFBB5ovChCMj0fR1NGCKEsNTSzBRzN6KyAgrTIwNqGRD8BZfXiats5rn1rzb80r9Ko+jCEdwDFXw4ALqcAMNaAKBe3iGV3hzpPPivDsf89aCk88cwh84nz8LYI7EAAAB7XicbVBNSwMxEJ2tX7V+VT16CRahXsquCHosevFYwX5Au5Rsmm1js8mSZMVl7X/w4kERr/4fb/4b03YP2vpg4PHeDDPzgpgzbVz32ymsrK6tbxQ3S1vbO7t75f2DlpaJIrRJJJeqE2BNORO0aZjhtBMriqOA03Ywvp767QeqNJPizqQx9SM8FCxkBBsrtRrV9OnxtF+uuDV3BrRMvJxUIEejX/7qDSRJIioM4VjrrufGxs+wMoxwOin1Ek1jTMZ4SLuWChxR7WezayfoxCoDFEplSxg0U39PZDjSOo0C2xlhM9KL3lT8z+smJrz0MybixFBB5ovChCMj0fR1NGCKEsNTSzBRzN6KyAgrTIwNqGRD8BZfXiats5rn1rzb80r9Ko+jCEdwDFXw4ALqcAMNaAKBe3iGV3hzpPPivDsf89aCk88cwh84nz8LYI7EAAAB7XicbVBNSwMxEJ2tX7V+VT16CRahXsquCHosevFYwX5Au5Rsmm1js8mSZMVl7X/w4kERr/4fb/4b03YP2vpg4PHeDDPzgpgzbVz32ymsrK6tbxQ3S1vbO7t75f2DlpaJIrRJJJeqE2BNORO0aZjhtBMriqOA03Ywvp767QeqNJPizqQx9SM8FCxkBBsrtRrV9OnxtF+uuDV3BrRMvJxUIEejX/7qDSRJIioM4VjrrufGxs+wMoxwOin1Ek1jTMZ4SLuWChxR7WezayfoxCoDFEplSxg0U39PZDjSOo0C2xlhM9KL3lT8z+smJrz0MybixFBB5ovChCMj0fR1NGCKEsNTSzBRzN6KyAgrTIwNqGRD8BZfXiats5rn1rzb80r9Ko+jCEdwDFXw4ALqcAMNaAKBe3iGV3hzpPPivDsf89aCk88cwh84nz8LYI7EAAAB7XicbVBNSwMxEJ2tX7V+VT16CRahXsquCHosevFYwX5Au5Rsmm1js8mSZMVl7X/w4kERr/4fb/4b03YP2vpg4PHeDDPzgpgzbVz32ymsrK6tbxQ3S1vbO7t75f2DlpaJIrRJJJeqE2BNORO0aZjhtBMriqOA03Ywvp767QeqNJPizqQx9SM8FCxkBBsrtRrV9OnxtF+uuDV3BrRMvJxUIEejX/7qDSRJIioM4VjrrufGxs+wMoxwOin1Ek1jTMZ4SLuWChxR7WezayfoxCoDFEplSxg0U39PZDjSOo0C2xlhM9KL3lT8z+smJrz0MybixFBB5ovChCMj0fR1NGCKEsNTSzBRzN6KyAgrTIwNqGRD8BZfXiats5rn1rzb80r9Ko+jCEdwDFXw4ALqcAMNaAKBe3iGV3hzpPPivDsf89aCk88cwh84nz8LYI7EP(\u02dcy|y,s)AAAB+XicbVBNS8NAEN3Ur1q/oh69LBahgpREBD0WvXisYD+gDWWz2bRLN5uwOymE2H/ixYMiXv0n3vw3btsctPXBwOO9GWbm+YngGhzn2yqtrW9sbpW3Kzu7e/sH9uFRW8epoqxFYxGrrk80E1yyFnAQrJsoRiJfsI4/vpv5nQlTmsfyEbKEeREZSh5ySsBIA9tu1vrARcDybPqUXejzgV116s4ceJW4BamiAs2B/dUPYppGTAIVROue6yTg5UQBp4JNK/1Us4TQMRmynqGSREx7+fzyKT4zSoDDWJmSgOfq74mcRFpnkW86IwIjvezNxP+8XgrhjZdzmaTAJF0sClOBIcazGHDAFaMgMkMIVdzciumIKELBhFUxIbjLL6+S9mXdderuw1W1cVvEUUYn6BTVkIuuUQPdoyZqIYom6Bm9ojcrt16sd+tj0Vqyiplj9AfW5w82jJNfAAAB+XicbVBNS8NAEN3Ur1q/oh
69LBahgpREBD0WvXisYD+gDWWz2bRLN5uwOymE2H/ixYMiXv0n3vw3btsctPXBwOO9GWbm+YngGhzn2yqtrW9sbpW3Kzu7e/sH9uFRW8epoqxFYxGrrk80E1yyFnAQrJsoRiJfsI4/vpv5nQlTmsfyEbKEeREZSh5ySsBIA9tu1vrARcDybPqUXejzgV116s4ceJW4BamiAs2B/dUPYppGTAIVROue6yTg5UQBp4JNK/1Us4TQMRmynqGSREx7+fzyKT4zSoDDWJmSgOfq74mcRFpnkW86IwIjvezNxP+8XgrhjZdzmaTAJF0sClOBIcazGHDAFaMgMkMIVdzciumIKELBhFUxIbjLL6+S9mXdderuw1W1cVvEUUYn6BTVkIuuUQPdoyZqIYom6Bm9ojcrt16sd+tj0Vqyiplj9AfW5w82jJNfAAAB+XicbVBNS8NAEN3Ur1q/oh69LBahgpREBD0WvXisYD+gDWWz2bRLN5uwOymE2H/ixYMiXv0n3vw3btsctPXBwOO9GWbm+YngGhzn2yqtrW9sbpW3Kzu7e/sH9uFRW8epoqxFYxGrrk80E1yyFnAQrJsoRiJfsI4/vpv5nQlTmsfyEbKEeREZSh5ySsBIA9tu1vrARcDybPqUXejzgV116s4ceJW4BamiAs2B/dUPYppGTAIVROue6yTg5UQBp4JNK/1Us4TQMRmynqGSREx7+fzyKT4zSoDDWJmSgOfq74mcRFpnkW86IwIjvezNxP+8XgrhjZdzmaTAJF0sClOBIcazGHDAFaMgMkMIVdzciumIKELBhFUxIbjLL6+S9mXdderuw1W1cVvEUUYn6BTVkIuuUQPdoyZqIYom6Bm9ojcrt16sd+tj0Vqyiplj9AfW5w82jJNfAAAB+XicbVBNS8NAEN3Ur1q/oh69LBahgpREBD0WvXisYD+gDWWz2bRLN5uwOymE2H/ixYMiXv0n3vw3btsctPXBwOO9GWbm+YngGhzn2yqtrW9sbpW3Kzu7e/sH9uFRW8epoqxFYxGrrk80E1yyFnAQrJsoRiJfsI4/vpv5nQlTmsfyEbKEeREZSh5ySsBIA9tu1vrARcDybPqUXejzgV116s4ceJW4BamiAs2B/dUPYppGTAIVROue6yTg5UQBp4JNK/1Us4TQMRmynqGSREx7+fzyKT4zSoDDWJmSgOfq74mcRFpnkW86IwIjvezNxP+8XgrhjZdzmaTAJF0sClOBIcazGHDAFaMgMkMIVdzciumIKELBhFUxIbjLL6+S9mXdderuw1W1cVvEUUYn6BTVkIuuUQPdoyZqIYom6Bm9ojcrt16sd+tj0Vqyiplj9AfW5w82jJNf\fdiscriminator, we follow the advice in Gulrajani et al. [15] to choose the RMSProp optimizer. For\nboth datasets, the batch size is set to 128 for 15,000 iterations. \u03b1 and \u03b2 in Eq. (2) are respectively set\n0.05 and 0.005. The estimation of the noise transition matrix in F-correction and the initialization of\nthe adaption layer in S-adaptation follow the strategy in Patrini et al. [32]. 
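The log-space initialization of such an adaptation layer can be sketched as follows. This is a minimal NumPy sketch under the common log-parameterization; `init_adaptation_layer` and the toy transition matrix are illustrative stand-ins, not the paper's actual implementation:

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable row-wise softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def init_adaptation_layer(T_hat, eps=1e-6):
    # Log-parameterize the layer: a row-wise softmax of these
    # weights reproduces T_hat (up to eps on the zero entries).
    return np.log(T_hat + eps)

# Toy 3-class transition matrix: 80% kept, 20% flipped to the next class.
T_hat = np.array([[0.8, 0.2, 0.0],
                  [0.0, 0.8, 0.2],
                  [0.2, 0.0, 0.8]])
W = init_adaptation_layer(T_hat)
T_recovered = softmax(W, axis=1)
```

Because the weights are the element-wise log of the pre-estimated matrix, the row-wise softmax at the start of training reproduces that matrix, so gradient updates refine the estimate instead of starting from scratch.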
Note that we also tried the original initialization in Goldberger and Ben-Reuven [10], but it performs no better than the strategy in [32]. More details about the network architectures and the learning rates are summarized in Appendix B. Our implementation of MASKING is available at https://github.com/bhanML/Masking.

Empirical results. We train MASKING and the baselines (except CLEAN) on the noisy datasets and evaluate performance on the clean test sets. Figure 4 depicts the test accuracy on the benchmark datasets with three types of noise transition matrix. According to the comparison, MASKING consistently outperforms F-correction, S-adaptation and NOISY. In the tri-diagonal and block-diagonal cases, it achieves performance almost comparable to that of CLEAN. Moreover, Table 2 visualizes the noise transition matrices estimated by F-correction, S-adaptation and MASKING, together with the true noise transition matrix. As can be seen, with the guidance of the prior structure, MASKING infers the noise transition matrix better than the two baselines. Specifically, on the tri-diagonal structure both F-correction and S-adaptation fail severely: F-correction pre-estimates a poor matrix, and S-adaptation tunes it further away from the truth, while MASKING learns it well. To avoid the performance drop observed when training directly on the noisy dataset [19, 3], we configure dropout layers in the deep neural networks as in [3].

(a) Column-diagonal (CIFAR-10)

(b) Tri-diagonal (CIFAR-10)

(c) Block-diagonal (CIFAR-100)

Figure 4: Test accuracy vs. iterations on benchmark datasets with three types of noise structure.

Table 1: Test accuracy on Clothing1M with agnostic noise structure.

Models          Performance (%)
NOISY           68.9
F-correction    69.8
S-adaptation    70.3
MASKING         71.1
CLEAN           75.2

Furthermore, the test accuracy of all methods on the Clothing1M dataset is shown in Table 1.
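The structure prior that guides MASKING can be illustrated with a small sketch: a binary mask encodes which class transitions humans deem valid, and only the unmasked entries of the transition matrix remain free parameters. This is a minimal NumPy sketch (the names `tri_diagonal_mask` and `masked_transition` are illustrative; the paper's actual model is the structure-aware probabilistic model, not this direct parameterization):

```python
import numpy as np

def tri_diagonal_mask(k):
    # Structure prior: a class can only be flipped to itself or an
    # adjacent class (the tri-diagonal structure in the experiments).
    m = np.eye(k)
    i = np.arange(k - 1)
    m[i, i + 1] = 1.0
    m[i + 1, i] = 1.0
    return m

def masked_transition(logits, mask):
    # Zero out masked entries before row-normalizing, so only the
    # unmasked transition probabilities remain free parameters.
    e = np.exp(logits - logits.max(axis=1, keepdims=True)) * mask
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
mask = tri_diagonal_mask(5)
T = masked_transition(rng.normal(size=(5, 5)), mask)
```

Every masked entry is exactly zero after renormalization, so the estimation burden drops from all k² transition probabilities to only the unmasked ones.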
The comparison shows that, when the noise model of the training data is completely unknown to all methods, MASKING still outperforms the other methods. However, compared to the results in Figure 4, MASKING outperforms F-correction and S-adaptation only marginally here. We conjecture two reasons: first, the structure prior we use is not the ground-truth noise structure; second, the ground-truth noise structure of Clothing1M is too complex to estimate easily. To address this estimation issue in practice, we can use crowdsourcing to provide labels and invite an expert to identify the accurate noise structure, which should be much cheaper than letting the expert assign labels.

Table 2: The estimation of the noise transition matrix by F-correction (1st row), S-adaptation (2nd row) and MASKING (3rd row), and the truth (4th row), in the case of three types of noise transition structure: column-diagonal (1st column), tri-diagonal (2nd column), block-diagonal (3rd column).

6 Conclusions

This paper presents the Masking approach, which conveys human cognition of invalid class transitions and speculates the structure of the noise transition matrix. Given the structure information, we derive a structure-aware probabilistic model (MASKING), which incorporates a structure prior. Empirical results demonstrate that our approach improves the robustness of classifiers significantly. In future, we will explore how MASKING self-corrects an incorrect noise structure.
That is, when the noise structure is wrongly set at the initial stage, how can our model correct it by learning from the finite dataset?

Acknowledgments.

MS was supported by the International Research Center for Neurointelligence (WPI-IRCN) at The University of Tokyo Institutes for Advanced Study. IWT was supported by ARC FT130100746, DP180100106 and LP150100671. MZ acknowledges the support of Award IIS-1812699 from the U.S. National Science Foundation. YZ was supported by the High Technology Research and Development Program of China (2015AA015801), NSFC (61521062), and STCSM (18DZ2270700). BH would like to thank RIKEN-AIP for financial support. JY would like to thank SJTU-CMIC and UTS-CAI for financial support. We gratefully acknowledge the support of NVIDIA Corporation with the donation of the Titan Xp GPU used for this research.

References

[1] Y. Aït-Sahalia, J. Fan, and D. Xiu. High-frequency covariance estimates with noisy and asynchronous financial data. Journal of the American Statistical Association, 105(492):1504–1517, 2010.

[2] D. Angluin and P. Laird. Learning from noisy examples. Machine Learning, 2(4):343–370, 1988.

[3] D. Arpit, S. Jastrzębski, N. Ballas, D. Krueger, E. Bengio, M. Kanwal, T. Maharaj, A. Fischer, A. Courville, and Y. Bengio. A closer look at memorization in deep networks. In ICML, 2017.

[4] S. Azadi, J. Feng, S. Jegelka, and T. Darrell. Auxiliary image regularization for deep CNNs with noisy labels. In ICLR, 2016.

[5] M. Belkin, P. Niyogi, and V. Sindhwani. Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. Journal of Machine Learning Research, 7(Nov):2399–2434, 2006.

[6] Y. Cha and J. Cho. Social-network analysis using topic models. In SIGIR, 2012.

[7] J. Deng, J. Krause, and L. Fei-Fei. Fine-grained crowdsourcing for fine-grained recognition.
In CVPR, 2013.

[8] Y. Dgani, H. Greenspan, and J. Goldberger. Training a neural network based on unreliable human annotation of medical images. In ISBI, 2018.

[9] P. Domingos and M. Pazzani. On the optimality of the simple Bayesian classifier under zero-one loss. Machine Learning, 29(2-3):103–130, 1997.

[10] J. Goldberger and E. Ben-Reuven. Training deep neural-networks using a noise adaptation layer. In ICLR, 2017.

[11] I. Goodfellow, Y. Bengio, and A. Courville. Deep Learning. MIT Press, 2016.

[12] I. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, and Y. Bengio. Generative adversarial nets. In NIPS, 2014.

[13] N. Goodman. Sense and certainty. The Philosophical Review, pages 160–167, 1952.

[14] P. Grigolini, G. Aquino, M. Bologna, M. Luković, and B. West. A theory of 1/f noise in human cognition. Physica A: Statistical Mechanics and its Applications, 388(19):4192–4204, 2009.

[15] I. Gulrajani, F. Ahmed, M. Arjovsky, V. Dumoulin, and A. Courville. Improved training of Wasserstein GANs. In NIPS, 2017.

[16] B. Han, I. Tsang, and L. Chen. On the convergence of a family of robust losses for stochastic gradient descent. In ECML-PKDD, 2016.

[17] D. Hendrycks, M. Mazeika, D. Wilson, and K. Gimpel. Using trusted data to train deep networks on labels corrupted by severe noise. In NIPS, 2018.

[18] J. Huang, A. Gretton, K. Borgwardt, B. Schölkopf, and A. Smola. Correcting sample selection bias by unlabeled data. In NIPS, 2007.

[19] L. Jiang, Z. Zhou, T. Leung, L. Li, and L. Fei-Fei. MentorNet: Learning data-driven curriculum for very deep neural networks on corrupted labels. In ICML, 2018.

[20] S. Laine and T. Aila. Temporal ensembling for semi-supervised learning. In ICLR, 2017.

[21] W. Li, L. Wang, W. Li, E. Agustsson, and L. Van Gool. WebVision database: Visual learning and understanding from web data.
arXiv:1708.02862, 2017.

[22] Y. Li, J. Yang, Y. Song, L. Cao, J. Luo, and J. Li. Learning from noisy labels with distillation. In ICCV, 2017.

[23] T. Liu and D. Tao. Classification with noisy labels by importance reweighting. IEEE Transactions on Pattern Analysis and Machine Intelligence, 38(3):447–461, 2016.

[24] W. Liu, Y. Jiang, J. Luo, and S. Chang. Noise resistant graph ranking for improved web image search. In CVPR, 2011.

[25] X. Ma, Y. Wang, M. Houle, S. Zhou, S. Erfani, S. Xia, S. Wijewickrema, and J. Bailey. Dimensionality-driven learning with noisy labels. In ICML, 2018.

[26] E. Malach and S. Shalev-Shwartz. Decoupling "when to update" from "how to update". In NIPS, 2017.

[27] H. Masnadi-Shirazi and N. Vasconcelos. On the design of loss functions for classification: theory, robustness to outliers, and SavageBoost. In NIPS, 2009.

[28] A. Menon, B. Van Rooyen, C. Ong, and B. Williamson. Learning from corrupted binary labels via class-probability estimation. In ICML, 2015.

[29] R. Michalski, J. Carbonell, and T. Mitchell. Machine Learning: An Artificial Intelligence Approach. Springer Science & Business Media, 2013.

[30] T. Miyato, S. Maeda, M. Koyama, and S. Ishii. Virtual adversarial training: A regularization method for supervised and semi-supervised learning. In ICLR, 2016.

[31] N. Natarajan, I. Dhillon, P. Ravikumar, and A. Tewari. Learning with noisy labels. In NIPS, 2013.

[32] G. Patrini, A. Rozza, A. Menon, R. Nock, and L. Qu. Making deep neural networks robust to label noise: A loss correction approach. In CVPR, 2017.

[33] V. Raykar, S. Yu, L. Zhao, G. Valadez, C. Florin, L. Bogoni, and L. Moy. Learning from crowds. Journal of Machine Learning Research, 11(Apr):1297–1322, 2010.

[34] S. Reed, H. Lee, D. Anguelov, C. Szegedy, D. Erhan, and A. Rabinovich. Training deep neural networks on noisy labels with bootstrapping. In ICLR, 2015.

[35] M. Ren, W. Zeng, B. Yang, and R. Urtasun. Learning to reweight examples for robust deep learning. In ICML, 2018.

[36] V. Rockova and K. McAlinn. Dynamic variable selection with spike-and-slab process priors. arXiv:1708.00085, 2017.

[37] F. Rodrigues and F. Pereira. Deep learning from crowds. In AAAI, 2018.

[38] T. Sanderson and C. Scott. Class proportion estimation with application to multiclass anomaly rejection. In AISTATS, 2014.

[39] S. Sukhbaatar, J. Bruna, M. Paluri, L. Bourdev, and R. Fergus. Training convolutional networks with noisy labels. In ICLR Workshop, 2015.

[40] D. Tanaka, D. Ikami, T. Yamasaki, and K. Aizawa. Joint optimization framework for learning with noisy labels. In CVPR, 2018.

[41] A. Tarvainen and H. Valpola. Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results. In NIPS, 2017.

[42] D. Tran, R. Ranganath, and D. Blei. Hierarchical implicit models and likelihood-free variational inference. In NIPS, 2017.

[43] A. Veit, N. Alldrin, G. Chechik, I. Krasin, A. Gupta, and S. Belongie. Learning from noisy large-scale datasets with minimal supervision. In CVPR, 2017.

[44] Y. Wang, W. Liu, X. Ma, J. Bailey, H. Zha, L. Song, and S. Xia. Iterative learning with open-set noisy labels. In CVPR, 2018.

[45] P. Welinder, S. Branson, P. Perona, and S. Belongie. The multidimensional wisdom of crowds. In NIPS, 2010.

[46] T. Xiao, T. Xia, Y. Yang, C. Huang, and X. Wang. Learning from massive noisy labeled data for image classification. In CVPR, 2015.

[47] Y. Yan, R. Rosales, G. Fung, R. Subramanian, and J. Dy. Learning from multiple annotators with varying expertise. Machine Learning, 95(3):291–327, 2014.

[48] C. Zhang, S. Bengio, M. Hardt, B. Recht, and O. Vinyals. Understanding deep learning requires rethinking generalization. In ICLR, 2017.

[49] Z. Zhang and M. Sabuncu. Generalized cross entropy loss for training deep neural networks with noisy labels. In NIPS, 2018.