{"title": "Self-Supervised Generalisation with Meta Auxiliary Learning", "book": "Advances in Neural Information Processing Systems", "page_first": 1679, "page_last": 1689, "abstract": "Learning with auxiliary tasks can improve the ability of a primary task to generalise. However, this comes at the cost of manually labelling auxiliary data. We propose a new method which automatically learns appropriate labels for an auxiliary task, such that any supervised learning task can be improved without requiring access to any further data. The approach is to train two neural networks: a label-generation network to predict the auxiliary labels, and a multi-task network to train the primary task alongside the auxiliary task. The loss for the label-generation network incorporates the loss of the multi-task network, and so this interaction between the two networks can be seen as a form of meta learning with a double gradient. We show that our proposed method, Meta AuXiliary Learning (MAXL), outperforms single-task learning on 7 image datasets, without requiring any additional data. We also show that MAXL outperforms several other baselines for generating auxiliary labels, and is even competitive when compared with human-defined auxiliary labels. The self-supervised nature of our method leads to a promising new direction towards automated generalisation. Source code can be found at \\url{https://github.com/lorenmt/maxl}.", "full_text": "Self-Supervised Generalisation with\n\nMeta Auxiliary Learning\n\nShikun Liu\n\nAndrew J. Davison\n\nEdward Johns\n\nDepartment of Computing, Imperial College London\n\n{shikun.liu17, a.davison, e.johns}@imperial.ac.uk\n\nAbstract\n\nLearning with auxiliary tasks can improve the ability of a primary task to generalise.\nHowever, this comes at the cost of manually labelling auxiliary data. 
We propose a new method which automatically learns appropriate labels for an auxiliary task, such that any supervised learning task can be improved without requiring access to any further data. The approach is to train two neural networks: a label-generation network to predict the auxiliary labels, and a multi-task network to train the primary task alongside the auxiliary task. The loss for the label-generation network incorporates the loss of the multi-task network, and so this interaction between the two networks can be seen as a form of meta learning with a double gradient. We show that our proposed method, Meta AuXiliary Learning (MAXL), outperforms single-task learning on 7 image datasets, without requiring any additional data. We also show that MAXL outperforms several other baselines for generating auxiliary labels, and is even competitive when compared with human-defined auxiliary labels. The self-supervised nature of our method leads to a promising new direction towards automated generalisation. Source code can be found at https://github.com/lorenmt/maxl.

1 Introduction

Auxiliary learning is a method to improve the ability of a primary task to generalise to unseen data, by training on additional auxiliary tasks alongside this primary task. The sharing of features across tasks results in additional relevant features being available, which otherwise would not have been learned from training only on the primary task. The broader support of these features, across new interpretations of input data, then allows for better generalisation of the primary task. Auxiliary learning is similar to multi-task learning [5], except that only the performance of the primary task is of importance, and the auxiliary tasks are included purely to assist the primary task.

We now rethink this generalisation by considering that not all auxiliary tasks are created equal.
In supervised auxiliary learning [21, 33], auxiliary tasks can be manually chosen to complement the primary task. However, this requires both domain knowledge to choose the auxiliary tasks, and labelled data to train the auxiliary tasks. Unsupervised auxiliary learning [11, 36, 35, 16, 1] removes the need for labelled data, but at the expense of a limited set of auxiliary tasks which may not be beneficial for the primary task. By combining the merits of both supervised and unsupervised auxiliary learning, the ideal framework would be one with the flexibility to automatically determine the optimal auxiliary tasks, but without the need to manually label these auxiliary tasks.

In this paper, we propose to achieve such a framework with a simple and general meta-learning algorithm, which we call Meta AuXiliary Learning (MAXL). We first observe that in supervised learning, defining a task can equate to defining the labels for that task. Therefore, for a given primary task, an optimal auxiliary task is one which has optimal labels. The goal of MAXL is then to automatically discover these auxiliary labels using only the labels for the primary task.

33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada.

The approach is to train two neural networks. First, a multi-task network, which trains the primary task and the auxiliary task, as in standard auxiliary learning. Second, a label-generation network, which learns the labels for the auxiliary task. The key idea behind MAXL is to then use the performance of the primary task, when trained alongside the auxiliary task in one iteration, to improve the auxiliary labels for the next iteration. This is achieved by defining the loss for the label-generation network as a function of the multi-task network's performance on primary task training data.
In this way, the two networks are tightly coupled and can be trained end-to-end.

In our experiments on image classification, we show three key results. First, MAXL outperforms single-task learning across seven image datasets, even though both methods use the same amount of labelled data. Second, MAXL outperforms a number of baseline methods for creating auxiliary labels. Third, when manually-defined auxiliary labels exist, such as those from an image hierarchy, MAXL is at least as competitive, despite not actually using the manually-defined auxiliary labels. This last result shows that MAXL is able to remove the need for manual labelling of auxiliary tasks, which brings the advantages of auxiliary learning to new datasets previously not compatible with auxiliary learning, due to the lack of auxiliary labels.

Figure 1: Illustration of MAXL framework. The primary task is trained with ground-truth labels, whereas the auxiliary task is trained with learned labels.

2 Related Work

This work brings together ideas from a number of related areas of machine learning.

Multi-task & Transfer Learning
The aim of multi-task learning (MTL) is to achieve shared representations by simultaneously training a set of related learning tasks. In this case, the learned knowledge used to share across domains is encoded into the feature representations to improve performance of each individual task, since knowledge distilled from related tasks is interdependent. The success of deep neural networks has led to some recent methods advancing the multi-task architecture design, such as applying a linear combination of task-specific features [25, 8, 17]. [23] applied soft-attention modules as feature selectors, allowing learning of both task-shared and task-specific features in an end-to-end manner. Transfer learning is another common approach to improve generalisation, by incorporating knowledge learned from one or more related domains.
Pre-training a model with a large-scale dataset such as ImageNet [7] has become a standard practice in many vision-based applications.

Auxiliary Learning
Whilst in multi-task learning the goal is high test accuracy across all tasks, auxiliary learning differs in that high test accuracy is only required for a single primary task, and the role of the auxiliary tasks is to assist in generalisation of this primary task. Applying related learning tasks is one straightforward approach to assist primary tasks. [33] applied auxiliary supervision with phoneme recognition at intermediate low-level representations to improve the performance of conversational speech recognition. [21] chose auxiliary tasks which can be obtained with low effort, such as global descriptions of a scene, to boost the performance for single scene depth estimation and semantic segmentation. By carefully choosing a pair of learning tasks, we may also perform auxiliary learning without ground truth labels, in an unsupervised manner. [16] introduced a method for improving agent learning in Atari games, by building unsupervised auxiliary tasks to predict the onset of immediate rewards from a short historical context. [11, 36] proposed image synthesis networks to perform unsupervised monocular depth estimation by predicting the relative pose of multiple cameras. [9] proposed to use cosine similarity as an adaptive task weighting to determine when a defined auxiliary task is useful. Differing from these works, which require prior knowledge to manually define suitable auxiliary tasks, our proposed method requires no additional task knowledge, since it generates useful auxiliary knowledge in a purely unsupervised fashion. The most similar work to ours is [35], in which meta learning was used in auxiliary data selection.
However, this still requires manually-labelled data from which these selections are made, whilst our method is able to generate auxiliary data from scratch.

Meta Learning
Meta learning (or learning to learn) aims to induce the learning algorithm itself. Early works in meta learning explored automatically learning update rules for neural models [4, 3, 29]. Recent approaches have focussed on learning optimisers for deep networks based on LSTMs [26] or synthetic gradients [2, 15]. Meta learning has also been studied for finding optimal hyper-parameters [20] and a good initialisation for few-shot learning [10]. [28] also investigated few-shot learning via an external memory module. [34, 31] realised few-shot learning in the instance space via a differentiable nearest-neighbour approach. Related to meta learning, our framework is designed to learn to generate useful auxiliary labels, which themselves are used in another learning procedure.

3 Meta Auxiliary Learning

We now introduce our method for automatically generating optimal labels for an auxiliary task, which we call Meta AuXiliary Learning (MAXL). In this paper, we only consider a single auxiliary task, although our method is general and could be modified to include several auxiliary tasks. We only focus on classification tasks for both the primary and auxiliary tasks, but the overall framework could also be extended to regression.
As such, the auxiliary task is defined as a sub-class labelling problem, where each primary class is associated with a number of auxiliary classes, in a two-level hierarchy. For example, if manually-defined labels were used, a primary class could be “Dog”, and one of the auxiliary classes could be “Labrador”.

3.1 Problem Setup

The goal of MAXL is to generate labels for the auxiliary task which, when trained alongside a primary task, improve the performance of the primary task. To accomplish this, we train two networks: a multi-task network, which trains on the primary and auxiliary task in a standard multi-task learning setting, and a label-generation network, which generates the labels for the auxiliary task.

We denote the multi-task network as a function f_θ1(x) with parameters θ1 which takes an input x, and the label-generation network as a function g_θ2(x) with parameters θ2 which takes the same input x. Parameters θ1 are updated by losses of both the primary and auxiliary tasks, as is standard in auxiliary learning. However, θ2 is updated only by the performance of the primary task.

In the multi-task network, we apply a hard parameter sharing approach [27] in which we predict both the primary and auxiliary classes using the shared set of features θ1. At the final feature layer, f_θ1(x), we then further apply task-specific layers to output the corresponding prediction for each task, using a SoftMax function. We denote the primary task predictions by f^pri_θ1(x), and the auxiliary task predictions by f^aux_θ1(x).
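As a concrete illustration, such a hard-parameter-sharing multi-task network can be sketched in PyTorch as follows (a minimal sketch: the layer sizes, class counts, and `MultiTaskNet` name are illustrative assumptions, not the architecture used in the paper):

```python
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    """Shared backbone (hard parameter sharing) with two task-specific
    SoftMax heads: one for the primary task and one for the auxiliary task."""
    def __init__(self, in_dim=32, feat_dim=64, n_pri=2, n_aux=4):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.pri_head = nn.Linear(feat_dim, n_pri)  # primary-task layer
        self.aux_head = nn.Linear(feat_dim, n_aux)  # auxiliary-task layer

    def forward(self, x):
        feat = self.backbone(x)  # shared features
        pri = torch.softmax(self.pri_head(feat), dim=1)
        aux = torch.softmax(self.aux_head(feat), dim=1)
        return pri, aux

net = MultiTaskNet()
pri, aux = net(torch.randn(8, 32))
print(pri.shape, aux.shape)  # torch.Size([8, 2]) torch.Size([8, 4])
```

Both heads read from the same features, so gradients from either task's loss update the shared backbone.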
We denote the ground-truth primary task labels by y^pri, and the generated auxiliary task labels by y^aux.

We found during experiments that training benefited from assigning each primary class its own unique set of possible auxiliary classes, rather than sharing all auxiliary classes across all primary classes. In the label-generation network, we therefore define a hierarchical structure ψ which determines the number of auxiliary classes for each primary class. At the output layer of the label-generation network, we then apply a masked SoftMax function to ensure that each output node represents an auxiliary class corresponding to only one primary class, as described further in Section 3.3. Given input data x, the label-generation network then takes in the hierarchy ψ together with the ground-truth primary task label y^pri, and applies Mask SoftMax to predict the auxiliary labels, denoted by y^aux = g^gen_θ2(x, y^pri, ψ). A visualisation of the overall MAXL framework is shown in Figure 2. Note that we allow soft assignment for the generated auxiliary labels, rather than one-hot encoding, which we found during experiments enables greater flexibility to obtain optimal auxiliary labels.

3.2 Model Objectives

The multi-task network is trained alongside the label-generation network, with two stages per epoch. In the first stage, the multi-task network is trained using primary task ground-truth labels, and the auxiliary labels from the label-generation network. In the second stage, the label-generation network is updated by computing its gradients with respect to the multi-task network's prediction accuracy on the primary task.
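The masked SoftMax from Section 3.1 can be sketched as follows (a minimal NumPy sketch under the assumption that the auxiliary classes of each primary class occupy one contiguous block of the output layer; `mask_softmax` is a hypothetical helper, not the paper's released implementation):

```python
import numpy as np

def mask_softmax(logits, y_pri, psi):
    """SoftMax restricted to the block of auxiliary classes belonging to
    each example's primary class, given the hierarchy psi."""
    # starts[c] .. starts[c+1] is the block of auxiliary classes for primary class c
    starts = np.concatenate(([0], np.cumsum(psi)))
    mask = np.zeros_like(logits)
    for i, c in enumerate(y_pri):
        mask[i, starts[c]:starts[c + 1]] = 1.0
    exp = np.exp(logits - logits.max(axis=1, keepdims=True)) * mask
    return exp / exp.sum(axis=1, keepdims=True)

psi = [2, 2]               # 2 auxiliary classes per primary class, as in Figure 2(b)
logits = np.zeros((2, 4))  # uniform logits for 2 examples
y_pri = np.array([0, 1])   # example 0 has primary class 0, example 1 has class 1
print(mask_softmax(logits, y_pri, psi))
# [[0.5 0.5 0.  0. ]
#  [0.  0.  0.5 0.5]]
```

Each row is a valid probability distribution, but only over the auxiliary classes permitted by that example's primary label.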
We train both networks in an iterative manner until convergence.

In the first stage of each epoch, given target auxiliary labels as determined by the label-generation network, the multi-task network is trained to predict these labels for the auxiliary task, alongside the ground-truth labels for the primary task.

Figure 2: (a) Illustration of the two networks which make up MAXL. Dashed white boxes represent data generated by neural networks, solid white boxes represent given data, and coloured boxes represent functions. The double arrow represents equivalence. (b) Illustration of vanilla SoftMax and Mask SoftMax with 2 primary classes. Vanilla SoftMax outputs over all 4 auxiliary classes, whereas Mask SoftMax outputs over a hierarchical structure ψ = [2, 2].

For both the primary and auxiliary tasks, we apply the focal loss [22] with a focusing parameter γ = 2, defined as:

L(ŷ, y) = −y(1 − ŷ)^γ log(ŷ),    (1)

where ŷ is the predicted label and y is the target label.
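In code, the focal loss of Equation 1 can be sketched as follows (a minimal NumPy sketch assuming one-hot targets and SoftMax probabilities; the small `eps` term is an added numerical-stability assumption, not part of the paper's definition):

```python
import numpy as np

def focal_loss(y_hat, y, gamma=2.0):
    """Focal loss L(y_hat, y) = -y * (1 - y_hat)^gamma * log(y_hat),
    summed over classes; y is one-hot, y_hat are predicted probabilities."""
    eps = 1e-8  # avoid log(0)
    return -(y * (1.0 - y_hat) ** gamma * np.log(y_hat + eps)).sum(axis=-1)

y = np.array([[1.0, 0.0]])          # ground-truth class 0
confident = np.array([[0.9, 0.1]])  # well-classified example
uncertain = np.array([[0.5, 0.5]])  # poorly-classified example
# The (1 - y_hat)^gamma factor down-weights the already-confident prediction:
print(focal_loss(confident, y), focal_loss(uncertain, y))
```

With γ = 2, the confident example contributes roughly two orders of magnitude less loss than the uncertain one, which is the focusing behaviour described above.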
The focal loss helps to focus on the incorrectly predicted labels, which we found improved performance during our experimental evaluation compared with the regular cross-entropy log loss.

To update parameters θ1 of the multi-task network, we define the multi-task objective as follows:

argmin_θ1 [ L(f^pri_θ1(x_(i)), y^pri_(i)) + L(f^aux_θ1(x_(i)), y^aux_(i)) ],    (2)

where (i) represents the ith batch from the training data, and y^aux_(i) = g^gen_θ2(x_(i), y^pri_(i), ψ) is generated by the label-generation network.

In the second stage of each epoch, the label-generation network is then updated by encouraging auxiliary labels to be chosen such that, if the multi-task network were to be trained using these auxiliary labels, the performance of the primary task would be maximised on this same training data. Leveraging the performance of the multi-task network to train the label-generation network can be considered as a form of meta learning. Therefore, to update parameters θ2 of the label-generation network, we define the meta objective as follows:

argmin_θ2 L(f^pri_θ1+(x_(i)), y^pri_(i)).    (3)

Here, θ1+ represents the weights of the multi-task network after one gradient update using the multi-task loss defined in Equation 2:

θ1+ = θ1 − α ∇_θ1 [ L(f^pri_θ1(x_(i)), y^pri_(i)) + L(f^aux_θ1(x_(i)), y^aux_(i)) ],    (4)

where α is the learning rate.

The trick in this meta objective is that we perform a derivative over a derivative (a Hessian matrix) to update θ2, by using a retained computational graph of θ1+ in order to compute derivatives with respect to θ2.
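The double-gradient update of Equations 2–4 can be sketched in PyTorch as follows (a minimal sketch: single weight matrices stand in for the two networks, and squared-error losses stand in for the focal loss, purely to keep the example short — the point is the retained graph that lets gradients flow from the meta objective back to θ2):

```python
import torch

torch.manual_seed(0)
x = torch.randn(8, 4)        # a batch of inputs
y_pri = torch.randn(8, 3)    # primary targets (toy stand-in)

theta1 = torch.randn(4, 3, requires_grad=True)  # multi-task network weights
theta2 = torch.randn(4, 3, requires_grad=True)  # label-generation network weights
alpha = 0.1                                     # learning rate

# Stage 1 loss (Equation 2): primary loss + auxiliary loss, where the
# auxiliary labels y_aux come from the label-generation network.
y_aux = x @ theta2
loss_mt = ((x @ theta1 - y_pri) ** 2).mean() + ((x @ theta1 - y_aux) ** 2).mean()

# One gradient step on theta1 (Equation 4), with create_graph=True so
# theta1_plus remains a differentiable function of theta2.
(g1,) = torch.autograd.grad(loss_mt, theta1, create_graph=True)
theta1_plus = theta1 - alpha * g1

# Meta objective (Equation 3): primary loss at the updated weights,
# differentiated with respect to theta2 (a derivative over a derivative).
meta_loss = ((x @ theta1_plus - y_pri) ** 2).mean()
(g2,) = torch.autograd.grad(meta_loss, theta2)
print(g2.shape)  # torch.Size([4, 3])
```

Without `create_graph=True`, `theta1_plus` would be a constant with respect to θ2 and the meta gradient `g2` would be zero; retaining the graph is what makes the second derivative possible.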
This second derivative trick was also proposed in several other meta-learning frameworks such as [10] and [35].

However, we found that the generated auxiliary labels can easily collapse, such that the label-generation network always generates the same auxiliary label. This leaves parameters θ2 in a local

[Figure 2 diagram labels only: label-generation network g^gen_θ2(x, y^pri, ψ), multi-task network f_θ1(x), primary and auxiliary SoftMax predictions and targets; remainder of the extraction is truncated.]
Ftkmu8Qnx6RFzskFaRNGnskLeSVv1rv1ac/ZC+NS25r0bJIp2M4XalO3EQ==AAACVnicbVDLSsQwFE3raxxfVZduioOgoEMrgi4H3bhUcFSYqSVN73SCaVqSW3Uo/Qu/xq1+hP6MmHkIjnogcHLOvbm5J8oF1+h5H5Y9Mzs3v1BbrC8tr6yuOesb1zorFIM2y0SmbiOqQXAJbeQo4DZXQNNIwE10fzb0bx5AaZ7JKxzkEKQ0kbzHGUUjhU4zuSu7CE9YJiCrKjSXPiAND6vdp/3B3djKFa/2u7nme6HT8JreCO5f4k9Ig0xwEa5bTjfOWJGCRCao1h3fyzEoqULOBFT1bqEhp+yeJtAxVNIUdFCOFqvcHaPEbi9T5kh0R+rPjpKmWg/SyFSmFPv6tzcU//M6BfZOgpLLvECQbDyoVwgXM3eYkhtzBQzFwBDKFDd/dVmfKsrQZDn9kno8SIyTZjIOyuGkGDRP5NRmJUTfNVXdZOj/TuwvuT5s+l7TvzxqtE4nadbIFtkmu8Qnx6RFzskFaRNGnskLeSVv1rv1ac/ZC+NS25r0bJIp2M4XalO3EQ==y(cid:66)(cid:86)(cid:89)AAACMnicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuiy6cVnBPqCNZTK5bYdOJmHmRltCf8OtfoQ/oztx60c4aSvY1gMDh3Pua44fC67Rcd6tldW19Y3N3FZ+e2d3b79QPKjrKFEMaiwSkWr6VIPgEmrIUUAzVkBDX0DDH9xkfuMRlOaRvMdRDF5Ie5J3OaNopPbooY0wxJQmw3GnUHLKzgT2MnFnpERmqHaKVqEdRCwJQSITVOuW68TopVQhZwLG+XaiIaZsQHvQMlTSELSXTo4e2ydGCexupMyTaE/Uvx0pDbUehb6pDCn29aKXif95rQS7V17KZZwgSDZd1E2EjZGdJWAHXAFDMTKEMsXNrTbrU0UZmpzmJ6mns55xwkgGXpptCkDznpz7WQr+b804bzJ0FxNbJvXzsuuU3buLUuV6lmaOHJFjckpcckkq5JZUSY0wEpNn8kJerTfrw/q0vqalK9as55DMwfr+AWsbq1k=AAACMnicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuiy6cVnBPqCNZTK5bYdOJmHmRltCf8OtfoQ/oztx60c4aSvY1gMDh3Pua44fC67Rcd6tldW19Y3N3FZ+e2d3b79QPKjrKFEMaiwSkWr6VIPgEmrIUUAzVkBDX0DDH9xkfuMRlOaRvMdRDF5Ie5J3OaNopPbooY0wxJQmw3GnUHLKzgT2MnFnpERmqHaKVqEdRCwJQSITVOuW68TopVQhZwLG+XaiIaZsQHvQMlTSELSXTo4e2ydGCexupMyTaE/Uvx0pDbUehb6pDCn29aKXif95rQS7V17KZZwgSDZd1E2EjZGdJWAHXAFDMTKEMsXNrTbrU0UZmpzmJ6mns55xwkgGXpptCkDznpz7WQr+b804bzJ0FxNbJvXzsuuU3buLUuV6lmaOHJFjckpcckkq5JZUSY0wEpNn8kJerTfrw/q0vqalK9as55DMwfr+AWsbq1k=AAACMnicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuiy6cVnBPqCNZTK5bYdOJmHmRltCf8OtfoQ/oztx60c4aSvY1gMDh3Pua44fC67Rcd6tldW19Y3N3FZ+e2d3b79QPKjrKFEMaiwSkWr6VIPgEmrIUUAzVkBDX0DDH9xkfuMRlOaRvMdRDF5Ie5J3OaNopPbooY0wxJQmw3GnUHLKzgT2MnFnpERmqHaKVqEdRCwJQSITVOuW68TopVQhZwLG+XaiIaZsQHvQMlTSELSXTo4e2ydGCexupMyTaE/Uvx0pDbUehb6pDCn29aKXif95rQS7V17KZZwgSDZd1E2EjZGdJWAHXAFDMTKEMsXNrTbrU0UZmpzmJ6mns55xwkgGXpptCkDznpz7WQr+b804bzJ0FxNbJvXzsuuU3bu
LUuV6lmaOHJFjckpcckkq5JZUSY0wEpNn8kJerTfrw/q0vqalK9as55DMwfr+AWsbq1k=AAACMnicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuiy6cVnBPqCNZTK5bYdOJmHmRltCf8OtfoQ/oztx60c4aSvY1gMDh3Pua44fC67Rcd6tldW19Y3N3FZ+e2d3b79QPKjrKFEMaiwSkWr6VIPgEmrIUUAzVkBDX0DDH9xkfuMRlOaRvMdRDF5Ie5J3OaNopPbooY0wxJQmw3GnUHLKzgT2MnFnpERmqHaKVqEdRCwJQSITVOuW68TopVQhZwLG+XaiIaZsQHvQMlTSELSXTo4e2ydGCexupMyTaE/Uvx0pDbUehb6pDCn29aKXif95rQS7V17KZZwgSDZd1E2EjZGdJWAHXAFDMTKEMsXNrTbrU0UZmpzmJ6mns55xwkgGXpptCkDznpz7WQr+b804bzJ0FxNbJvXzsuuU3buLUuV6lmaOHJFjckpcckkq5JZUSY0wEpNn8kJerTfrw/q0vqalK9as55DMwfr+AWsbq1k=y(cid:81)(cid:83)(cid:74)=0AAACN3icbVDLSgMxFM34rPXV6tJNsAhuLDMi6EYQ3bisYB/QjiWTuW2DmcyQ3FHL0C9xqx/hp7hyJ279A9OHYFsPBA7n3FdOkEhh0HXfnYXFpeWV1dxafn1jc2u7UNypmTjVHKo8lrFuBMyAFAqqKFBCI9HAokBCPbi/Gvr1B9BGxOoW+wn4Eesq0RGcoZXahe3+XQvhCbNEi8E5dduFklt2R6DzxJuQEpmg0i46hVYY8zQChVwyY5qem6CfMY2CSxjkW6mBhPF71oWmpYpFYPxsdPmAHlglpJ1Y26eQjtS/HRmLjOlHga2MGPbMrDcU//OaKXbO/EyoJEVQfLyok0qKMR3GQEOhgaPsW8K4FvZWyntMM442rOlJ+vGoa50oVqGfDTeFYERXTf0sg+C3ZpC3GXqzic2T2nHZc8vezUnp4nKSZo7skX1ySDxySi7INamQKuEkJc/khbw6b86H8+l8jUsXnEnPLpmC8/0DXcGsMg==AAACN3icbVDLSgMxFM34rPXV6tJNsAhuLDMi6EYQ3bisYB/QjiWTuW2DmcyQ3FHL0C9xqx/hp7hyJ279A9OHYFsPBA7n3FdOkEhh0HXfnYXFpeWV1dxafn1jc2u7UNypmTjVHKo8lrFuBMyAFAqqKFBCI9HAokBCPbi/Gvr1B9BGxOoW+wn4Eesq0RGcoZXahe3+XQvhCbNEi8E5dduFklt2R6DzxJuQEpmg0i46hVYY8zQChVwyY5qem6CfMY2CSxjkW6mBhPF71oWmpYpFYPxsdPmAHlglpJ1Y26eQjtS/HRmLjOlHga2MGPbMrDcU//OaKXbO/EyoJEVQfLyok0qKMR3GQEOhgaPsW8K4FvZWyntMM442rOlJ+vGoa50oVqGfDTeFYERXTf0sg+C3ZpC3GXqzic2T2nHZc8vezUnp4nKSZo7skX1ySDxySi7INamQKuEkJc/khbw6b86H8+l8jUsXnEnPLpmC8/0DXcGsMg==AAACN3icbVDLSgMxFM34rPXV6tJNsAhuLDMi6EYQ3bisYB/QjiWTuW2DmcyQ3FHL0C9xqx/hp7hyJ279A9OHYFsPBA7n3FdOkEhh0HXfnYXFpeWV1dxafn1jc2u7UNypmTjVHKo8lrFuBMyAFAqqKFBCI9HAokBCPbi/Gvr1B9BGxOoW+wn4Eesq0RGcoZXahe3+XQvhCbNEi8E5dduFklt2R6DzxJuQEpmg0i46hVYY8zQChVwyY5qem6CfMY2CSxjkW6mBhPF71oWmpYpFYPxsdPmAHlglpJ1Y26eQjtS/HRmLjOlHga2MGPbMrDcU//OaKXbO/EyoJEVQfLyok0qKMR3GQEOhgaPsW8K4FvZWyntMM442rOlJ+vGoa50oVqGfDTeFYERXTf0sg+C3ZpC3GXqzic2T2nHZc8ve
zUnp4nKSZo7skX1ySDxySi7INamQKuEkJc/khbw6b86H8+l8jUsXnEnPLpmC8/0DXcGsMg==AAACN3icbVDLSgMxFM34rPXV6tJNsAhuLDMi6EYQ3bisYB/QjiWTuW2DmcyQ3FHL0C9xqx/hp7hyJ279A9OHYFsPBA7n3FdOkEhh0HXfnYXFpeWV1dxafn1jc2u7UNypmTjVHKo8lrFuBMyAFAqqKFBCI9HAokBCPbi/Gvr1B9BGxOoW+wn4Eesq0RGcoZXahe3+XQvhCbNEi8E5dduFklt2R6DzxJuQEpmg0i46hVYY8zQChVwyY5qem6CfMY2CSxjkW6mBhPF71oWmpYpFYPxsdPmAHlglpJ1Y26eQjtS/HRmLjOlHga2MGPbMrDcU//OaKXbO/EyoJEVQfLyok0qKMR3GQEOhgaPsW8K4FvZWyntMM442rOlJ+vGoa50oVqGfDTeFYERXTf0sg+C3ZpC3GXqzic2T2nHZc8vezUnp4nKSZo7skX1ySDxySi7INamQKuEkJc/khbw6b86H8+l8jUsXnEnPLpmC8/0DXcGsMg==y(cid:81)(cid:83)(cid:74)=1AAACN3icbVDLSsNAFJ3UV62Ptrp0M1gEN5ZEBN0IRTcuFawKbSyTyW0dOpmEmRu1hHyJW/0IP8WVO3HrHzitEax6YOBwzn3NCRIpDLrui1OamZ2bXygvVpaWV1artfrahYlTzaHNYxnrq4AZkEJBGwVKuEo0sCiQcBkMj8f+5S1oI2J1jqME/IgNlOgLztBKvVp1dN1FuMcs0SI/pF6v1nCb7gT0L/EK0iAFTnt1p9YNY55GoJBLZkzHcxP0M6ZRcAl5pZsaSBgfsgF0LFUsAuNnk8tzumWVkPZjbZ9COlF/dmQsMmYUBbYyYnhjfntj8T+vk2L/wM+ESlIExb8W9VNJMabjGGgoNHCUI0sY18LeSvkN04yjDWt6kr7bGVgnilXoZ+NNIRgxUFM/yyD4rskrNkPvd2J/ycVu03Ob3tleo3VUpFkmG2STbBOP7JMWOSGnpE04SckDeSRPzrPz6rw571+lJafoWSdTcD4+AV+ErDM=AAACN3icbVDLSsNAFJ3UV62Ptrp0M1gEN5ZEBN0IRTcuFawKbSyTyW0dOpmEmRu1hHyJW/0IP8WVO3HrHzitEax6YOBwzn3NCRIpDLrui1OamZ2bXygvVpaWV1artfrahYlTzaHNYxnrq4AZkEJBGwVKuEo0sCiQcBkMj8f+5S1oI2J1jqME/IgNlOgLztBKvVp1dN1FuMcs0SI/pF6v1nCb7gT0L/EK0iAFTnt1p9YNY55GoJBLZkzHcxP0M6ZRcAl5pZsaSBgfsgF0LFUsAuNnk8tzumWVkPZjbZ9COlF/dmQsMmYUBbYyYnhjfntj8T+vk2L/wM+ESlIExb8W9VNJMabjGGgoNHCUI0sY18LeSvkN04yjDWt6kr7bGVgnilXoZ+NNIRgxUFM/yyD4rskrNkPvd2J/ycVu03Ob3tleo3VUpFkmG2STbBOP7JMWOSGnpE04SckDeSRPzrPz6rw571+lJafoWSdTcD4+AV+ErDM=AAACN3icbVDLSsNAFJ3UV62Ptrp0M1gEN5ZEBN0IRTcuFawKbSyTyW0dOpmEmRu1hHyJW/0IP8WVO3HrHzitEax6YOBwzn3NCRIpDLrui1OamZ2bXygvVpaWV1artfrahYlTzaHNYxnrq4AZkEJBGwVKuEo0sCiQcBkMj8f+5S1oI2J1jqME/IgNlOgLztBKvVp1dN1FuMcs0SI/pF6v1nCb7gT0L/EK0iAFTnt1p9YNY55GoJBLZkzHcxP0M6ZRcAl5pZsaSBgfsgF0LFUsAuNnk8tzumWVkPZjbZ9COlF/dmQsMmYUBbYyYnhjfntj8T+vk2L/wM+ESlIExb8W9VNJMabjGGgoNHCUI0sY18LeSvkN04yjDWt6kr7bGVgnilXoZ+NNIRgxUFM/yyD4rskrNkPvd
2J/ycVu03Ob3tleo3VUpFkmG2STbBOP7JMWOSGnpE04SckDeSRPzrPz6rw571+lJafoWSdTcD4+AV+ErDM=AAACN3icbVDLSsNAFJ3UV62Ptrp0M1gEN5ZEBN0IRTcuFawKbSyTyW0dOpmEmRu1hHyJW/0IP8WVO3HrHzitEax6YOBwzn3NCRIpDLrui1OamZ2bXygvVpaWV1artfrahYlTzaHNYxnrq4AZkEJBGwVKuEo0sCiQcBkMj8f+5S1oI2J1jqME/IgNlOgLztBKvVp1dN1FuMcs0SI/pF6v1nCb7gT0L/EK0iAFTnt1p9YNY55GoJBLZkzHcxP0M6ZRcAl5pZsaSBgfsgF0LFUsAuNnk8tzumWVkPZjbZ9COlF/dmQsMmYUBbYyYnhjfntj8T+vk2L/wM+ESlIExb8W9VNJMabjGGgoNHCUI0sY18LeSvkN04yjDWt6kr7bGVgnilXoZ+NNIRgxUFM/yyD4rskrNkPvd2J/ycVu03Ob3tleo3VUpFkmG2STbBOP7JMWOSGnpE04SckDeSRPzrPz6rw571+lJafoWSdTcD4+AV+ErDM=y(cid:66)(cid:86)(cid:89)AAACMnicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuiy6cVnBPqCNZTK5bYdOJmHmRltCf8OtfoQ/oztx60c4aSvY1gMDh3Pua44fC67Rcd6tldW19Y3N3FZ+e2d3b79QPKjrKFEMaiwSkWr6VIPgEmrIUUAzVkBDX0DDH9xkfuMRlOaRvMdRDF5Ie5J3OaNopPbooY0wxJQmw3GnUHLKzgT2MnFnpERmqHaKVqEdRCwJQSITVOuW68TopVQhZwLG+XaiIaZsQHvQMlTSELSXTo4e2ydGCexupMyTaE/Uvx0pDbUehb6pDCn29aKXif95rQS7V17KZZwgSDZd1E2EjZGdJWAHXAFDMTKEMsXNrTbrU0UZmpzmJ6mns55xwkgGXpptCkDznpz7WQr+b804bzJ0FxNbJvXzsuuU3buLUuV6lmaOHJFjckpcckkq5JZUSY0wEpNn8kJerTfrw/q0vqalK9as55DMwfr+AWsbq1k=AAACMnicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuiy6cVnBPqCNZTK5bYdOJmHmRltCf8OtfoQ/oztx60c4aSvY1gMDh3Pua44fC67Rcd6tldW19Y3N3FZ+e2d3b79QPKjrKFEMaiwSkWr6VIPgEmrIUUAzVkBDX0DDH9xkfuMRlOaRvMdRDF5Ie5J3OaNopPbooY0wxJQmw3GnUHLKzgT2MnFnpERmqHaKVqEdRCwJQSITVOuW68TopVQhZwLG+XaiIaZsQHvQMlTSELSXTo4e2ydGCexupMyTaE/Uvx0pDbUehb6pDCn29aKXif95rQS7V17KZZwgSDZd1E2EjZGdJWAHXAFDMTKEMsXNrTbrU0UZmpzmJ6mns55xwkgGXpptCkDznpz7WQr+b804bzJ0FxNbJvXzsuuU3buLUuV6lmaOHJFjckpcckkq5JZUSY0wEpNn8kJerTfrw/q0vqalK9as55DMwfr+AWsbq1k=AAACMnicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuiy6cVnBPqCNZTK5bYdOJmHmRltCf8OtfoQ/oztx60c4aSvY1gMDh3Pua44fC67Rcd6tldW19Y3N3FZ+e2d3b79QPKjrKFEMaiwSkWr6VIPgEmrIUUAzVkBDX0DDH9xkfuMRlOaRvMdRDF5Ie5J3OaNopPbooY0wxJQmw3GnUHLKzgT2MnFnpERmqHaKVqEdRCwJQSITVOuW68TopVQhZwLG+XaiIaZsQHvQMlTSELSXTo4e2ydGCexupMyTaE/Uvx0pDbUehb6pDCn29aKXif95rQS7V17KZZwgSDZd1E2EjZGdJWAHXAFDMTKEMsXNrTbrU0UZmpzmJ6mns55xwkgGXpptCkDznpz7WQr+b804bzJ0FxNbJvXzsuuU3buL
UuV6lmaOHJFjckpcckkq5JZUSY0wEpNn8kJerTfrw/q0vqalK9as55DMwfr+AWsbq1k=AAACMnicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuiy6cVnBPqCNZTK5bYdOJmHmRltCf8OtfoQ/oztx60c4aSvY1gMDh3Pua44fC67Rcd6tldW19Y3N3FZ+e2d3b79QPKjrKFEMaiwSkWr6VIPgEmrIUUAzVkBDX0DDH9xkfuMRlOaRvMdRDF5Ie5J3OaNopPbooY0wxJQmw3GnUHLKzgT2MnFnpERmqHaKVqEdRCwJQSITVOuW68TopVQhZwLG+XaiIaZsQHvQMlTSELSXTo4e2ydGCexupMyTaE/Uvx0pDbUehb6pDCn29aKXif95rQS7V17KZZwgSDZd1E2EjZGdJWAHXAFDMTKEMsXNrTbrU0UZmpzmJ6mns55xwkgGXpptCkDznpz7WQr+b804bzJ0FxNbJvXzsuuU3buLUuV6lmaOHJFjckpcckkq5JZUSY0wEpNn8kJerTfrw/q0vqalK9as55DMwfr+AWsbq1k=y(cid:81)(cid:83)(cid:74)=0AAACN3icbVDLSgMxFM34rPXV6tJNsAhuLDMi6EYQ3bisYB/QjiWTuW2DmcyQ3FHL0C9xqx/hp7hyJ279A9OHYFsPBA7n3FdOkEhh0HXfnYXFpeWV1dxafn1jc2u7UNypmTjVHKo8lrFuBMyAFAqqKFBCI9HAokBCPbi/Gvr1B9BGxOoW+wn4Eesq0RGcoZXahe3+XQvhCbNEi8E5dduFklt2R6DzxJuQEpmg0i46hVYY8zQChVwyY5qem6CfMY2CSxjkW6mBhPF71oWmpYpFYPxsdPmAHlglpJ1Y26eQjtS/HRmLjOlHga2MGPbMrDcU//OaKXbO/EyoJEVQfLyok0qKMR3GQEOhgaPsW8K4FvZWyntMM442rOlJ+vGoa50oVqGfDTeFYERXTf0sg+C3ZpC3GXqzic2T2nHZc8vezUnp4nKSZo7skX1ySDxySi7INamQKuEkJc/khbw6b86H8+l8jUsXnEnPLpmC8/0DXcGsMg==AAACN3icbVDLSgMxFM34rPXV6tJNsAhuLDMi6EYQ3bisYB/QjiWTuW2DmcyQ3FHL0C9xqx/hp7hyJ279A9OHYFsPBA7n3FdOkEhh0HXfnYXFpeWV1dxafn1jc2u7UNypmTjVHKo8lrFuBMyAFAqqKFBCI9HAokBCPbi/Gvr1B9BGxOoW+wn4Eesq0RGcoZXahe3+XQvhCbNEi8E5dduFklt2R6DzxJuQEpmg0i46hVYY8zQChVwyY5qem6CfMY2CSxjkW6mBhPF71oWmpYpFYPxsdPmAHlglpJ1Y26eQjtS/HRmLjOlHga2MGPbMrDcU//OaKXbO/EyoJEVQfLyok0qKMR3GQEOhgaPsW8K4FvZWyntMM442rOlJ+vGoa50oVqGfDTeFYERXTf0sg+C3ZpC3GXqzic2T2nHZc8vezUnp4nKSZo7skX1ySDxySi7INamQKuEkJc/khbw6b86H8+l8jUsXnEnPLpmC8/0DXcGsMg==AAACN3icbVDLSgMxFM34rPXV6tJNsAhuLDMi6EYQ3bisYB/QjiWTuW2DmcyQ3FHL0C9xqx/hp7hyJ279A9OHYFsPBA7n3FdOkEhh0HXfnYXFpeWV1dxafn1jc2u7UNypmTjVHKo8lrFuBMyAFAqqKFBCI9HAokBCPbi/Gvr1B9BGxOoW+wn4Eesq0RGcoZXahe3+XQvhCbNEi8E5dduFklt2R6DzxJuQEpmg0i46hVYY8zQChVwyY5qem6CfMY2CSxjkW6mBhPF71oWmpYpFYPxsdPmAHlglpJ1Y26eQjtS/HRmLjOlHga2MGPbMrDcU//OaKXbO/EyoJEVQfLyok0qKMR3GQEOhgaPsW8K4FvZWyntMM442rOlJ+vGoa50oVqGfDTeFYERXTf0sg+C3ZpC3GXqzic2T2nHZc8vez
Unp4nKSZo7skX1ySDxySi7INamQKuEkJc/khbw6b86H8+l8jUsXnEnPLpmC8/0DXcGsMg==AAACN3icbVDLSgMxFM34rPXV6tJNsAhuLDMi6EYQ3bisYB/QjiWTuW2DmcyQ3FHL0C9xqx/hp7hyJ279A9OHYFsPBA7n3FdOkEhh0HXfnYXFpeWV1dxafn1jc2u7UNypmTjVHKo8lrFuBMyAFAqqKFBCI9HAokBCPbi/Gvr1B9BGxOoW+wn4Eesq0RGcoZXahe3+XQvhCbNEi8E5dduFklt2R6DzxJuQEpmg0i46hVYY8zQChVwyY5qem6CfMY2CSxjkW6mBhPF71oWmpYpFYPxsdPmAHlglpJ1Y26eQjtS/HRmLjOlHga2MGPbMrDcU//OaKXbO/EyoJEVQfLyok0qKMR3GQEOhgaPsW8K4FvZWyntMM442rOlJ+vGoa50oVqGfDTeFYERXTf0sg+C3ZpC3GXqzic2T2nHZc8vezUnp4nKSZo7skX1ySDxySi7INamQKuEkJc/khbw6b86H8+l8jUsXnEnPLpmC8/0DXcGsMg==y(cid:81)(cid:83)(cid:74)=1AAACN3icbVDLSsNAFJ3UV62Ptrp0M1gEN5ZEBN0IRTcuFawKbSyTyW0dOpmEmRu1hHyJW/0IP8WVO3HrHzitEax6YOBwzn3NCRIpDLrui1OamZ2bXygvVpaWV1artfrahYlTzaHNYxnrq4AZkEJBGwVKuEo0sCiQcBkMj8f+5S1oI2J1jqME/IgNlOgLztBKvVp1dN1FuMcs0SI/pF6v1nCb7gT0L/EK0iAFTnt1p9YNY55GoJBLZkzHcxP0M6ZRcAl5pZsaSBgfsgF0LFUsAuNnk8tzumWVkPZjbZ9COlF/dmQsMmYUBbYyYnhjfntj8T+vk2L/wM+ESlIExb8W9VNJMabjGGgoNHCUI0sY18LeSvkN04yjDWt6kr7bGVgnilXoZ+NNIRgxUFM/yyD4rskrNkPvd2J/ycVu03Ob3tleo3VUpFkmG2STbBOP7JMWOSGnpE04SckDeSRPzrPz6rw571+lJafoWSdTcD4+AV+ErDM=AAACN3icbVDLSsNAFJ3UV62Ptrp0M1gEN5ZEBN0IRTcuFawKbSyTyW0dOpmEmRu1hHyJW/0IP8WVO3HrHzitEax6YOBwzn3NCRIpDLrui1OamZ2bXygvVpaWV1artfrahYlTzaHNYxnrq4AZkEJBGwVKuEo0sCiQcBkMj8f+5S1oI2J1jqME/IgNlOgLztBKvVp1dN1FuMcs0SI/pF6v1nCb7gT0L/EK0iAFTnt1p9YNY55GoJBLZkzHcxP0M6ZRcAl5pZsaSBgfsgF0LFUsAuNnk8tzumWVkPZjbZ9COlF/dmQsMmYUBbYyYnhjfntj8T+vk2L/wM+ESlIExb8W9VNJMabjGGgoNHCUI0sY18LeSvkN04yjDWt6kr7bGVgnilXoZ+NNIRgxUFM/yyD4rskrNkPvd2J/ycVu03Ob3tleo3VUpFkmG2STbBOP7JMWOSGnpE04SckDeSRPzrPz6rw571+lJafoWSdTcD4+AV+ErDM=AAACN3icbVDLSsNAFJ3UV62Ptrp0M1gEN5ZEBN0IRTcuFawKbSyTyW0dOpmEmRu1hHyJW/0IP8WVO3HrHzitEax6YOBwzn3NCRIpDLrui1OamZ2bXygvVpaWV1artfrahYlTzaHNYxnrq4AZkEJBGwVKuEo0sCiQcBkMj8f+5S1oI2J1jqME/IgNlOgLztBKvVp1dN1FuMcs0SI/pF6v1nCb7gT0L/EK0iAFTnt1p9YNY55GoJBLZkzHcxP0M6ZRcAl5pZsaSBgfsgF0LFUsAuNnk8tzumWVkPZjbZ9COlF/dmQsMmYUBbYyYnhjfntj8T+vk2L/wM+ESlIExb8W9VNJMabjGGgoNHCUI0sY18LeSvkN04yjDWt6kr7bGVgnilXoZ+NNIRgxUFM/yyD4rskrNkPvd2
J/ycVu03Ob3tleo3VUpFkmG2STbBOP7JMWOSGnpE04SckDeSRPzrPz6rw571+lJafoWSdTcD4+AV+ErDM=AAACN3icbVDLSsNAFJ3UV62Ptrp0M1gEN5ZEBN0IRTcuFawKbSyTyW0dOpmEmRu1hHyJW/0IP8WVO3HrHzitEax6YOBwzn3NCRIpDLrui1OamZ2bXygvVpaWV1artfrahYlTzaHNYxnrq4AZkEJBGwVKuEo0sCiQcBkMj8f+5S1oI2J1jqME/IgNlOgLztBKvVp1dN1FuMcs0SI/pF6v1nCb7gT0L/EK0iAFTnt1p9YNY55GoJBLZkzHcxP0M6ZRcAl5pZsaSBgfsgF0LFUsAuNnk8tzumWVkPZjbZ9COlF/dmQsMmYUBbYyYnhjfntj8T+vk2L/wM+ESlIExb8W9VNJMabjGGgoNHCUI0sY18LeSvkN04yjDWt6kr7bGVgnilXoZ+NNIRgxUFM/yyD4rskrNkPvd2J/ycVu03Ob3tleo3VUpFkmG2STbBOP7JMWOSGnpE04SckDeSRPzrPz6rw571+lJafoWSdTcD4+AV+ErDM=(cid:46)(cid:66)(cid:84)(cid:76)(cid:52)(cid:80)(cid:71)(cid:3005)(cid:46)(cid:66)(cid:89) =[2,2]AAACTHicbVDRShtBFJ2NrdpYNWkf+zI0FvqgYTcU9EWQ9qUvgqWNCtkl3J29G4fMzmxn7qphyQ/4NX1tP6Lv/Y++lUInMQWjXhg4nHPuvXNPWirpKAx/BY2VJ09X19afNTeeb25tt9ovTp2prMC+MMrY8xQcKqmxT5IUnpcWoUgVnqXjDzP97BKtk0Z/oUmJSQEjLXMpgDw1bO3EhNdUH4Mb888mp2O4jr9WkPEpj0snDwe93V4ybHXCbjgv/hBEC9BhizoZtoNWnBlRFahJKHBuEIUlJTVYkkLhtBlXDksQYxjhwEMNBbqknp8z5W88k/HcWP808Tl7t6OGwrlJkXpnAXTh7msz8jFtUFF+kNRSlxWhFreL8kpxMnyWDc+kRUFq4gEIK/1fubgAC4J8gsuT7NXeyCuF0VlSzzZl6ORIL11WY/rfM236DKP7iT0Ep71uFHajT+86R+8Xaa6zV+w1e8sits+O2Ed2wvpMsBv2jX1nP4Kfwe/gT/D31toIFj0v2VI1Vv8B5LezOA==AAACTHicbVDRShtBFJ2NrdpYNWkf+zI0FvqgYTcU9EWQ9qUvgqWNCtkl3J29G4fMzmxn7qphyQ/4NX1tP6Lv/Y++lUInMQWjXhg4nHPuvXNPWirpKAx/BY2VJ09X19afNTeeb25tt9ovTp2prMC+MMrY8xQcKqmxT5IUnpcWoUgVnqXjDzP97BKtk0Z/oUmJSQEjLXMpgDw1bO3EhNdUH4Mb888mp2O4jr9WkPEpj0snDwe93V4ybHXCbjgv/hBEC9BhizoZtoNWnBlRFahJKHBuEIUlJTVYkkLhtBlXDksQYxjhwEMNBbqknp8z5W88k/HcWP808Tl7t6OGwrlJkXpnAXTh7msz8jFtUFF+kNRSlxWhFreL8kpxMnyWDc+kRUFq4gEIK/1fubgAC4J8gsuT7NXeyCuF0VlSzzZl6ORIL11WY/rfM236DKP7iT0Ep71uFHajT+86R+8Xaa6zV+w1e8sits+O2Ed2wvpMsBv2jX1nP4Kfwe/gT/D31toIFj0v2VI1Vv8B5LezOA==AAACTHicbVDRShtBFJ2NrdpYNWkf+zI0FvqgYTcU9EWQ9qUvgqWNCtkl3J29G4fMzmxn7qphyQ/4NX1tP6Lv/Y++lUInMQWjXhg4nHPuvXNPWirpKAx/BY2VJ09X19afNTeeb25tt9ovTp2prMC+MMrY8xQcKqmxT5IUnpcWoUgVnqXjDzP97BKtk0Z/oUmJSQEjLXMpgDw1bO3EhNdUH4Mb888mp2O4jr9WkPEpj0snDwe93V4ybHXCbjgv/hBEC9Bhi
zoZtoNWnBlRFahJKHBuEIUlJTVYkkLhtBlXDksQYxjhwEMNBbqknp8z5W88k/HcWP808Tl7t6OGwrlJkXpnAXTh7msz8jFtUFF+kNRSlxWhFreL8kpxMnyWDc+kRUFq4gEIK/1fubgAC4J8gsuT7NXeyCuF0VlSzzZl6ORIL11WY/rfM236DKP7iT0Ep71uFHajT+86R+8Xaa6zV+w1e8sits+O2Ed2wvpMsBv2jX1nP4Kfwe/gT/D31toIFj0v2VI1Vv8B5LezOA==AAACTHicbVDRShtBFJ2NrdpYNWkf+zI0FvqgYTcU9EWQ9qUvgqWNCtkl3J29G4fMzmxn7qphyQ/4NX1tP6Lv/Y++lUInMQWjXhg4nHPuvXNPWirpKAx/BY2VJ09X19afNTeeb25tt9ovTp2prMC+MMrY8xQcKqmxT5IUnpcWoUgVnqXjDzP97BKtk0Z/oUmJSQEjLXMpgDw1bO3EhNdUH4Mb888mp2O4jr9WkPEpj0snDwe93V4ybHXCbjgv/hBEC9BhizoZtoNWnBlRFahJKHBuEIUlJTVYkkLhtBlXDksQYxjhwEMNBbqknp8z5W88k/HcWP808Tl7t6OGwrlJkXpnAXTh7msz8jFtUFF+kNRSlxWhFreL8kpxMnyWDc+kRUFq4gEIK/1fubgAC4J8gsuT7NXeyCuF0VlSzzZl6ORIL11WY/rfM236DKP7iT0Ep71uFHajT+86R+8Xaa6zV+w1e8sits+O2Ed2wvpMsBv2jX1nP4Kfwe/gT/D31toIFj0v2VI1Vv8B5LezOA==(cid:52)(cid:80)(cid:71)(cid:3005)(cid:46)(cid:66)(cid:89)AAACNHicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuhTduBEqWhXaWCaTmzo4mQkzN2oJ/Q+3+hH+i+BO3PoNTtoKtnpg4HDOfc0JU8ENet6bMzU9Mzs3X1ooLy4tr6xWqmuXRmWaQZMpofR1SA0ILqGJHAVcpxpoEgq4Cu+OC//qHrThSl5gL4UgoV3JY84oWummjfCI+bmK8ZQ+9juVmlf3BnD/En9EamSERqfqVNqRYlkCEpmgxrR8L8Ugpxo5E9AvtzMDKWV3tAstSyVNwAT54Oy+u2WVyI2Vtk+iO1B/d+Q0MaaXhLYyoXhrJr1C/M9rZRgfBDmXaYYg2XBRnAkXlVtk4EZcA0PRs4Qyze2tLrulmjK0SY1P0g87XeskSkZBXmyKwPCuHPtZDuFPTb9sM/QnE/tLLnfrvlf3z/Zqh0ejNEtkg2ySbeKTfXJITkiDNAkjmjyRZ/LivDrvzofzOSydckY962QMztc31A+sCg==AAACNHicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuhTduBEqWhXaWCaTmzo4mQkzN2oJ/Q+3+hH+i+BO3PoNTtoKtnpg4HDOfc0JU8ENet6bMzU9Mzs3X1ooLy4tr6xWqmuXRmWaQZMpofR1SA0ILqGJHAVcpxpoEgq4Cu+OC//qHrThSl5gL4UgoV3JY84oWummjfCI+bmK8ZQ+9juVmlf3BnD/En9EamSERqfqVNqRYlkCEpmgxrR8L8Ugpxo5E9AvtzMDKWV3tAstSyVNwAT54Oy+u2WVyI2Vtk+iO1B/d+Q0MaaXhLYyoXhrJr1C/M9rZRgfBDmXaYYg2XBRnAkXlVtk4EZcA0PRs4Qyze2tLrulmjK0SY1P0g87XeskSkZBXmyKwPCuHPtZDuFPTb9sM/QnE/tLLnfrvlf3z/Zqh0ejNEtkg2ySbeKTfXJITkiDNAkjmjyRZ/LivDrvzofzOSydckY962QMztc31A+sCg==AAACNHicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuhTduBEqWhXaWCaTmzo4mQkzN2oJ/Q+3+hH+i+BO3PoNTtoKtnpg4HDOfc0JU8ENet6bMzU9Mzs3X1ooLy4tr6xWqmuXRmWaQZMpofR1SA0ILqGJHAVcpxp
oEgq4Cu+OC//qHrThSl5gL4UgoV3JY84oWummjfCI+bmK8ZQ+9juVmlf3BnD/En9EamSERqfqVNqRYlkCEpmgxrR8L8Ugpxo5E9AvtzMDKWV3tAstSyVNwAT54Oy+u2WVyI2Vtk+iO1B/d+Q0MaaXhLYyoXhrJr1C/M9rZRgfBDmXaYYg2XBRnAkXlVtk4EZcA0PRs4Qyze2tLrulmjK0SY1P0g87XeskSkZBXmyKwPCuHPtZDuFPTb9sM/QnE/tLLnfrvlf3z/Zqh0ejNEtkg2ySbeKTfXJITkiDNAkjmjyRZ/LivDrvzofzOSydckY962QMztc31A+sCg==AAACNHicbVDLSsNAFJ34rPXV6tJNsAhuLIkIuhTduBEqWhXaWCaTmzo4mQkzN2oJ/Q+3+hH+i+BO3PoNTtoKtnpg4HDOfc0JU8ENet6bMzU9Mzs3X1ooLy4tr6xWqmuXRmWaQZMpofR1SA0ILqGJHAVcpxpoEgq4Cu+OC//qHrThSl5gL4UgoV3JY84oWummjfCI+bmK8ZQ+9juVmlf3BnD/En9EamSERqfqVNqRYlkCEpmgxrR8L8Ugpxo5E9AvtzMDKWV3tAstSyVNwAT54Oy+u2WVyI2Vtk+iO1B/d+Q0MaaXhLYyoXhrJr1C/M9rZRgfBDmXaYYg2XBRnAkXlVtk4EZcA0PRs4Qyze2tLrulmjK0SY1P0g87XeskSkZBXmyKwPCuHPtZDuFPTb9sM/QnE/tLLnfrvlf3z/Zqh0ejNEtkg2ySbeKTfXJITkiDNAkjmjyRZ/LivDrvzofzOSydckY962QMztc31A+sCg==\fminimum without producing any extra useful knowledge. Therefore, to encourage the network to\nlearn more complex and informative auxiliary tasks, we further apply an entropy loss H(yaux) as a\nregularisation term in the meta objective. A detailed explanation of the entropy loss and the collapsing\nlabel problem is given in Section 3.4. 
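As a concrete illustration of this regulariser (a minimal sketch in plain Python; the function name and list-based representation are ours, not the released MAXL code), the entropy loss over a batch of predicted auxiliary label distributions can be written as:

```python
import math

def entropy_loss(y_aux, eps=1e-12):
    """Entropy regulariser H: sum of p * log(p) over the batch-averaged
    auxiliary label distribution (a negative-entropy term, so minimising
    it pushes the average towards uniform and keeps every auxiliary
    class in use). y_aux is a list of per-example probability vectors."""
    n, k = len(y_aux), len(y_aux[0])
    y_bar = [sum(row[j] for row in y_aux) / n for j in range(k)]
    return sum(p * math.log(p + eps) for p in y_bar)

# a collapsed batch (all mass on one class) scores higher (worse)
# than a batch whose mass is spread over all classes
collapsed = [[1.0, 0.0, 0.0]] * 4
spread = [[0.4, 0.3, 0.3]] * 4
assert entropy_loss(collapsed) > entropy_loss(spread)
```

Minimising this term alongside the primary loss penalises the degenerate solution in which the label-generation network assigns every image to the same auxiliary class.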
Finally, we update MAXL's label-generation network by

θ2 ← θ2 − β∇_θ2 [ L(f^pri_{θ1+}(x_(i)), y^pri_(i)) + λH(y^aux_(i)) ].   (5)

Overall, the entire MAXL algorithm is defined as follows:

Algorithm 1: The MAXL algorithm
Initialise: Network parameters: θ1, θ2; Hierarchical structure: ψ
Initialise: Learning rates: α, β; Entropy weighting: λ
while not converged do
    for each training iteration i do
        # fetch one batch of training data
        (x_(i), y^pri_(i)) ∈ (x, y)
        # auxiliary-training step
        Update: θ1 ← θ1 − α∇_θ1 [ L(f^pri_θ1(x_(i)), y^pri_(i)) + L(f^aux_θ1(x_(i)), g_θ2(x_(i), y^pri_(i), ψ)) ]
    end
    for each training iteration i do
        # fetch one batch of training data
        (x_(i), y^pri_(i)) ∈ (x, y)
        # retain training computational graph
        Compute: θ1+ = θ1 − α∇_θ1 [ L(f^pri_θ1(x_(i)), y^pri_(i)) + L(f^aux_θ1(x_(i)), g_θ2(x_(i), y^pri_(i), ψ)) ]
        # meta-training step (second derivative trick)
        Update: θ2 ← θ2 − β∇_θ2 [ L(f^pri_{θ1+}(x_(i)), y^pri_(i)) + λH(y^aux_(i)) ]
    end
end

3.3 Mask SoftMax for Hierarchical Predictions

As previously discussed, we include a hierarchy ψ which defines the number of auxiliary classes per primary class. To implement this, we designed a modified SoftMax function, which we call Mask SoftMax, to predict auxiliary labels only for certain auxiliary classes. This takes the ground-truth primary task label y and the hierarchy ψ, and creates a binary mask M = B(y, ψ).
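A minimal sketch of this mask construction and the masked SoftMax (function names are ours; for clarity the mask here multiplies the exponentials, which gives masked classes exactly zero probability):

```python
import math

def binary_mask(y, psi):
    """B(y, psi): ones over the auxiliary classes owned by primary class y,
    zeros elsewhere. psi[c] is the number of auxiliary classes assigned to
    primary class c, so the mask length is sum(psi)."""
    start = sum(psi[:y])
    return [1.0 if start <= j < start + psi[y] else 0.0 for j in range(sum(psi))]

def mask_softmax(logits, mask):
    """SoftMax restricted to unmasked positions: masked classes receive
    exactly zero probability (the mask multiplies the exponentials)."""
    exps = [m * math.exp(z) for z, m in zip(logits, mask)]
    total = sum(exps)
    return [e / total for e in exps]

# psi = [2, 2]: primary class 0 owns aux classes {0, 1}, class 1 owns {2, 3}
m = binary_mask(0, [2, 2])                 # ones on positions 0 and 1
p = mask_softmax([2.0, 1.0, 3.0, 0.5], m)  # probability mass only on classes 0, 1
```

With psi = [2, 2], binary_mask reproduces the masks [1, 1, 0, 0] and [0, 0, 1, 1] for y = 0 and y = 1 respectively.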
The mask is zero everywhere, except for ones across the set of auxiliary classes associated with y. For example, consider a primary task with 2 classes y = 0, 1, and a hierarchy of ψ = [2, 2] as in Figure 2b. In this case, the binary masks are M = [1, 1, 0, 0] for y = 0, and [0, 0, 1, 1] for y = 1.

Applying this mask element-wise to the standard SoftMax function then allows the label-prediction network to assign auxiliary labels only to relevant auxiliary classes:

SoftMax: p(ŷ_i) = exp ŷ_i / Σ_i exp ŷ_i,    Mask SoftMax: p(ŷ_i) = exp(M ⊙ ŷ_i) / Σ_i exp(M ⊙ ŷ_i),   (6)

where p(ŷ_i) represents the probability of the generated auxiliary label ŷ over class i, and ⊙ represents element-wise multiplication. Note that no domain knowledge is required to define the hierarchy, and MAXL performs well across a range of values for ψ, as shown in Section 4.2.

3.4 The Collapsing Class Problem

As previously discussed, we introduce an additional regularisation loss, which we call the entropy loss H(ŷ_(i)). This encourages high entropy across the auxiliary class prediction space, which in turn encourages the label-prediction network to fully utilise all auxiliary classes. The entropy loss calculates the KL divergence between the predicted auxiliary label space ŷ_(i) and a uniform distribution U, for each ith batch. This is equivalent to calculating the entropy of the predicted label space, and is defined as:

H(ŷ_(i)) = Σ_{k=1}^K ŷ^k_(i) log ŷ^k_(i),    ŷ^k_(i) = (1/N) Σ_{n=1}^N ŷ^k_(i)[n],   (7)

where K is the total number of auxiliary classes, and N is the training batch size.

4 Experiments

In this section, we present experimental results to evaluate MAXL with respect to several baselines and datasets on image classification.

4.1 Experimental Setup

Datasets  We evaluated on seven different datasets, with varying sizes and complexities.
One of these, CIFAR-100 [18], contains a manually-defined 2-level hierarchical structure, which we expanded into a 4-level hierarchy by manually assigning data for the new levels, to create a hierarchy of {3, 10, 20, 100} classes. This hierarchy was then used for ground-truth auxiliary labels for the Human baseline (see below). For the other six datasets: MNIST [19], SVHN [12], CIFAR-10 [18], ImageNet [7], CINIC-10 [6] and UCF-101 [32], a hierarchy is either not available or difficult to access, and so no ground-truth auxiliary labels exist. All larger datasets were rescaled to resolution [32 × 32] to accelerate training.

Baselines  We compare MAXL to a number of baselines. First, we compare with Single Task, which trains only with the primary class label and does not employ auxiliary learning. This comparison was done to determine whether MAXL could improve classification performance without needing any extra labelled data. Then, we compare to three baselines for generating auxiliary labels: Random, K-Means, and Human, to evaluate the effectiveness of MAXL's meta-learning for label generation. Random assigns each training image to random auxiliary classes in a randomly generated (well-balanced) hierarchy. K-Means determines auxiliary labels via unsupervised clustering using K-Means [13], performed on the latent representation of an auto-encoder, with clustering updated after every training iteration. Human uses the human-defined hierarchy of CIFAR-100, where the auxiliary classes sit at a lower (finer-grained) level of the hierarchy than the primary classes.
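For illustration, the K-Means baseline's label assignment can be sketched with a toy Lloyd's-algorithm clustering over latent features (a self-contained stand-in for the auto-encoder pipeline described above; all names are ours):

```python
def kmeans_labels(points, k, iters=10):
    """Toy k-means (farthest-first initialisation + Lloyd iterations).
    In the K-Means baseline, `points` would be the auto-encoder's latent
    features and each point's cluster index becomes its auxiliary label."""
    def d2(a, b):  # squared Euclidean distance
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # farthest-first seeding avoids degenerate initial centres
    centres = [points[0]]
    while len(centres) < k:
        centres.append(max(points, key=lambda p: min(d2(p, c) for c in centres)))

    for _ in range(iters):
        # assignment step: nearest centre per point
        labels = [min(range(k), key=lambda c: d2(p, centres[c])) for p in points]
        # update step: recompute each centre as the mean of its members
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centres[c] = tuple(sum(xs) / len(members) for xs in zip(*members))
    return labels

# two well-separated feature blobs -> two auxiliary "classes"
feats = [(0.0, 0.1), (0.1, 0.0), (5.0, 5.1), (5.1, 5.0)]
labels = kmeans_labels(feats, k=2)  # groups the two blobs into two clusters
```

In the actual baseline this clustering is re-run after every training iteration, which is why it is markedly slower than MAXL's learned label generation.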
Note that in order to compare against the Human baseline, these methods were evaluated only on CIFAR-100, because this is the only dataset containing a human-defined hierarchy (and hence ground-truth auxiliary labels).

4.2 Comparison to Single Task Learning

First, we compare MAXL to a single-task learning baseline, to determine whether MAXL can improve recognition accuracy without needing access to any additional data. To test the robustness of MAXL, we evaluate it on 3 different networks: a simple 4-layer ConvNet, VGG-16 [30], and ResNet-32 [14]. We used hyper-parameter search for all networks and applied regularisation methods in order to achieve optimal performance. Since the power of MAXL lies in its ability to work without domain knowledge, we tested MAXL across a range of hierarchies ψ, to study whether it is effective without needing to tune this hierarchy for each dataset. Here, the hierarchies are well balanced such that ψ[i] is the same for all i (for all primary classes).

Datasets    Backbone          Single          MAXL, ψ[i]=2    MAXL, ψ[i]=3    MAXL, ψ[i]=5    MAXL, ψ[i]=10
MNIST       4-layer ConvNet   99.57 ± 0.02    99.56 ± 0.04    99.71 ± 0.02    99.59 ± 0.03    99.57 ± 0.02
SVHN        4-layer ConvNet   94.05 ± 0.07    94.39 ± 0.08    94.38 ± 0.07    94.59 ± 0.12    94.41 ± 0.09
CIFAR-10    VGG-16            92.77 ± 0.13    93.27 ± 0.09    93.47 ± 0.08    93.49 ± 0.05    93.10 ± 0.08
ImageNet    VGG-16            46.67 ± 0.12    46.82 ± 0.14    46.97 ± 0.10    47.02 ± 0.11    46.85 ± 0.11
CINIC-10    ResNet-32         85.12 ± 0.08    85.66 ± 0.07    85.72 ± 0.07    85.83 ± 0.08    85.80 ± 0.10
UCF-101     ResNet-32         53.15 ± 0.12    54.19 ± 0.18    55.39 ± 0.16    54.70 ± 0.12    54.32 ± 0.18

Table 1: Comparison of MAXL with single-task learning, across a range of hierarchies.
We report results averaged over three individual runs, and the best performance for each dataset is marked in bold.

Table 1 shows the test accuracy of MAXL and single-task learning, with each accuracy averaged over three individual runs. We see that MAXL consistently outperforms single-task learning across all six datasets, despite both methods using exactly the same training data. We also see that MAXL outperforms single-task learning across almost all tested values of ψ, showing the robustness of our method without requiring domain knowledge or a manually-defined hierarchy.

4.3 Comparison to Auxiliary Label Generation Baselines

Next, we compare MAXL to a number of baseline methods for generating auxiliary labels, on CIFAR-100. Here, all the baselines were trained without any regularisation, to isolate the effect of auxiliary learning and test generalisation ability purely from auxiliary tasks. This dataset has a manually-defined hierarchy, which is used in Human for ground-truth auxiliary labels. However, MAXL, Random, and K-Means do not require any human knowledge or manually-defined hierarchy to generate auxiliary labels. Therefore, as in Section 4.2, a hierarchy ψ is defined, assigning each primary class a set of auxiliary classes. We created well-balanced hierarchies by assigning an equal number of auxiliary classes per primary class. For cases where the hierarchy was unbalanced by one auxiliary class, we randomly chose which primary classes are assigned each number of auxiliary classes in ψ. We ran each experiment three times and averaged the results.

Figure 3: Learning curves for the CIFAR-100 test dataset, comparing MAXL with baseline methods for generating auxiliary labels.
Our version of CIFAR-100 has a four-level hierarchy of {3, 10, 20, 100} classes per level, and we use this to create the hierarchy ψ for auxiliary learning.

Test accuracy curves are presented in Figure 3, using all possible combinations of the numbers of primary classes and total auxiliary classes in CIFAR-100 (where the auxiliary classes are at a lower hierarchical level to the primary classes). We observe that MAXL outperforms Single Task, Random, and K-Means. Note that K-Means required significantly longer training time than MAXL due to the need to run clustering after each iteration. Also note that the superior performance of MAXL over these three baselines occurs despite all four methods using exactly the same data. Finally, we observe that MAXL performs similarly to Human, despite this baseline requiring manually-defined auxiliary labels for the entire training dataset. With performance of MAXL similar to that of a system using human-defined auxiliary labels, we see strong evidence that MAXL is able to learn to generalise effectively in a self-supervised manner.

4.4 Understanding the Utility of Auxiliary Labels

In [9], the cosine similarity between gradients produced by the auxiliary and primary losses was used to determine the task weighting in the overall loss function. We use this same idea to visualise the utility of a set of auxiliary labels for improving the performance of the primary task. Intuitively, a cosine similarity of -1 indicates that the auxiliary labels work against the primary task. A cosine similarity of 0 indicates that the auxiliary labels have no impact on the primary task. And a cosine similarity of 1 indicates that the auxiliary labels are learning the same features as the primary task, and so offer no useful information.
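This diagnostic reduces to the cosine similarity between the two flattened gradient vectors of the shared layers; a minimal sketch (names ours, with gradients supplied as flat lists):

```python
import math

def grad_cosine_similarity(g_aux, g_pri):
    """Cosine similarity between the flattened auxiliary-loss and
    primary-loss gradients of the shared layers."""
    dot = sum(a * b for a, b in zip(g_aux, g_pri))
    norm_a = math.sqrt(sum(a * a for a in g_aux))
    norm_p = math.sqrt(sum(b * b for b in g_pri))
    return dot / (norm_a * norm_p)

assert grad_cosine_similarity([1.0, 0.0], [-1.0, 0.0]) == -1.0  # works against the primary task
assert grad_cosine_similarity([1.0, 0.0], [0.0, 1.0]) == 0.0    # no impact on the primary task
assert grad_cosine_similarity([2.0, 0.0], [1.0, 0.0]) == 1.0    # same features, nothing new
```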
Therefore, the cosine similarity for the gradient produced from optimal auxiliary labels should be between 0 and 1 to ensure that they assist the primary task.

In Figure 4, we show the cosine similarity measurements of gradients in the shared layers of the multi-task network, trained on 3 primary classes and 10, 20 and 100 total auxiliary classes from CIFAR-100. We observe that the baseline methods Human and Random, with fixed auxiliary labels, reach their maximal similarity at an early stage during training, which then drops significantly afterwards. K-Means produces smooth auxiliary gradients throughout training, but its similarity depends on the number of auxiliary classes. In comparison, MAXL produces auxiliary gradients with high similarity throughout the entire training period, and consistently so across the number of auxiliary classes.
Whilst we cannot say what the optimal cosine similarity should be, it is clear that MAXL's auxiliary labels affect primary task performance in a very different way to the other baselines.

Figure 4: Cosine similarity measurement between the auxiliary loss gradient and the primary loss gradient, on the shared representation in the multi-task network.

Because MAXL's cosine similarity measurements are greater than zero across the entire training stage, a standard gradient update rule for the shared feature space is then guaranteed to converge to a local minimum, given a small learning rate [9].

4.5 Visualisations of Generated Knowledge

In Figure 5, we visualise 2D embeddings of examples from the CIFAR-100 test dataset, on two different hierarchies. The visualisations are computed using t-SNE [24] on the final feature layer of the multi-task network, and compared across three methods: our MAXL method, the Human baseline, and the Single Task baseline.

Figure 5: t-SNE visualisation of the learned final layer of the multi-task network, trained on CIFAR-100 with two different hierarchies. Colours represent the primary classes.

This visualisation shows the separability of primary classes after being trained with the multi-task network. Qualitatively, we see that both MAXL and Human show better separation of the primary classes than Single Task, owing to the generalisation effect of the auxiliary learning. This again shows the effectiveness of MAXL whilst requiring no additional human knowledge.

We also show examples of images assigned to the same auxiliary class through MAXL's label-generation network. Figure 6 shows example images with the highest prediction probabilities for three random auxiliary classes from CIFAR-100, using the hierarchy of 20 primary classes and 100 total auxiliary classes (5 auxiliary classes per primary class), which showed the best performance of MAXL in Figure 3.
In addition, we also present examples on MNIST, in which 3 auxiliary classes were used for each of the 10 primary classes.

[Image grid omitted: rows correspond to the primary classes aquatic mammals, household furniture, invertebrates, nature scenes and flowers (CIFAR-100), and digits 0, 3, 5, 7 and 9 (MNIST); columns show Auxiliary Classes #1-#3.]

Figure 6: Visualisation of 5 test examples with the highest prediction probability, for each of 3 randomly selected auxiliary classes, for different primary classes. We present the visualisation for CIFAR-100 (top) when trained with 20 primary classes and 5 auxiliary classes per primary class, and for MNIST (bottom) when trained with 10 primary classes and 3 auxiliary classes per primary class.

To our initial surprise, only some of the generated auxiliary labels visualised for the two datasets show human-understandable knowledge. We can observe that auxiliary classes #1 and #2 of digit nine are clustered by the direction of the 'tail', and auxiliary classes #2 and #3 of digit seven are clustered by the presence of the 'horizontal line'. But in most cases, there are no obvious similarities within each auxiliary class in terms of shape, colour, style, structure or semantic meaning. However, this makes more sense when we reconsider the role of the label-generation network, which is to assign auxiliary labels which assist the primary task, rather than to group images by semantic or visual similarity. The label-generation network would therefore be most effective if it were to group images by a shared aspect of reasoning which the primary task is currently struggling to learn, and this grouping may not be human-interpretable.

Furthermore, we discovered that the generated auxiliary knowledge is not deterministic: the top predicted candidates differ when we re-train the network from scratch. We therefore speculate that a human-defined hierarchy is just one out of a potentially infinite number of local optima, and that on each run of training, the label-generation network produces another of these local optima.

5 Conclusion & Future Work

In this paper, we have presented Meta AuXiliary Learning (MAXL) for generating optimal auxiliary labels which, when trained alongside a primary task in a multi-task setup, improve the performance of the primary task. Rather than employing domain knowledge and human-defined auxiliary tasks as is typically required, MAXL is self-supervised and, combined with its general nature, has the potential to automate the process of generalisation to new levels.

Our evaluation on multiple datasets has shown the performance of MAXL in an image classification setup, where the auxiliary task is to predict sub-class, hierarchical labels for an image. We have shown that MAXL significantly outperforms other baselines for generating auxiliary labels, and is competitive even when human-defined knowledge is used to manually construct the auxiliary labels.

The general nature of MAXL also opens up questions about how self-supervised auxiliary learning may be used to learn generic auxiliary tasks, beyond sub-class image classification.
We also ran preliminary experiments on predicting arbitrary vectors, such that the auxiliary task becomes a regression, but the results so far have been inconclusive. However, the ability of MAXL to potentially learn flexible auxiliary tasks, which can automatically be tuned for the primary task, now offers an exciting direction towards automated generalisation across a wide range of more complex tasks.

Acknowledgements

We would like to thank Michael Bloesch, Fabian Falck, and Stephen James for insightful discussions.

References

[1] Rishabh Agarwal, Chen Liang, Dale Schuurmans, and Mohammad Norouzi. Learning to generalize from sparse and underspecified rewards. In International Conference on Machine Learning, pages 130-140, 2019.

[2] Marcin Andrychowicz, Misha Denil, Sergio Gomez, Matthew W Hoffman, David Pfau, Tom Schaul, Brendan Shillingford, and Nando De Freitas. Learning to learn by gradient descent by gradient descent. In Advances in Neural Information Processing Systems, pages 3981-3989, 2016.

[3] Samy Bengio, Yoshua Bengio, Jocelyn Cloutier, and Jan Gecsei. On the optimization of a synaptic learning rule. In Preprints Conf. Optimality in Artificial and Biological Neural Networks, pages 6-8. Univ. of Texas, 1992.

[4] Yoshua Bengio, Samy Bengio, and Jocelyn Cloutier. Learning a synaptic learning rule. Université de Montréal, Département d'informatique et de recherche opérationnelle, 1990.

[5] Rich Caruana. Multitask learning. In Learning to Learn, pages 95-133. Springer, 1998.

[6] Luke N Darlow, Elliot J Crowley, Antreas Antoniou, and Amos J Storkey. CINIC-10 is not ImageNet or CIFAR-10. arXiv preprint arXiv:1810.03505, 2018.

[7] Jia Deng, Wei Dong, Richard Socher, Li-Jia Li, Kai Li, and Li Fei-Fei. ImageNet: A large-scale hierarchical image database.
In 2009 IEEE Conference on Computer Vision and Pattern Recognition, pages 248-255. IEEE, 2009.

[8] Carl Doersch and Andrew Zisserman. Multi-task self-supervised visual learning. In Proceedings of the IEEE International Conference on Computer Vision, pages 2051-2060, 2017.

[9] Yunshu Du, Wojciech M Czarnecki, Siddhant M Jayakumar, Razvan Pascanu, and Balaji Lakshminarayanan. Adapting auxiliary losses using gradient similarity. arXiv preprint arXiv:1812.02224, 2018.

[10] Chelsea Finn, Pieter Abbeel, and Sergey Levine. Model-agnostic meta-learning for fast adaptation of deep networks. In International Conference on Machine Learning, pages 1126-1135, 2017.

[11] John Flynn, Ivan Neulander, James Philbin, and Noah Snavely. DeepStereo: Learning to predict new views from the world's imagery. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 5515-5524, 2016.

[12] Ian J Goodfellow, Yaroslav Bulatov, Julian Ibarz, Sacha Arnoud, and Vinay Shet. Multi-digit number recognition from street view imagery using deep convolutional neural networks. arXiv preprint arXiv:1312.6082, 2013.

[13] John A Hartigan and Manchek A Wong. Algorithm AS 136: A k-means clustering algorithm. Journal of the Royal Statistical Society, Series C (Applied Statistics), 28(1):100-108, 1979.

[14] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 770-778, 2016.

[15] Max Jaderberg, Wojciech Marian Czarnecki, Simon Osindero, Oriol Vinyals, Alex Graves, David Silver, and Koray Kavukcuoglu. Decoupled neural interfaces using synthetic gradients. In International Conference on Machine Learning, pages 1627-1635, 2017.

[16] Max Jaderberg, Volodymyr Mnih, Wojciech Marian Czarnecki, Tom Schaul, Joel Z Leibo, David Silver, and Koray Kavukcuoglu.
Reinforcement learning with unsupervised auxiliary tasks. In International Conference on Learning Representations, 2017.

[17] Iasonas Kokkinos. UberNet: Training a universal convolutional neural network for low-, mid-, and high-level vision using diverse datasets and limited memory. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 6129-6138, 2017.

[18] Alex Krizhevsky and Geoffrey Hinton. Learning multiple layers of features from tiny images. Technical report, Citeseer, 2009.

[19] Yann LeCun, Léon Bottou, Yoshua Bengio, and Patrick Haffner. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278-2324, 1998.

[20] Zhenguo Li, Fengwei Zhou, Fei Chen, and Hang Li. Meta-SGD: Learning to learn quickly for few-shot learning. arXiv preprint arXiv:1707.09835, 2017.

[21] Lukas Liebel and Marco Körner. Auxiliary tasks in multi-task learning. arXiv preprint arXiv:1805.06334, 2018.

[22] Tsung-Yi Lin, Priya Goyal, Ross Girshick, Kaiming He, and Piotr Dollár. Focal loss for dense object detection. In Proceedings of the IEEE International Conference on Computer Vision, pages 2980-2988, 2017.

[23] Shikun Liu, Edward Johns, and Andrew J Davison. End-to-end multi-task learning with attention. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 1871-1880, 2019.

[24] Laurens van der Maaten and Geoffrey Hinton. Visualizing data using t-SNE. Journal of Machine Learning Research, 9(Nov):2579-2605, 2008.

[25] Ishan Misra, Abhinav Shrivastava, Abhinav Gupta, and Martial Hebert. Cross-stitch networks for multi-task learning. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 3994-4003, 2016.

[26] Sachin Ravi and Hugo Larochelle. Optimization as a model for few-shot learning.
In International Conference on Learning Representations, 2016.

[27] Sebastian Ruder. An overview of multi-task learning in deep neural networks. arXiv preprint arXiv:1706.05098, 2017.

[28] Adam Santoro, Sergey Bartunov, Matthew Botvinick, Daan Wierstra, and Timothy Lillicrap. Meta-learning with memory-augmented neural networks. In International Conference on Machine Learning, pages 1842-1850, 2016.

[29] Jürgen Schmidhuber. Learning complex, extended sequences using the principle of history compression. Neural Computation, 4(2):234-242, 1992.

[30] Karen Simonyan and Andrew Zisserman. Very deep convolutional networks for large-scale image recognition. In International Conference on Learning Representations, 2015.

[31] Jake Snell, Kevin Swersky, and Richard Zemel. Prototypical networks for few-shot learning. In Advances in Neural Information Processing Systems, pages 4077-4087, 2017.

[32] Khurram Soomro, Amir Roshan Zamir, and Mubarak Shah. UCF101: A dataset of 101 human actions classes from videos in the wild. arXiv preprint arXiv:1212.0402, 2012.

[33] Shubham Toshniwal, Hao Tang, Liang Lu, and Karen Livescu. Multitask learning with low-level auxiliary tasks for encoder-decoder based speech recognition. In Proc. Interspeech 2017, pages 3532-3536, 2017.

[34] Oriol Vinyals, Charles Blundell, Tim Lillicrap, Daan Wierstra, et al. Matching networks for one shot learning. In Advances in Neural Information Processing Systems, pages 3630-3638, 2016.

[35] Yabin Zhang, Hui Tang, and Kui Jia. Fine-grained visual categorization using meta-learning optimization with sample selection of auxiliary data. In Proceedings of the European Conference on Computer Vision, pages 233-248, 2018.

[36] Tinghui Zhou, Matthew Brown, Noah Snavely, and David G Lowe. Unsupervised learning of depth and ego-motion from video.
In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 1851-1858, 2017.